Lecture 1

ENGR 210 is a course on Fundamentals of Digital Design taught by Dr. Kewei Li, covering topics from combinational and sequential digital logic to computer processor and memory organization. Students are expected to have a background in electrical circuits, programming, and advanced mathematics, with grading based on attendance, homework, exams, and projects. The course emphasizes the importance of digital design in modern computing systems and prepares students for advanced studies in embedded systems and computer architecture.


1/22/2025

ENGR 210:
Fundamentals of Digital Design

Dr. Kewei “Isaac” Li


Department of Digital Engineering

Jan 22, 2025

Outline
• Course Staff
• Instructor, No TA
• Course Information
• Course website, Textbooks, Prerequisites, Grading Policy, Syllabus
• Motivation
• Course Overview


Course Instructor
Instructor: Kewei “Isaac” Li
Ph.D. University of Connecticut, CT
Postdoctoral research: Graz University of Technology, Columbia University
• Mechanical Engineering:
• Finite Element Analysis
• Nonlinear Elasticity (Hyperelasticity)
• Experimental Biomechanics
• Computer Programming
• Research Interests:
• Constitutive modeling for soft biological tissues
• Modeling of prosthetic heart valves
• Physics-informed machine learning

Course Information
• Course textbook (required)
• Digital Design and Computer Architecture:
ARM Edition, Sarah Harris and David Harris
• An older edition is available on the library website
• Useful websites
• Brightspace LMS: lms.liu.edu


Course Prerequisites
• A basic understanding of electrical circuits and fundamental concepts
in electronics.
• Familiarity with binary numbers and Boolean algebra
• Computer programming, algorithms
• Working knowledge of
• Advanced Mathematics (vectors, matrices)

Attendance and Participation


• Attendance will be taken at the beginning of the class
• Students will be downgraded for poor attendance or lateness
• Zoom meeting duration is used for grading


Learning Outcomes
Upon completion of this course, students will be able to
• Convert a specification into an implementation
• Hierarchically decompose a complex specification
• Into simpler components
• Understand concurrency
• Develop concurrent system specifications and implementations
• Test a given design for correctness
• Develop a test bench and perform simulations
• Use state-of-the-art software tools for hardware design
• “Computer-aided design” (CAD) tools

Grading Policy
• The final letter grade will be calculated based on this table

Weights: Attendance/Participation 10%, Homework 30%, Midterm 20%, Project 20%, Final Exam 20%

• For example, if Student A received 8/10 for attendance/participation, 60/70 for
homework, 90/100 for the midterm, 90/100 for the project, and 90/100 for the final

• Letter grade is assigned based on the total points
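The slide does not work out the total for this example; a minimal Python sketch of the weighted-total computation (category names and rounding are illustrative, not course material):

```python
# Weighted-total computation for the grading example above.
# Each score is (earned, possible); weights are the percentages from the table.
scores = {
    "attendance": (8, 10),
    "homework": (60, 70),
    "midterm": (90, 100),
    "project": (90, 100),
    "final": (90, 100),
}
weights = {
    "attendance": 10,
    "homework": 30,
    "midterm": 20,
    "project": 20,
    "final": 20,
}

# Scale each category to its weight and sum across categories.
total = sum(earned / possible * weights[name]
            for name, (earned, possible) in scores.items())
print(round(total, 1))  # 87.7
```

So Student A's weighted total is about 87.7 points out of 100, which is then mapped to a letter grade.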


Grading Policy
• Homework
• Posted on Brightspace, due one week after being assigned
• Group discussion is encouraged. But homework should be done
individually
• Late submissions without a reasonable justification will not be
accepted
• No credit after the solution is posted
• For questions about homework/project grading, please communicate
with me within two weeks after receiving the grade

Written Exams
• Written Exams will be closed-book and closed-notes
• No cellphone, no computer
• You may bring one formula sheet with you
• Only non-programmable calculators (no memory function) are
allowed


How to Succeed in This Course


• Attend every lecture and participate in the class
• Read the book sections before each class
• Keep up with the weekly assignments
• Seek help if needed

Course Syllabus


Course Problems…Cheating
• What is cheating?
• Studying together in groups is not cheating but encouraged
• Turned-in work must be completely your own
• Copying someone else’s solution on a HW or Exam is cheating
• Both “giver” and “receiver” are equally culpable

• You are better off working on the problems alone during the exam
• We must address the issue once cheating is reported by
someone

Use of Generative AI
Risks of Generative AI
• Accuracy and Quality: Generated content may be of poor quality and
generic in nature
• Learning: Assignments are designed to help you learn, and relying on AI to
complete tasks denies you the opportunity to learn
• Over-reliance: Using AI to do your work may achieve the short-term goal,
but over-reliance on AI may prevent you from being prepared for later
exams, or future job opportunities
• Motivation: Some students are less motivated for tasks that AI can do. It is
important to understand that you need to master simple tasks (which AI can
do) before you can solve more complex problems (which AI cannot do).


Use of Generative AI
Examples of permitted use
Despite the risks, you can still use AI to learn course materials
• Explain a given topic, or to provide an example of how programming
constructs are used
• Explain your program one line at a time
• Produce an example that is similar to assignment questions
• Explain the meaning of error messages
• Generate code to complete tasks that you have already mastered
from previous study

Use of Generative AI
Examples of inappropriate use
• Asking AI to complete an assignment for you – that you were meant
to complete
• Using AI during exams where uses have been expressly forbidden
• Any use that may prevent your personal academic growth or may
prevent you from understanding a topic or idea
• Writing a code solution in a language you know and then asking AI to
translate it into the language required for the assignment


Motivation and Overview

Motivation
• Digital design is the foundation of all modern computing systems
• Every microprocessor, memory unit, and digital system—from simple
calculators to complex supercomputers—relies on digital circuits


Motivation
• Semiconductor market revenue worldwide from 1987 to 2025

https://www.statista.com/statistics/266973/global-semiconductor-sales-since-1988

Motivation
• Digital design allows one to understand how
computing systems function
• Fields such as AI hardware acceleration, IoT
devices, and robotics often rely on optimized
digital circuits to achieve high performance
• Understanding digital design allows students to
contribute to innovations in those areas

• So, are you motivated to study Digital Design?

https://bostondynamics.com


What is Digital Design?


• Generally, digital design is a type of visual communication that
presents information or a product or service via a digital interface
• In computer engineering, digital (logic) design is the process of
creating circuits that perform specific functions based on binary logic
• It involves using logic gates and other components to build systems
that can perform complex computations and data manipulations
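As a quick illustration (not from the slides) of building functions from binary logic, basic gates can be mimicked in Python, including composing a more complex gate hierarchically from simpler ones:

```python
# Illustrative sketch: logic gates as functions on 0/1 values.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # XOR built from simpler gates: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

# Truth table for XOR
for a in (0, 1):
    for b in (0, 1):
        print(a, b, XOR(a, b))
```

Real digital design wires physical gates together the same way these function calls compose: outputs of simple gates feed the inputs of the next stage.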

Computing Systems

History of Computing Systems


• Electrical switches can be turned on (1) or off (0)
• Early computers made from switches were electro-mechanical
computers that used relays/switches to perform operations

• Programming was done by manually setting switches and plugs


History of Computing Systems


• Early computers became increasingly complex, leading to the first
electronic computer consisting of ~10,000 electronic switches (1940s)
• Early computers performed thousands of calculations per second
• The transistor was then invented: a semiconductor device that amplifies
or switches electronic signals

History of Computing Systems


• Robert Noyce (1927-1990) co-invented the integrated circuit (IC)
• An IC is a small silicon chip that integrates multiple transistors,
resistors, capacitors, and other components into a single unit


Introduction to Computing Systems

Complexity of Computing Systems


Types of Computing Systems


Computing systems can be categorized by the processor type
• MCU (Microcontroller Unit)
• Embedded Systems, IoT Devices, Wearable Technology
• MPU (Microprocessor Unit)
• Personal computer, Servers, HPC
• GPU (Graphics Processing Unit)
• Rendering graphics and AI/ML computations
• Hybrid Systems
• Autonomous Vehicles, Gaming Consoles
• DSP (Digital Signal Processor)

Types of Computing Systems


Computing systems can be categorized by the processor architecture
• x86 Architecture
• Widely used in personal computers/servers, developed by Intel and AMD
• ARM Architecture (this course)
• A power-efficient, reduced instruction set computing (RISC) architecture
commonly used in mobile devices, embedded systems, and some high-
performance servers
• RISC-V Architecture
• An open-source, RISC-based architecture increasingly used in research,
embedded systems, and IoT devices.
• Others: PowerPC, Quantum Processors, AI-Specific Processors


Why ARM Architecture


• ARM has been around since the 1980s, building a robust ecosystem over decades
• It has a wide range of proven designs, tools (like ARM Cortex-M for
IoT), and well-established supply chains.
• Extensive support by manufacturers (e.g., Qualcomm,
STMicroelectronics)
• RISC-V Architecture: low adoption, less popular
• x86 Architecture: too complex and overwhelming for beginners
• Foundation for future courses
• ENGR 341: Introduction to Vision and Robotics
• ENGR 371: Principles and Design of IoT Systems
• ENGR 411: Internet of Things (IoT) and Digital Implementation

Modern Computing Systems


Computer Engineering
• The development of the abstraction layers
that allow us to execute information-
processing applications efficiently using
available manufacturing technologies


Modern Computing Systems

Modern Computing Systems


Modern Computing Systems

Abstraction in EE, CS, and CE


Course Structure
• Part 1: Combinational Digital Logic
• Transistors; logic gates; Boolean algebra; logic minimization; decoders;
multiplexers; arithmetic units
• Part 2: Sequential Digital Logic
• Latches and flip-flops; Finite-state machines; counters; shift registers;
memory arrays
• Part 3: Computer Processor Organization
• Instruction set architecture; arithmetic, memory, control instructions; single-
cycle processor; FSM multi-cycle processor; pipelined processor
• Part 4: Computer Memory Organization
• Main memory; virtual memory; caches
• Part 5: Application: Raspberry Pi, Arduino
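As a small preview of the combinational building blocks in Part 1, here is an illustrative Python sketch (not course material) of a 1-bit full adder chained into a 4-bit ripple-carry adder:

```python
# Illustrative sketch: a 1-bit full adder, a classic combinational
# arithmetic unit built entirely from gate-level logic.
def full_adder(a, b, cin):
    """Return (sum, carry-out) for one-bit inputs a, b and carry-in cin."""
    s = a ^ b ^ cin                         # sum bit: XOR of the three inputs
    cout = (a & b) | (a & cin) | (b & cin)  # carry-out: majority function
    return s, cout

def ripple_add4(a_bits, b_bits):
    """Add two 4-bit numbers given as LSB-first bit lists."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)  # carry ripples to the next stage
        out.append(s)
    return out, carry

# 0b0110 (6) + 0b0011 (3) = 0b1001 (9), bits listed LSB first
print(ripple_add4([0, 1, 1, 0], [1, 1, 0, 0]))  # ([1, 0, 0, 1], 0)
```

In hardware, the same structure is four copies of the full-adder circuit with each carry-out wired to the next stage's carry-in.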

Summary
• Digital design transforms low-level circuits into hardware blocks
dedicated to processing, storing, and moving digital data
• Computer organization transforms these hardware blocks into
programmable computing systems capable of executing high-level
software
• We are now in the accelerator era, which requires careful
management of the tension between specialization and
programmability
• This course will serve as a foundation for more advanced courses
in embedded systems and computer architecture, allowing
students to eventually contribute to this new era

