Complementary Metal Oxide Semiconductor (CMOS) has been the dominant technology for digital
microelectronic systems. Over a period of four decades, the dimensions of devices were scaled down
from 10,000nm to 22nm (a nearly 500X reduction), density increased by 65,000X, and the technology
transitioned through 18 process generations. The dominance of CMOS and further scaling is
expected to continue for the foreseeable future.
Would the design of a 1B-transistor circuit be possible by individually designing each transistor,
each logic gate, or each functional block such as an adder? That would be like writing an OS in
machine code or assembly language. The design of a system of such complexity is only possible by
abstraction and automation.
It is the advances in design automation and CAD algorithms for VLSI that have made it possible
for designers to fully exploit the advances in IC scaling. The high performance general purpose
processors, ASICs (Application Specific Integrated Circuits) and FPGAs (Field Programmable Gate
Arrays) that we have today would not have been possible without the development of methods for
automating nearly all phases of the design flow. That design flow involves many layers of abstraction
and mapping of one abstraction to another, more detailed or less abstract representation.
This graduate-level course provides the algorithmic underpinnings of digital VLSI CAD tools - from
high level algorithmic specifications down to an optimized netlist of logic cells. It covers the theory
and algorithms that have been the result of decades of research and development, and have been
incorporated into many commercial tools over the past two decades. This is NOT a circuits class. It is an "applied algorithms" course, focusing on representations (abstractions) and the algorithms that manipulate those representations to optimize objective functions such as delay, throughput, area, and power.
The knowledge and techniques learned in this course have proven useful in many branches of computer science and engineering, not just chip design. So if you enjoy developing algorithms to solve practical problems, this could be the course for you.
Prerequisites: - An undergraduate course covering the essentials of logic design, including Boolean algebra,
SOP and POS forms, design of basic combinational and sequential logic components (e.g.
flip-flops and latches and their characteristics), combinational function minimization, finite
state machine design.
- Strong background in data structures and algorithms; experience programming in C and/or C++ can be helpful. Programming exercises will be done in Python,
an extremely easy-to-learn and powerful interpreted, object-oriented, high-level programming
language.
- Elementary undergraduate background in linear circuits and knowledge of basic electrical concepts.
- This course also requires some mathematical maturity, especially in discrete mathematics.
However, all that is needed for the course will be covered in the lectures.
Main topics include: - High Level Synthesis: Behavioral level ⇒ Register Transfer Level (RTL)
- Algorithmic specification of digital system behavior
- Control and dataflow graphs
- RTL specification
- Optimization problems in mapping a behavioral specification to an RTL specification: scheduling, resource allocation, and module binding, targeting area/latency, cycle time, power, etc., using dataflow graphs.
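The scheduling problem named above can be illustrated with a minimal ASAP (as-soon-as-possible) scheduler in Python, the course's exercise language. This is a sketch only; the dataflow graph, operation names, and latencies below are invented examples, not course material.

```python
# ASAP scheduling of a dataflow graph: each operation starts as soon as
# all of its predecessors have finished.

def asap_schedule(deps, delay):
    """deps: op -> list of predecessor ops; delay: op -> latency in cycles.
    Returns op -> earliest start cycle (assumes the graph is acyclic)."""
    start = {}
    def earliest(op):
        if op not in start:
            start[op] = max((earliest(p) + delay[p] for p in deps[op]),
                            default=0)
        return start[op]
    for op in deps:
        earliest(op)
    return start

# Hypothetical dataflow graph for (a*b) + (c*d): two multiplies feed one add.
deps = {"m1": [], "m2": [], "add": ["m1", "m2"]}
delay = {"m1": 2, "m2": 2, "add": 1}
schedule = asap_schedule(deps, delay)
print(schedule)  # {'m1': 0, 'm2': 0, 'add': 2}
```

An ALAP schedule computed the same way from the sinks backward would, together with ASAP, bound each operation's mobility for resource-constrained scheduling.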
- RTL ⇒ Logic
- Technology-independent multi-level combinational logic synthesis and optimization: representations of Boolean functions; exact and heuristic logic minimization; algebraic and Boolean methods; kernel theory; decomposition-based methods, including BDD-based methods.
- Graph-based, cell-library-based technology mapping of combinational logic for area and delay.
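One building block of the kernel-based algebraic methods listed above is weak (algebraic) division of one SOP expression by another. The toy sketch below represents cubes as sets of literal names and ignores complemented literals and Boolean (non-algebraic) simplification; the expressions F and D are invented examples.

```python
# Weak division: given SOP expressions F and D, find Q and R such that
# F = Q*D + R, treating the expressions purely algebraically.

def weak_divide(f, d):
    """f, d: sets of cubes, each cube a frozenset of literal names."""
    candidates = []
    for dc in d:
        # cubes of f that contain cube dc, with dc's literals stripped out
        candidates.append({c - dc for c in f if dc <= c})
    quotient = set.intersection(*candidates) if candidates else set()
    covered = {q | dc for q in quotient for dc in d}
    remainder = {c for c in f if c not in covered}
    return quotient, remainder

cube = lambda s: frozenset(s.split())
# F = ac + ad + bc + bd + e,  D = a + b
F = {cube("a c"), cube("a d"), cube("b c"), cube("b d"), cube("e")}
D = {cube("a"), cube("b")}
Q, R = weak_divide(F, D)
# Q = c + d, R = e, i.e. F = (a + b)(c + d) + e
```

Extracting good divisors (kernels) and dividing them out this way is what lets a synthesis tool share logic across a multi-level network.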
- Sequential Network Optimization
- Retiming of sequential networks for delay, area, and power.
- Clock skew optimization
- State Machines
- State machine equivalence, reduction, traversal
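The state-reduction topic above can be sketched with classical partition refinement for a completely specified Moore machine: start from a partition of states by output, then repeatedly split blocks whose states disagree on the block of some next state. The three-state machine below is a made-up example.

```python
# State minimization by partition refinement (completely specified FSM).

def minimize(states, inputs, nxt, out):
    """nxt[(s, i)] -> next state; out[s] -> output.
    Returns the blocks (lists) of equivalent states."""
    # initial partition: group states by their output
    groups = {}
    for s in states:
        groups.setdefault(out[s], []).append(s)
    blocks = list(groups.values())
    while True:
        index = {s: i for i, b in enumerate(blocks) for s in b}
        refined = []
        for b in blocks:
            sig = {}
            for s in b:
                # signature: which block each input takes this state to
                key = tuple(index[nxt[(s, i)]] for i in inputs)
                sig.setdefault(key, []).append(s)
            refined.extend(sig.values())
        if len(refined) == len(blocks):   # fixed point reached
            return refined
        blocks = refined

states = ["s0", "s1", "s2"]
inputs = [0, 1]
nxt = {("s0", 0): "s1", ("s0", 1): "s2",
       ("s1", 0): "s0", ("s1", 1): "s2",
       ("s2", 0): "s2", ("s2", 1): "s0"}
out = {"s0": 0, "s1": 0, "s2": 1}
blocks = minimize(states, inputs, nxt, out)
# s0 and s1 are equivalent: the machine reduces from 3 states to 2.
```

The same fixed-point refinement idea reappears in FSM equivalence checking, where two machines are minimized over their product state space.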
- Timing Analysis and Optimization (if time permits)
- Logic and Interconnect delay models
- Circuit simulation: solutions to linear systems; incorporating non-linear elements, iterative
numerical techniques; Elmore delay approximation
- Topological timing analysis of combinational and sequential networks: static timing
analysis; clocking disciplines with edge-triggered flip-flops (ETFFs) and latches
- Clock tree design and schedule optimization
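The Elmore delay approximation mentioned above admits a very short recursive implementation on an RC tree: the delay to a node accumulates, edge by edge, each edge resistance times the total capacitance downstream of it. The two-segment wire below is an invented example with arbitrary units.

```python
# Elmore delay on an RC tree rooted at the driver.

def elmore(children, r, c, root):
    """children[n] -> list of child nodes; r[n] -> resistance of the edge
    into n; c[n] -> capacitance at n. Returns node -> Elmore delay."""
    def cap_below(n):  # total capacitance in the subtree rooted at n
        return c[n] + sum(cap_below(k) for k in children.get(n, []))
    delay = {root: 0.0}
    def walk(n):
        for k in children.get(n, []):
            # each edge contributes (its resistance) * (downstream cap)
            delay[k] = delay[n] + r[k] * cap_below(k)
            walk(k)
    walk(root)
    return delay

# Hypothetical two-segment interconnect: src -> n1 -> n2
children = {"src": ["n1"], "n1": ["n2"]}
r = {"n1": 1.0, "n2": 2.0}
c = {"src": 0.0, "n1": 1.0, "n2": 1.0}
d = elmore(children, r, c, "src")
print(d)  # {'src': 0.0, 'n1': 2.0, 'n2': 4.0}
```

Because it is a simple closed-form bound rather than a circuit simulation, Elmore delay is the workhorse interconnect model inside static timing analysis and clock tree design.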
Sources: The topics covered in this course are spread over several textbooks; none of them
covers all the topics in this course. The material for my lectures comes from the sources
listed below and from many journal and conference papers.
Reference Books: - Synthesis and Optimization of Digital Circuits, Giovanni De Micheli, McGraw-Hill,
1994. Available on-line.
- Logic Synthesis and Verification Algorithms, Gary Hachtel and Fabio Somenzi, Kluwer
Academic Publishers, 1996. Available on-line through the ASU library.
- Electronic Design Automation, Laung-Terng Wang, Yao-Wen Chang, Kwang-Ting Cheng.
Morgan Kaufmann, 2008.
- Logic Synthesis, Srinivas Devadas, Abhijit Ghosh, and Kurt Keutzer, McGraw-Hill, 1994.
- Algorithms and Data Structures for VLSI Design, Christoph Meinel and Thorsten
Theobald, Springer-Verlag, 1998.
- Timing: Analysis and Optimization of Sequential Circuits, Naresh Maheshwari,
Sachin Sapatnekar. Kluwer Academic 1999.
- Electronic Circuit & System Simulation Methods, T. L. Pillage, R. A. Rohrer, and C. Visweswariah.
All course materials will be made available on ASU's Blackboard system.