Clear and realistic specifications should be defined as early as possible in an integrated circuit development program. Thorough documentation is a virtue. However, the documentation should not be so rigid that the participants are put under undue pressure to "deliver to specs". Such pressure very often stretches technologies and design disciplines beyond their capabilities and frequently leads to project failures. The concept of an "objective specification" is often useful. An Objective Specification describes the functional and performance objectives of a project. It is a document that is meant to be modified and adjusted as business goals evolve and the technological limitations become better understood.

The Objective Specification is a document that attempts to communicate across organizational and scientific discipline boundaries. It describes the electrical attributes of the integrated circuit that are believed necessary to achieve business objectives or system performance goals. More often than not it is populated with parameters at or beyond the limits of technology. If that were not so, a device would likely already be available in the commercial marketplace and a development project would not be required. It is most often written by the marketing function or by systems engineering, and it frequently overlooks the most fundamental limitations of the available fabrication technologies. For example, it might define mixed signal circuit parameters that can only be implemented with passive component values which are not available in integrated circuit form, e.g. megohm resistors or microfarad capacitors. It might call for speeds or levels of integration that are beyond the state of the art. In any case it is a document that can be used to encourage communication, sometimes spirited, between the various disciplines which must be involved in a project.

The end-to-end integrated circuit development process is illustrated at the right. Concepts and/or block diagrams are captured as drawings or schematics. The electrical behavior and environmental responses of the circuits are simulated to verify that design objectives are met. Layout is performed, and computer checks verify that the detailed topologies match the schematics and satisfy process ground rules. Devices are fabricated (an eight-inch wafer is shown). The wafer is scribed and the die are packaged (high pin-count packages are shown placed on the wafer carrier). The devices are mounted on circuit boards and used in a system. A pen plot of a mixed signal layout is shown in the background -- although pen plots are seldom used for anything other than "wall hangings" in modern design. The Objective Specification provides a communications channel between the various levels and disciplines in the design.

Electrical circuit behavior is predicted by simulation. The industry standard simulation tool is Berkeley SPICE and its commercial equivalents. The SPICE programs have engendered a great deal of suspicion in some quarters because of their occasional inability to converge to initial operating conditions and/or their tendency to "blow up" during simulation runs. Entertaining stories regarding quirky behavior or inaccurate results abound in the industry. However, the program is capable of reliable circuit simulation and is an extremely valuable analysis tool. It does require a certain amount of "care and feeding" to run smoothly, but the vast majority of convergence problems yield to rather straightforward techniques such as .NODESET statements, relaxed tolerances, or GMIN and source stepping. SPICE users should be good design engineers; the program is not a substitute for experience and in-depth circuit understanding. But it is a very powerful tool in the hands of a good designer.
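As a hedged illustration (not SPICE itself, and with assumed component values), the sketch below shows the flavor of one convergence technique: the Newton-Raphson iteration at the heart of a DC operating-point solve, with the per-iteration voltage limiting that SPICE-class simulators use to keep a diode's exponential characteristic from making the raw Newton step overshoot:

```python
import math

# Toy DC operating-point solve: voltage source Vs, series resistor R,
# diode to ground.  Solve f(V) = (Vs - V)/R - Is*(exp(V/Vt) - 1) = 0
# by Newton-Raphson.  An unlimited Newton step overshoots badly on the
# diode exponential; limiting each step makes the iteration converge.
Vs, R, Is, Vt = 5.0, 1e3, 1e-14, 0.025   # assumed illustrative values

def solve(v=0.0, step_limit=0.1, tol=1e-12, max_iter=200):
    for _ in range(max_iter):
        f  = (Vs - v) / R - Is * (math.exp(v / Vt) - 1.0)
        df = -1.0 / R - (Is / Vt) * math.exp(v / Vt)
        dv = -f / df
        dv = max(-step_limit, min(step_limit, dv))  # junction voltage limiting
        v += dv
        if abs(dv) < tol:
            return v
    raise RuntimeError("no convergence")

v_diode = solve()   # settles near a typical silicon diode drop
```

Without the `step_limit` clamp the first Newton step jumps the diode voltage to several volts and the exponential overflows -- a miniature version of the "blow up" behavior described above.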

SPICE (Simulation Program with Integrated Circuit Emphasis)
A great deal of focus has been placed on SPICE device models, e.g. transistor models. A seemingly endless wave of field effect transistor models has been spawned. The most recent of these include long lists of parameters which are not tightly related to device physics (the various versions of BSIM), among them second order dependencies of parameters on device dimensions. The literature abounds with papers emphasizing percent accuracies in the simulation of transistor drain characteristics. The parameters for recent models are derived from test devices by curve fitting routines which are "untouched by human hands", and this is equated to infallibility and inherent accuracy. There is another viewpoint: this approach encourages the substitution of blind faith in extracted model parameters, and in the circuit simulators that use them, for common sense and fundamental skill. Too often the recent modeling approaches generate an environment where "worst case" analysis by rote is used as an excuse when designs fail to perform as required. The industry often hears that the circuit satisfied its performance specification when simulated at the environmental and process extremes, so the unacceptable performance must be a "processing problem". Understandably, the processing organization is not pleased, and a veritable war erupts between the disciplines. On the other hand, if the design function assumes all responsibility for circuit behavior and the processing function is only required to satisfy device/wafer acceptance criteria, a clear separation of responsibilities can be established. This is the way foundry services operate, and it is what makes them work well. Under these conditions integrated circuit designers are much more aware of circuit performance sensitivities, and are much less likely to rely on the fine details of circuit models or use modeling as an excuse rather than an aid.
It seems to us that it is not realistic to focus on percent-level details in modeling when variations of several tens of percent in critical transistor parameters are likely over the range of operating temperature and reasonable processing variations. In fact, variations as large as two to one are not uncommon. At TAG we emphasize understanding what makes circuits operate reliably, how transistors behave and what their model parameters mean, and how to make circuits robust to the vagaries of processes and environmental extremes. We are in a good position to do that since we have experience with more fabrication foundries than most and have come to the conclusion that silicon answers to a higher authority than the company that owns the foundry.
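A small sketch, with assumed and purely representative corner numbers, of why percent-level model accuracy is beside the point: a simple square-law drain current expression swept across plausible process corners already spreads by more than two to one.

```python
# Illustration (assumed, representative numbers) of parameter spread
# swamping percent-level model accuracy.  Simple square-law
# (SPICE level-1 style) saturation drain current:
#     Id = 0.5 * kp * (W/L) * (Vgs - Vth)**2
def drain_current(kp, w_over_l, vgs, vth):
    return 0.5 * kp * w_over_l * (vgs - vth) ** 2

W_OVER_L, VGS = 10.0, 1.2
# Assumed corner spreads: +/-30% on transconductance kp, +/-0.1 V on Vth.
id_nom  = drain_current(100e-6, W_OVER_L, VGS, 0.50)
id_fast = drain_current(130e-6, W_OVER_L, VGS, 0.40)
id_slow = drain_current( 70e-6, W_OVER_L, VGS, 0.60)

spread = id_fast / id_slow   # corner-to-corner current ratio, well over 2:1
```

With these numbers the corner-to-corner ratio exceeds three to one -- a 2% refinement in the nominal model changes nothing about how such a circuit must be designed.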

We do not rely on standard cells or gate arrays. We have nothing against their use; however, they are of marginal benefit to mixed signal design. On the other hand, we seldom do completely "custom" layout. The majority of digital and linear functions are either transplanted from other projects or modified to fit the job at hand. We also recognize the advisability of remaining as "foundry independent" as we can throughout a project. Our layout is done symbolically. That is, we work with a coarse grid system where the centers of key features of transistors are placed on a grid which is considerably larger than the minimum mask resolution. This does not compromise density. However, it affords a degree of flexibility which is unusual in the industry, because the fine detail of layout is generated by computer toward the end of a project. The concept was popularized by Mead and Conway and is inherent in the procedures of the MOSIS foundry service run by the University of Southern California's Information Sciences Institute. However, our "roots" predate those advocates of layout flexibility by over a decade. The methodology is transparent to the foundry and the customer, but it affords excellent flexibility for "re-targeting" or transporting designs to various foundries as the business environment may dictate. Foundries come and go. Natural disasters strike. Companies trade hands. Business objectives change. Our approach is robust to these events.
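The re-targeting idea can be sketched in a few lines. The coordinates and pitches below are hypothetical, and a real flow also regenerates the fine device geometry to each foundry's rules; the sketch shows only the core notion that symbolic coordinates are bound to physical dimensions late.

```python
# Hypothetical sketch of symbolic (coarse-grid) layout re-targeting:
# device centers are held in abstract grid units, and physical
# coordinates are generated per foundry by applying that foundry's pitch.
def retarget(centers_in_grid_units, pitch_nm):
    """Map symbolic grid coordinates to physical coordinates in nm."""
    return [(x * pitch_nm, y * pitch_nm) for x, y in centers_in_grid_units]

symbolic  = [(0, 0), (3, 0), (3, 2)]   # transistor centers, grid units
foundry_a = retarget(symbolic, 800)    # 0.8 um coarse pitch (assumed)
foundry_b = retarget(symbolic, 600)    # same design, 0.6 um pitch (assumed)
```

The same symbolic description yields both sets of physical coordinates, which is what keeps the design portable when a foundry disappears or business objectives change.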

Integrated circuit designs would seldom work if it were not for computer checks of layout ground rules and verification that the layout corresponds to the schematic set for the device. Computer checks fall into two areas: Design Rule Check (DRC) and Layout Versus Schematic (LVS) check. DRC checks assure that all of the dimensional and layer-to-layer rules for the process and masking are observed. LVS checks assure that the final layout corresponds exactly with the schematics for the device. It is tempting, when pressed by schedule or budget limitations, to make a seemingly innocuous change to a design without repeating all of the checks. This is the most common cause of catastrophic circuit failure. We resist unchecked changes to the point of seeming insubordinate to our customers, because it is in their best interest.
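As a toy illustration of what one DRC rule amounts to (the layer names and rule values here are invented for the example), a minimum-width check over a set of rectangles might look like:

```python
# Toy illustration of a single DRC rule: minimum feature width per layer.
# Rectangles are (x1, y1, x2, y2); layer names and rule values are
# invented for the example, in arbitrary grid units.
MIN_WIDTH = {"metal1": 3, "poly": 2}

def width_violations(layer, rects):
    """Return every rectangle narrower than the layer's minimum width."""
    errors = []
    for x1, y1, x2, y2 in rects:
        if min(x2 - x1, y2 - y1) < MIN_WIDTH[layer]:
            errors.append((x1, y1, x2, y2))
    return errors

bad = width_violations("metal1", [(0, 0, 10, 3), (0, 5, 10, 7)])
# the second rectangle is only 2 units tall and violates metal1 min width
```

A production DRC run applies hundreds of such rules, plus spacing, enclosure, and overlap checks, across millions of polygons -- which is precisely why skipping the re-check after a "small" change is so dangerous.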

Most designs should be analyzed with mathematical models to verify that system algorithms are correct. Sometimes this modeling takes the form of hardware description language simulations. On other occasions it is more appropriate to utilize programs such as MATLAB or MATHCAD, or even a spreadsheet program. In the digital domain, those simulations are always repeated with behavioral modeling, e.g. logic simulation. The behavioral simulation is followed by circuit level simulation, which is supported by device modeling. The end-to-end procedure is tied together by complete schematic capture. Every transistor, passive component, input/output protection device, etc. is present in the schematic set. The connectivity listing and device geometric descriptions generated from the complete project schematic set are used as one input to LVS checking programs. The other input is the detailed polygon description of all processing layers (mask layers) from the layout.
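A minimal example of the kind of algorithm-level check described above, written here in Python rather than MATLAB or a spreadsheet: verifying that an ideal quantizer keeps its error within half an LSB over a full-scale sine. The 8-bit width and the test signal are assumptions chosen for illustration.

```python
import math

# Algorithm-level sanity check, the sort of thing done in MATLAB or a
# spreadsheet before any circuit design begins: an ideal 8-bit
# quantizer's error should never exceed +/- half an LSB.
BITS, FULL_SCALE = 8, 1.0
LSB = 2 * FULL_SCALE / (2 ** BITS)          # bipolar range, 2^BITS codes

def quantize(x):
    """Round to the nearest ideal converter code."""
    return round(x / LSB) * LSB

samples = [math.sin(2 * math.pi * k / 1000) for k in range(1000)]
worst   = max(abs(quantize(s) - s) for s in samples)
```

If `worst` ever exceeded `LSB / 2`, the algorithm model (not any circuit) would be at fault -- catching that class of error here is far cheaper than catching it in silicon.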

Mixed signal simulation follows the same procedure. It includes behavioral level substitution of linear functions during SPICE simulation, in a manner similar to the substitution of a behavioral flip-flop for a transistor level flip-flop in digital simulations. Transistor level simulation is also performed in depth. In this way entire mixed signal circuits of VLSI complexity can be simulated and analyzed with surprisingly inexpensive hardware -- the personal computer. It is not unusual to include complex digital functions, random access memory, read only memory, amplifiers, filters, power supplies, analog-to-digital and digital-to-analog converters, etc. in end-to-end simulations. It is common to mix and match transistor level, behavioral analog, and behavioral digital blocks in a single simulation deck. Hierarchical modeling is the enabling force for projects which could not be handled in a single deck at the transistor level. It is just as realistic to perform hierarchical mixed signal modeling as it is to perform hierarchical digital modeling for devices that are too complex to be simulated at the transistor level.
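A hedged sketch of the behavioral-substitution idea: two purely behavioral blocks (an ideal clipping amplifier and a comparator, both with invented parameters) composed the way behavioral and transistor-level blocks are mixed in a single simulation deck.

```python
# Hedged sketch of hierarchical mixed-signal modeling.  Each block is a
# behavioral stand-in for a transistor-level circuit; parameters and
# names are hypothetical, chosen only to show the composition.
def behavioral_amp(vin, gain=100.0, vdd=5.0):
    """Ideal amplifier with supply-rail clipping -- no transistors needed."""
    return max(0.0, min(vdd, gain * vin))

def behavioral_comparator(vin, vref=2.5):
    """One-bit digital decision taken from an analog node."""
    return 1 if vin > vref else 0

# 'Simulate' the two-block hierarchy over a slow input ramp (0 to 99 mV):
bits = [behavioral_comparator(behavioral_amp(v / 1000.0))
        for v in range(0, 100)]
```

Any block here could later be swapped for its transistor-level SPICE equivalent without touching the rest of the hierarchy -- which is exactly what makes VLSI-scale mixed signal simulation tractable on inexpensive hardware.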