Verification is a huge topic, and a number of books exist on it. Given the scope of this blog, treatment of the topic has to be limited to a few aspects.
Functionality that does not meet the end specification results in products that don't meet customer expectations. Verification of the design is therefore needed to make sure the specification is met, and that corrective action is taken on designs that do not meet it. If verification fails to catch a bug in the design, the faulty design gets out into the market.
Most verification engineers define coverage metrics. Based on the level of representation, here are a few:
1. Code based metrics (HDL code)
2. Circuit structure based metrics (Netlist)
3. State-space based metrics (State Transition Graphs)
4. Functionality based metrics (User defined Tasks)
5. Spec based metrics (Formal or executable spec)
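To make item 4 concrete, here is a minimal Python sketch of a functionality based coverage collector (a stand-in for what an HVL provides natively). The bin names and address boundaries are hypothetical, chosen only for illustration:

```python
# Minimal sketch of a functional coverage point (hypothetical bins).
class CoverPoint:
    def __init__(self, name, bins):
        self.name = name                 # label for coverage reports
        self.bins = bins                 # {bin_name: predicate(value)}
        self.hits = {b: 0 for b in bins}

    def sample(self, value):
        # Record every bin whose predicate matches the sampled value.
        for b, pred in self.bins.items():
            if pred(value):
                self.hits[b] += 1

    def coverage(self):
        # Fraction of bins hit at least once.
        return sum(1 for h in self.hits.values() if h) / len(self.bins)

# Hypothetical address ranges for a bus transaction.
addr_cp = CoverPoint("addr", {
    "low":  lambda v: v < 0x100,
    "mid":  lambda v: 0x100 <= v < 0xF00,
    "high": lambda v: v >= 0xF00,
})

for v in (0x10, 0x200, 0x250):   # stimulus values seen so far
    addr_cp.sample(v)

print(addr_cp.coverage())        # 2 of 3 bins hit
```

The metric then tells you which user-defined scenarios your vectors have not yet exercised (here, the "high" range).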
There are many branches to the verification of digital systems. Below we list a few of them:
1. Simulation (for digital systems)
2. Advanced formal verification of Hardware [equivalence checking, Assertions, Model Checking]
3. Hardware acceleration (FPGA/emulation), or hardware/software co-design for simulation
Simulation aims to verify a design against its specification. This is achieved by building a computer model of the hardware being designed and executing the model to analyze its behavior. For the model to be of realistic value, it has to include enough information to be accurate; at the same time, it should not consume too much computer memory, and operations on it should not be run-time intensive.
There are numerous levels of abstraction at which simulation can be performed:
1. Device level
2. Circuit level
3. Timing and macro level
4. Logic or gate level
5. Register-transfer level (RTL)
6. Behavioral level
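The difference between levels of abstraction can be seen by describing the same function two ways. Below is a Python stand-in for HDL descriptions of a 2:1 multiplexer (the example itself is hypothetical): one behavioral, one structural at the gate level.

```python
# The same 2:1 multiplexer modeled at two levels of abstraction
# (Python stand-ins for HDL code; inputs are 0/1 integers).

def mux_behavioral(sel, a, b):
    # Behavioral level: state the intent directly.
    return a if sel else b

def mux_gate_level(sel, a, b):
    # Gate level: a structure built from primitive NAND gates.
    nand = lambda x, y: 1 - (x & y)
    n_sel = nand(sel, sel)        # inverter made from a NAND
    t1 = nand(a, sel)
    t2 = nand(b, n_sel)
    return nand(t1, t2)           # (a & sel) | (b & ~sel)

# The two descriptions agree on every input combination.
for sel in (0, 1):
    for a in (0, 1):
        for b in (0, 1):
            assert mux_behavioral(sel, a, b) == mux_gate_level(sel, a, b)
print("models match")
```

The behavioral model simulates faster and is easier to read; the gate-level model carries structural detail (and, in a real netlist, timing) at the cost of simulation speed.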
The specification (for the computer model) of a digital system is usually written at the behavioral or RTL level (we will discuss gate-level simulation later!) :). In addition to the design requirements in the spec, more behavioral or RTL code is written in the form of a wrapper (test bench) around the original design to check whether it meets the design intent. The wrapper logic probes the design with functional vectors, collects the responses, and verifies them against the expected responses.
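The wrapper idea can be sketched in Python as a stand-in for an HDL test bench. The "design under test" here is a hypothetical 4-bit adder, checked exhaustively against a golden reference model derived from the spec:

```python
# Sketch of a self-checking test bench (Python stand-in for HDL).
# The "design under test" is a hypothetical 4-bit adder model.
def dut_adder(a, b):
    return (a + b) & 0xF          # wraps at 4 bits, like the hardware

def reference(a, b):
    return (a + b) % 16           # golden model from the spec

def run_testbench():
    failures = []
    for a in range(16):           # exhaustive functional vectors
        for b in range(16):
            got, want = dut_adder(a, b), reference(a, b)
            if got != want:       # compare response to expectation
                failures.append((a, b, got, want))
    return failures

print("FAIL" if run_testbench() else "PASS")  # prints "PASS"
```

A real test bench adds clocking, reset sequencing, and randomized stimulus, but the structure is the same: drive vectors, collect responses, compare against the expected behavior.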
A simulator has a kernel that processes an input description, applies stimuli to it, and presents the results to the end user, typically in a waveform viewer. Internally it creates models for gates, delays, connectivity, and numerous other variables.
Various logic simulators are available from CAD vendors (ModelSim, ncVerilog, VCS). Most of these simulators combine event-driven and cycle-based mechanisms. They can also handle mixed-language designs (VHDL + Verilog) and adhere largely to the language specifications published by the IEEE standards committees. Some of them are mixed-mode simulators, i.e., they can handle multiple levels of abstraction.
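The event-driven mechanism mentioned above can be sketched as a toy kernel: events (time, net, value) sit in a time-ordered queue, and a value change on a net re-evaluates the gates in its fanout, which schedule new events after their delay. The gate model and 2-unit delay below are simplified assumptions, not how any real simulator is implemented:

```python
import heapq

# Toy event-driven simulation kernel: events are (time, net, value).
class Kernel:
    def __init__(self):
        self.now = 0
        self.queue = []              # min-heap ordered by time
        self.values = {}             # current value of each net
        self.fanout = {}             # net -> gate callbacks to re-evaluate

    def schedule(self, time, net, value):
        heapq.heappush(self.queue, (time, net, value))

    def connect(self, net, gate):
        self.fanout.setdefault(net, []).append(gate)

    def run(self):
        while self.queue:
            t, net, value = heapq.heappop(self.queue)
            self.now = t
            if self.values.get(net) == value:
                continue             # no change: nothing to propagate
            self.values[net] = value
            for gate in self.fanout.get(net, []):
                gate(self)           # gate re-evaluates, may schedule

k = Kernel()

# An AND gate with a 2 time-unit delay driving net "y".
def and_gate(kern):
    a = kern.values.get("a", 0)
    b = kern.values.get("b", 0)
    kern.schedule(kern.now + 2, "y", a & b)

k.connect("a", and_gate)
k.connect("b", and_gate)
k.schedule(0, "a", 1)   # stimulus: a rises at t=0
k.schedule(1, "b", 1)   # stimulus: b rises at t=1
k.run()
print(k.values["y"], k.now)   # y settles to 1 at t=3
```

A cycle-based simulator, by contrast, drops the event queue and per-event delays, evaluating the whole design once per clock cycle, which is faster but less timing-accurate.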
Verification technology has matured over the years, and we now have many mechanisms in place apart from simulation. I will list a few of them below; we will cover each one in the future.
Detection: Simulation, Lint Tools, Semi-formal, Random generators, Formal verification
Debug and comprehension: waveforms, debug systems, Behavior based systems which use formal technology
Infrastructure: Intelligent testbenches, Hardware Verification Languages, Assertions
A good reference for verification is "Writing Testbenches" by Janick Bergeron.