Post on 30-Nov-2015
Verification Principles
Pentium bug: The Intel Pentium chip, released in 1994, produced errors in floating-point division. Cost: $475 million.
Ariane failure: In June 1996, the Ariane 5 rocket exploded about 40 seconds after takeoff when a software component threw an unhandled exception. Cost: $400 million payload.
Therac-25 accident: A software failure caused wrong dosages of X-rays. Cost: human lives.
Rigorous Verification Essential
Bugs are costly!
Verification is the process of investigating a design and demonstrating that it conforms to its specification. It is carried out by applying stimulus to the design; that stimulus is typically generated by writing a testbench.
Verifying a design involves answering two questions:
• Does it work?
• Are we done?
Does it work?
This asks whether the design matches the intent.
Are we done?
This asks whether we have satisfactorily compared the design and the intent, to conclude whether the design does indeed match the intent, or if not, why not.
The process for answering the does-it-work and are-we-done
questions can be described in a simple flow diagram:
[Flow diagram: the design specification feeds a verification plan and the design implementation. The verification plan drives the testbench, which is simulated against the design implementation. If "Does it work?" answers no, the design is debugged and re-simulated; if yes, "Are we done?" is asked. If not done, the stimulus is modified and simulation repeats; when both answers are yes, verification is done.]
A testbench is a virtual environment used to
verify the correctness of a design or model.
A testbench provides several basic functions,
including creating and applying stimulus and
verifying the correct interfacing and responses.
Developing a testbench environment is often the
single most important and time-consuming task
for an advanced verification team.
A testbench connects to the DUT (Design Under Test) and checks the correctness of the DUT.
Basic functionality:
• Generate stimulus data for the DUT
• Apply the data to the DUT
• Receive the response from the DUT
• Check the response for correctness
Example: a testbench driving the inputs A and B of a simple design module (an AND gate) and observing its output OUT.
Truth table:
A B | OUT
0 0 | 0
0 1 | 0
1 0 | 0
1 1 | 1
[Timing diagram: waveforms for A, B, and OUT across the four input combinations.]
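The truth table above can be exercised by a small self-checking testbench. This is a minimal sketch; the module and signal names are chosen for illustration and are not from the original slides.

```systemverilog
// Hypothetical sketch: a minimal testbench for a 2-input AND gate,
// walking through every row of the truth table and checking OUT.
module and_gate (input logic A, B, output logic OUT);
  assign OUT = A & B;
endmodule

module tb_and_gate;
  logic a, b;
  logic out;

  // Instantiate the design under test
  and_gate dut (.A(a), .B(b), .OUT(out));

  initial begin
    // Apply each input combination from the truth table
    for (int i = 0; i < 4; i++) begin
      {a, b} = i[1:0];
      #10;  // let the output settle
      if (out !== (a & b))
        $error("Mismatch: A=%0b B=%0b OUT=%0b", a, b, out);
      else
        $display("OK: A=%0b B=%0b OUT=%0b", a, b, out);
    end
    $finish;
  end
endmodule
```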
Simulation is by far the most prevalent technique used
in functional verification today.
The ability to verify the ideas as well as the
implementation before a device is manufactured saves a
development team time and effort.
Simulation exercises the DUT under a variety of test conditions, including both correct and faulty test inputs.
Testbench developers have been striving to meet the
goals of efficiency, reuse, and flexibility for many years.
Unfortunately, attaining these goals often makes
testbenches more complex to create and more difficult to
use.
Every testbench developer must make a trade-off
between the time and effort to create and use the
testbench versus the potential gain from making the
testbench:
• efficient,
• reusable, and
• flexible
To improve the reusability of a testbench, a developer should focus
on isolating the design-specific information in the testbench and
separating the functionality of the testbench.
Knowing how many cycles after the stimulus the response appears, or observing internal design signals to help predict the expected result, can simplify testbench creation.
Most advanced verification teams find that small internal blocks
with nonstandard interfaces are the worst candidates for reuse.
Testbench reuse is most advantageous at the subsystem level,
where interfaces are more standard and the testbench components
more complex.
To improve the efficiency of a testbench, a developer
should abstract design information to a higher level.
The testbench should represent data and actions in a
format most easily understood by those using the
testbench.
Low-level implementation details that are irrelevant to
the test should not be specified.
Throughout the testbench, data should be captured and
compared at a higher level of abstraction to make
debug easier.
The developers should focus on utilizing standard interfaces to
facilitate multiple different uses.
Developers should not be trapped into using only a limited set
of options for tools and processes associated with the testbench.
Advanced verification teams develop their testbenches
independent of the tools or languages used.
The environment or the language used should not dictate the architecture of the testbench.
These teams make sure their testbench is adaptable so that they can
easily switch tools or technologies without changing the testbench
architecture.
[Testbench architecture diagram: a stimulus generator drives master and slave transactors, which translate between transactions and signals at the interfaces of the design under verification. Interface monitors observe each interface, and response checkers compare the design's responses against expected results.]
Stimulus Generators
Stimulus generators create the data that the testbench uses to stimulate the design.
Stimulus generators can create the data in a preprocessing mode, with custom scripts or capture programs, or they can create the data on the fly as the simulation occurs.
Stimulus generators are usually classified by the degree of control the test writer exerts over the generation of the stimulus.
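As a sketch of on-the-fly generation, a constrained-random transaction class can serve as a simple stimulus generator. All names here (`bus_txn`, `gen_stimulus`) and the constraints are illustrative assumptions, not part of any particular methodology library.

```systemverilog
// Hypothetical sketch: an on-the-fly stimulus generator built from a
// constrained-random transaction class.
class bus_txn;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  rand bit        write;

  // Constrain generation: keep addresses in range, bias toward writes
  constraint c_addr  { addr inside {[8'h00:8'h3F]}; }
  constraint c_write { write dist {1 := 7, 0 := 3}; }
endclass

module gen_stimulus;
  initial begin
    bus_txn txn = new();
    repeat (5) begin
      if (!txn.randomize()) $fatal(1, "randomize failed");
      $display("txn: addr=%0h data=%0h write=%0b",
               txn.addr, txn.data, txn.write);
    end
  end
endmodule
```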
Transactors
Transactors change the levels of abstraction in a testbench.
The most common use is to translate from
implementation-level signaling to a higher level
transaction representation or the reverse.
Transactors are placed in a testbench at the interfaces of the design, providing a transaction-level interface to the stimulus generators and the response checkers.
Transactors can behave as masters initiating activity with the design, as slaves responding to requests generated by the design, or as both a master and a slave.
The design of a transactor should be application-independent to facilitate maximum reuse.
Also, when developing a transactor, the designer
should consider its use in a hardware accelerator.
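A master transactor might be sketched as follows: it accepts a transaction-level request and translates it into cycle-level signaling on a virtual interface. The interface and its signals (`simple_bus_if`, `valid`, `addr`, `data`) are hypothetical.

```systemverilog
// Hypothetical sketch: a master transactor translating a transaction-
// level request into implementation-level signaling.
interface simple_bus_if (input logic clk);
  logic        valid;
  logic [7:0]  addr;
  logic [31:0] data;
endinterface

class bus_master_xactor;
  virtual simple_bus_if vif;

  function new(virtual simple_bus_if vif);
    this.vif = vif;
  endfunction

  // Transaction in, pin activity out: one cycle of valid/addr/data
  task drive(bit [7:0] addr, bit [31:0] data);
    @(posedge vif.clk);
    vif.valid <= 1'b1;
    vif.addr  <= addr;
    vif.data  <= data;
    @(posedge vif.clk);
    vif.valid <= 1'b0;
  endtask
endclass
```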
Interface Monitors
Interface monitors check the correct signaling and protocol of
data transfers across design interfaces.
In some testbenches, interface monitors are combined either
with the transactors or with the response checkers. Keeping
interface monitors separate from these components allows for
maximum reuse of the monitors.
The interface monitors should be application-independent and written in a manner that allows their easy reuse in hardware acceleration.
Response Checkers
Response checkers verify that the data responses received
from the design are correct.
Response checkers contain the most application-specific
information in the testbench and usually can only be reused
when the block they are monitoring is being reused.
There are three basic types of response checkers:
Reference-model response checkers apply the same stimulus the design receives to a model of the design and verify that the model's response is identical to the design's response.
Scoreboard response checkers save the data as it is received by
the design and monitor the translations made to the data as it
passes through the design.
Performance response checkers monitor the data flowing into and out of the design and verify that the correct response characteristics are being maintained. These checkers verify characteristics of the response rather than the details of the response.
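A scoreboard response checker can be sketched as a queue of predicted results that is compared against actual responses. The class below is an illustrative, simplified sketch, not a complete methodology component.

```systemverilog
// Hypothetical sketch of a scoreboard response checker: expected
// results are queued as stimulus enters the design and compared
// against actual responses as they emerge.
class scoreboard #(type T = bit [31:0]);
  T expected_q[$];   // queue of predicted responses

  // Called from the stimulus path: save the predicted output
  function void add_expected(T exp);
    expected_q.push_back(exp);
  endfunction

  // Called from the response path: compare actual against prediction
  function void check_actual(T act);
    T exp;
    if (expected_q.size() == 0) begin
      $error("Unexpected response: %0h", act);
      return;
    end
    exp = expected_q.pop_front();
    if (act !== exp)
      $error("Mismatch: expected %0h, got %0h", exp, act);
  endfunction
endclass
```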
Verification tests
Many verification teams separate the creation of the testbench
from the creation of the test stimulus, because the two tasks
are very different and require different skills.
The two basic types of tests written today are
• Directed Test
• Random Test
Directed Tests
Directed tests specify the exact type and sequence of
stimulus to provide to the design.
A directed test tests a specific function in a consistent,
thorough, and predictable way.
Using directed tests, you can incrementally test function
after function and build up a thorough regression suite that
can be used to re-verify that function if the design changes.
The disadvantages of directed tests are that they require
detailed knowledge of the design being tested and are often
very difficult to set up.
The time required to write these tests might not be feasible
for the development schedule.
There are several types of directed tests:
• Interface test
• Stress test
• Feature test
• Error test
  • Recoverable error test
  • Non-recoverable error test
• Performance test
Random Tests
Random tests allow for the automatic creation of many
cycles of stimulus with limited knowledge of the design
required.
Random tests automatically select part or all of the
information for the test, including type, sequence, and timing
of the stimulus.
One random test can verify many functions, so fewer random
tests are required than directed tests.
The disadvantage of random tests is that it is difficult to know what the random test has verified.
Even if you can determine which functions have been verified, it is often difficult to consistently repeat the test for regression purposes.
Random tests are also very difficult to debug.
Most advanced verification teams use a combination of
random and directed tests.
Introduction: What is an assertion?
An assertion is a precise temporal description of some
specific behavior of a design.
An assertion’s sole purpose is to ensure the consistency
between the designer’s intention, and what is implemented.
Assertions describe expected or unexpected conditions to be verified in the design.
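For example, a simple temporal requirement, that every request is granted within one to three clock cycles, can be captured as a concurrent assertion. The signal names (`req`, `gnt`) are hypothetical.

```systemverilog
// Hypothetical sketch: a concurrent assertion for a simple temporal
// behavior -- every request must be granted within 1 to 3 cycles.
module arbiter_props (input logic clk, rst_n, req, gnt);
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:3] gnt;
  endproperty

  // Fires whenever a request is not granted in time
  assert property (p_req_gets_gnt)
    else $error("req not granted within 3 cycles");
endmodule
```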
Why are SystemVerilog assertions important?
It’s a verification technique that is embedded in the language
Gives “white box” visibility into the design
Enables specifying design requirements with assertions
Can specify design requirements using an executable
language
Enables easier detection of design problems
In simulation, design errors can be automatically detected
Formal analysis tools can prove that the model functionality
does or does not match the assertion
Enables constrained random verification with coverage
Assertions can be used to report how effective random
stimulus was at covering all aspects of the design
Assertion-based verification
Design intent is expressed using assertions
Simulation is done as usual
•Assertions find more bugs faster
•Assertions isolate the source of the problem
[Diagram: the testbench drives the module under test while an assertion monitor observes it. Assertions are extracted from the specification; the monitor must have access to the interface signals and is attached to the design via an interface bind.]
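The `bind` construct attaches an assertion monitor to the design without editing its source. A minimal sketch, assuming a hypothetical `fifo` module with `full` and `wr_en` signals:

```systemverilog
// Hypothetical sketch: binding an assertion monitor to a design so
// the monitor has access to its interface signals.
module fifo_monitor (input logic clk, full, wr_en);
  // A FIFO must never be written while full
  assert property (@(posedge clk) !(full && wr_en))
    else $error("write while FIFO full");
endmodule

// Instantiates fifo_monitor inside every instance of module fifo,
// connecting the monitor ports to the fifo's signals.
bind fifo fifo_monitor u_mon (.clk(clk), .full(full), .wr_en(wr_en));
```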
ABV coverage usage
[Diagram: within the verification environment, assertions and a coverage monitor are connected to the DUT through its interface; the assertions report match/fail results, and the coverage monitor reports statistics.]
Assertions provide an unambiguous, formal specification of the design.
Assertions catch design bugs early.
During simulation, assertions attempt to provide functional coverage of the specification:
• they measure the uncovered space
• they can also guide the simulation
Assertions also support formal and semi-formal analysis.
Basic Coverage-Verification
In industry, verification consists of massive amounts of random tests.
Random test generators are used to produce this stimulus, and advanced random generators can improve the quality of the tests by hitting interesting or rare cases.
Random testing alone, however, cannot detect areas that are never tested while others are tested repeatedly.
Coverage analysis
Coverage analysis is a technique for showing that testing has been thorough: create a list of tasks and check that each task was covered.
Resources are then steered to the low-coverage areas.
Functional-Coverage verification
Functional coverage focuses on the functionality of the design and is implementation specific:
• coverage spaces are defined manually
• coverage is difficult to measure
As a result, functional coverage is rarely treated as an inseparable part of the verification flow.
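A manually defined functional-coverage space is typically expressed as a covergroup. The following is an illustrative sketch; the opcode encoding, bins, and signal names are assumptions:

```systemverilog
// Hypothetical sketch: a manually defined functional-coverage space
// for a command stream, sampled on each clock edge.
module cmd_coverage (input logic clk,
                     input logic [1:0] opcode,
                     input logic [7:0] addr);
  covergroup cmd_cg @(posedge clk);
    cp_op : coverpoint opcode {
      bins read  = {2'b00};
      bins write = {2'b01};
      bins burst = {2'b10};
    }
    cp_addr : coverpoint addr {
      bins low  = {[8'h00:8'h7F]};
      bins high = {[8'h80:8'hFF]};
    }
    // Cross coverage: every opcode seen in every address region
    op_x_addr : cross cp_op, cp_addr;
  endgroup

  cmd_cg cg = new();
endmodule
```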
Coverage-Driven verification
Coverage-driven verification incorporates functional coverage as the core engine that drives the verification flow.
[Flow diagram: study of the architecture and microarchitecture leads to a coverage plan; the plan drives instrumentation of coverage tasks and test development; simulation then feeds coverage collection, coverage analysis, and bug analysis, which in turn refine the plan.]
Many corner cases can be easily found by random or directed-random tests, but there remains uncertainty regarding what the tests have covered. The alternatives are to craft tests manually, which is usually impractical for most designs, or to rely on large amounts of random tests that contribute very little.
Coverage-driven verification is also inapplicable at the very beginning of the verification flow:
• detailed knowledge of the design is still missing
• the focus is on bug detection and logic cleanup
• the RTL is unstable, buggy, and not complete
A robust verification environment results in a robust design. SystemVerilog's constrained-random stimulus features help in developing a robust verification environment, and the methodologies based on SystemVerilog help in creating reusable environments.
To know more about us, visit: http://www.kacpertech.com