Verification Methodology
Ming-Hwa Wang, Ph.D.
COEN 207 SoC (System-on-Chip) Verification
Department of Computer Engineering
Santa Clara University

Topics
• Vision and goals
• Strategy
• Verification environment
• Verification plan and execution
• Automation
• Resources

Vision and Goals
• Succeed on the first tapeout, or reduce the number of re-spins to the very minimum
  • Most companies can justify at most 1 re-spin for a new design (time-to-market/cost: 2 tapeouts)
  • Most integrations of proven IPs can be done at the first tapeout
• Shorten time to market

Responsibilities
Responsibilities and cooperation among 3 teams
• Architecture team responsibility
  • has to make sure the model is a correct executable specification and covers all functionalities of the design
• RTL design team responsibility
  • does the RTL implementation according to the model or the specifications
• Verification team responsibility
  • verifies the RTL is the same as the model

The Definition of Verification
• Design verification: verify the RTL implementation is the same as the intention (which is not the same as the golden model or the specifications)

  [Figure: intention vs. model/specification vs. implementation — equivalent?]

Levels of Verification
• Block-level verification, integration verification, chip-level verification, and system-level verification

Design vs. Verification
• Verification = infrastructure/methodology + product understanding + monitoring
• Verification crisis: complexity grows exponentially, re-spin cost, missed time-to-market, lost customers, etc.
• Separation of design and verification
  • Designers doing verification on their own designs tend to neglect/miss non-working cases, or misinterpret the specs
  • Intentionally create two different views (one from the designer, another from the verification engineer) of the same specs; if an inconsistency exists, find out why (specs wrong, implementation wrong, or verification wrong) and fix it
  • Designers can still do verification for other blocks not designed by the same persons
• Ratio of the number of verification engineers to the number of design engineers
  • At least 1:1 at the block/unit level
  • More verification engineers needed for sub-chip and whole-chip verification (including coverage)
  • Extra verification methodology/infrastructure/tool engineers are needed
  • Total ratio at least 2:1; many companies now have more than a 2:1 ratio because complexity grows exponentially

The Vision
• Build an independent verification team in order to verify that the RTL implementation is the same as the intention, the model, or the specifications
• Provide a consistent and generic verification environment to support multiple SoC projects and reuse among design teams
• Automate most of the verification flow to increase productivity, coverage, and quality of the verification tasks
• Work in parallel and aligned with the HW team

Goals to Achieve

• Basic goals (verification only)
  • Make sure the RTL implementation is "almost" the same as the golden model
  • Achieve predefined coverage on time
• Enhanced goals (verification + modeling)

Strategy

Basic Verification Strategy (verification only)
• Use SystemVerilog and OVM to build a consistent verification environment, and use scripts to provide a generic and automated verification flow
• Use assertions to quickly increase the quality of the design
• Use coverage and constrained-random test generation to achieve high coverage, with the remaining corner cases verified by directed tests
• Use DPI to integrate high-level models efficiently
• Use skeleton testbenches and test APIs (transactions, sequences, etc.) to increase productivity
• Use revision control, configuration, regression, and bug tracking to reproduce problems and monitor the progress of fixes
• Use LSF to fully utilize computing/license resources, and use parallel test processing and FPGA/emulation to speed up verification

Enhanced Strategy (model-driven methodology)
• Use a model-driven methodology for concurrent architecture, HW, and SW development, verification, and validation
• Use performance modeling for architecture exploration, performance analysis, and HW/SW partitioning
• Use a functional or TLM model as the golden reference for HW/SW development and verification
• Use a mix-and-match flow (with co-simulation) to gain visibility while preserving high run-time efficiency
• Break the design/verification/backend dependencies: parallel design and verification efforts, use AFE models, hack sdf/RTL for GLS, etc.

Verification Environment

The V Model for Verification and Validation
  [Figure: the V model — requirements ↔ acceptance test (validation); system specification ↔ system test & integration; subsystem specification ↔ subsystem test; program specification ↔ program test; module specification ↔ unit test; code at the bottom; verification links each specification to its test level]

Time to Market
• conventional iterative serial design process:
  architecture → hardware → software → system integration
• parallel/concurrent design process:
  architecture → (hardware ∥ software) → system integration

Test Layers
• Scenario layer: application and scenario test-case generation and checking, e.g., test generators
• Functional layer: sequence of transaction and stimulus generation, self-checking, e.g., transactors, self-checking code
• Command layer: protocol and interface base commands and functional model, e.g., driver, monitor
• Signal layer: interface to HDL, e.g., DUT

Example Mix-and-Match Models
• software model: e.g., all high-level models
  [Figure: high-level block models 1, 2, 3, 4]
• hybrid model: e.g., replace one block model by RTL
  [Figure: high-level block models 1, 2, 4 with RTL block 3]

Typical Testbench

  [Figure: stimulus creation → BFM → DUT → BFM → output checking, with a reference model, monitors, and scoreboards for checking]

• Vera language and VMM
• Cadence Specman e language and its CDV environment
• InFact
• Verilog testbench with scripts to automate constrained-random stimulus generation, high-level to low-level test translation, and Verilog reference monitors and scoreboards for checking
• Open-source, free C++ environment – Teal
• SystemC and its verification library (SCV)
• SystemVerilog and OVM

Coverage-Driven Verification Environment
• Advantages
  • Fully automated process – testbench automation
  • Cost and benefit tradeoff made by upfront coverage goals
  • Reusable broad-spectrum corner-case stimulus
  [Figure: coverage-driven environment — test APIs and a constrained-random test generator drive transactions into the DUT; input/output monitors feed a scoreboard/checker and a coverage collector; logging produces pass/fail and log files; coverage results feed back into directed tests until the 100% coverage tapeout call]
  [Figure: coverage vs. time for hand-coded direct tests, pure random, and constrained-random generation in a built environment — constrained random reaches the coverage goal soonest]

Verification Description Languages (VDL)
• Vera
  • A design verification language
  • need to build the infrastructure
• Specman
  • A coverage-driven verification environment
  • long learning curve for the "e" language
• SystemC and its verification library (SCV)
  • SystemC is based on C++ and moved from a high-level language downward to support functional modeling (TLM), verification (SCV), and even implementation (not encouraged, though)
  • Difficult to be accepted by the design community
• SystemVerilog
  • Based on Verilog and grows upward to support system-level modeling, with emphasis on verification
  • A hardware design and verification language (HDVL)

Coverage-Driven Verification
• CDV combines the following aspects to reduce the time spent verifying a design
  • Automatic stimulus generation
  • Self-checking testbenches
  • Coverage metrics
• An iterative process
  1. Start with simple directed tests to bring up the design
  2. Run fully random test generation and simulations
  3. Collect coverage statistics and adjust constrained-random test generation to generate corner cases
  4. Run simulation and go to step 2, until an iteration no longer increases coverage enough
  5. Hand-code the corner tests not covered by the previous steps
• Ways to implement CDV
  • Tools from EDA vendors – vendor lock-in?

Verilog or Object-Oriented
• Designers prefer Verilog to an object-oriented language
• Verilog
  • Pros – designers can jump in easily to help on verification
  • Cons – Verilog is a hardware description language (HDL), but not good for verification, which needs a verification description language (VDL)
• Object-oriented environment
  • Pros – good for complicated projects, using a layered verification environment that facilitates reusability
  • Cons – tremendous training cycle for designers, and longer up-front development time is needed

SystemVerilog Randomization

class Bus_trans;
  static int next_id;
  const int id;
  rand bit [2:0] dir;    // read/write
  rand bit [15:0] addr;
  rand bit [31:0] data;
  constraint addrIsEven { addr[0] == 0; }  // word alignment
  function new;
    id = next_id++;
  endfunction : new
  function void print;
    …
  endfunction : print
endclass : Bus_trans

Bus_trans tR;
initial begin
  repeat (3) begin
    tR = new;
    void'( tR.randomize() );
    tR.print();
  end
end

• random variables: rand and randc
• constraint mechanisms
  • call "randomize() with"
  • distribution/weight control: dist, randcase .. endcase
  • conditional randomization: if .. else, implication (->), case .. endcase
  • repetition: repeat, foreach
  • interleaving: rand join
  • scoping: static
  • ordering: solve .. before
  • layering: inheritance and override
• examples of using randomization
  • unconstrained random values
      std::randomize();
  • constrained random values
      randomize(addr) with { addr > 8'h0f; };
      randomize(addr) with { a > b; b[0] == 0; };
  • constraint expression in a class
      rand int i;
      constraint i_small_even { i inside { 0, 2, 4, 6, 8 }; }
  • random value distribution in a class
      rand int i;
      constraint c { i dist { [0:2] := 1, 3 := 2, 4 := 3 }; }
      // probability of: 0 is 1/8, 1 is 1/8, 2 is 1/8, 3 is 1/4, 4 is 3/8

Code Coverage
• Line/block/segment coverage
  • If all lines, blocks, and segments are reached
• Branch coverage
  • If all branches are reached
• Expression coverage
  • If all possible legal Boolean values of the expression are reached
• Toggle coverage
  • Which bits in the RTL are toggled – for power analysis
• FSM coverage
  • If all states and possible transitions are reached
• Path coverage
  • If all possible paths/traces are reached

Functional Coverage
• A coverage point is a Boolean function f(x1, …, xn) of a given set of design variables x1, …, xn representing a specific valuation of the set. The variables are sampled; if they have the valuation as specified, the function is said to evaluate to TRUE. Functional coverage is expressed as the fraction of the functions evaluating TRUE out of all such defined functions.
• Functional coverage is user-specified in order to tie the verification environment to the design intent or functionality
• The coverage metric to be used is determined by the design type: control dominated, datapath, interactions, and scope (block vs. system)
• Measures how much of the design specification, as enumerated by features in the test plan, has been exercised
  • Interesting scenarios
  • Corner cases
  • Specification invariants
  • Other applicable design conditions
• SystemVerilog functional coverage constructs:
  • coverage group
  • coverpoint and cross coverpoint (w/ optional iff())
  • coverage bins
    • values, ranges ":", list ",", default
    • transition "=>" and sequences: consecutive repetition "*", nonconsecutive repetition "=" or "->", default sequence
    • wildcard bins, ignore_bins, illegal_bins, binsof, intersect, "!", "&&", "||", open range "$"
  • coverage options: weight (default 1), goal (default 90%), name, comments, at_least (default 1), detect_overlap (default false), auto_bin_max (default 64), cross_num_print_missing (default 0), per_instance (default false)
  • coverage type options: weight (default 1), goal (default 100%), comments, strobe (default false)
• Functional coverage control
  • Predefined methods: sample(), get_coverage(), get_inst_coverage(), set_inst_name(), start(), stop()
  • Predefined system tasks and functions: $set_coverage_db_name(), $load_coverage_db(), $get_coverage()
• Functional coverage calculation
  • Coverage of a coverage group: Cg = (Σi Wi · Ci) / (Σi Wi)
  • Coverage of a cross item: Ci = |bins covered| / (Bc + Bu), where Bc = (Πj Bj) − Bb

class Instruction;
  bit [2:0] opcode;
  bit [1:0] mode;
  bit [10:0] data;
  covergroup cg @ (posedge clk);
    coverpoint opcode;
    coverpoint mode {
      bins valid[] = { 2'b01, 2'b10 };  // one-hot
      illegal_bins invalid = default;
    }
    coverpoint data;
    cross opcode, mode;
  endgroup
  function new;
    cg = new;
  endfunction
endclass : Instruction

Instruction instr = new;
real cov;
initial begin
  repeat (100) @ (posedge clk) begin
    instr.cg.sample();
    cov = instr.cg.get_coverage();
    if ( cov > 90.0 ) instr.cg.stop();
  end
end

  [Figure: coverage (50%…100%) vs. time — formal analysis reaches high coverage faster than vector-based simulation]

Assertion-Based Verification (ABV)
ABV should emphasize the block level (by designers) with complex temporal properties, rather than the system level (by verifiers).
The structured approach to adding assertions to a design:
1. Identify assertion candidates – module ports, control signals, etc.
   • assertion metrics: total number of assertion candidates, percentage of high/med/low assertions
2. Code assertions
   • challenges: the user must abstract and code an accurate property; the SVA language is not intuitive; temporal expression operators are complex to visualize; the skills required are difficult and costly to develop; evaluation of open-ended delay expressions must be prevented
3. Debug assertions
   • challenges: debugging an unproven assertion is a major time sink; the user must develop a testbench to validate the assertion
4. Implement assertion control
   • control knobs: severity and level, messaging to the testbench, reuse: find files, documentation
   • challenges: the user must package and control assertions in a consistent manner; several types of assertion control are required (disabling assertions, changing severity levels, changing coverage levels, integration with testbench components); assertion expression visualization
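As a worked instance of the group-coverage formula above — the two weights and per-item coverages here are hypothetical numbers chosen only to illustrate the weighted average:

```latex
C_g \;=\; \frac{\sum_i W_i\,C_i}{\sum_i W_i}
     \;=\; \frac{1 \cdot 80\% \;+\; 3 \cdot 60\%}{1 + 3}
     \;=\; \frac{260\%}{4}
     \;=\; 65\%
```

A heavily weighted coverpoint thus dominates the group score, which is why the weight option defaults to 1 unless a point deserves emphasis.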

Self-Checking
• Since random test generation can generate stimulus randomly, an automatic monitor and self-checking mechanism is necessary to decide whether a test passes or fails
• Checking should be done as
  • temporal/timing checks by assertions – correct sequence
  • data value checks by a reference model – valid results
• Checkers ensure functional correctness by using assertions or procedural code

Assertion and Property Checking
• Assertion languages
  • Property Specification Language (PSL) – based on IBM Sugar
  • OpenVera assertions (OVA)
  • SystemC Verification (SCV)
  • SystemVerilog assertions (SVA)
• Property
  • Boolean-valued expression

  • Static property checking
  • Dynamic property checking
• Assertions
  • Directive to a verification tool that the tool should attempt to prove/assume/count a given property

Formal Method
• Vector-based dynamic verification needs a testbench and test cases, and thus takes significant initial time
• Static formal verification is about applying "constraints" on inputs by "assume", and checking behavior by "assert" assertions to see whether they are "triggered" (as well as functional coverage); thus it can be started as soon as you have specs in hand
• Slow running time, applicable to block/unit-level verification, but maybe not to full-chip verification
• Rule of thumb: 80% RTL code, 20% assertions; to make it precise, use assertion coverage
• Extreme programming:
  • No implementation will be performed until a testcase exists for the planned implementation
  • Nobody programs or tests alone! Always take a buddy!

Property vs. Assertion
• A property is a boolean expression built from:
  • (System)Verilog expressions
  • Property operators
    • Implication (|->, |=>), $rose(), $fell(), $past(), sequences (##), concatenation/repetition, operators (or, and, intersect, throughout, within)
• An assertion is a statement that a particular property is required to be true, and thus a tool should attempt to prove it

Concurrent Assertions
• Concurrent assertions = instructions to verification tools
  • assert property ( aProperty ) actionBlock;
  • cover property ( aProperty ) statement;
  • assume property ( aProperty );
• Concurrent assertions can't be in classes
• Sequences

  property nonZero;
    (address > 0);
  endproperty
  assert property (nonZero);

  assert property (
    @(posedge clk) a ##1 b |-> d ##1 e
  );

Properties
• A property is a fact about your design
• Where to put properties?
  • Embedded in the HDL code
  • Properties in a separate file, then bound to modules
• Properties are used to write
  • Simulation checkers
  • Constraints defining legal simulation input vectors
  • Assertions to be proved by static formal property checkers
  • Assumptions made by static formal tools
  • Functional coverage points

Binding Properties to Scopes or Instances
• Example of binding two modules:

  module cpu (a, b, c);
    …
  endmodule

  module cpu_props (a, b, c);
    …
  endmodule

  bind cpu cpu_props cpu_rules_1 (a, b, c);

  • cpu and cpu_props are the module names
  • cpu_rules_1 is the cpu_props instance name
  • ports (a, b, c) get bound to the signals (a, b, c) of the module cpu
  • every instance of cpu gets the properties
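Returning to the self-checking requirement above, the data-value check against a reference model is commonly implemented as a scoreboard. A minimal sketch follows; all names (Scoreboard, write_expected, check_actual) are illustrative, not from any specific library:

```systemverilog
// Minimal self-checking scoreboard sketch: the reference model pushes
// expected transactions, the output monitor checks actual ones in order.
class Scoreboard #(type T = int);
  T   exp_q[$];    // queue of expected transactions, in order
  int errors;

  // called by the reference model for each predicted transaction
  function void write_expected(T t);
    exp_q.push_back(t);
  endfunction

  // called by the output monitor for each observed transaction
  function void check_actual(T t);
    T exp;
    if (exp_q.size() == 0) begin
      $error("scoreboard: unexpected transaction");
      errors++;
    end else begin
      exp = exp_q.pop_front();
      if (exp != t) begin
        $error("scoreboard: mismatch, expected %0p got %0p", exp, t);
        errors++;
      end
    end
  endfunction
endclass
```

The temporal/timing half of the check (correct sequencing) stays with concurrent assertions like the ones shown above; the scoreboard only judges data values.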

Assertions
• Capture the design intent more formally and find specification errors earlier
• Find more bugs, and the source of the bugs, faster
• Static property checkers
• Encourage measurement of functional coverage and assertion coverage
• Re-use checks throughout the life-cycle; strengthen regression testing
• Assertion failure severity
  • $info, $warning, $error (the default), $fatal

Immediate Assertions
• Immediate assertions = instructions to a simulator
• Immediate assertions can be in classes
• Syntax: assert ( expression ) [pass_statement] [ else fail_statement ]

  always @ (posedge clk) begin : checkResults
    // 'output' is a Verilog keyword, so the checked signal is named out_val
    assert ( out_val == expected ) okCount++;
    else begin
      $error("Output is incorrect");
      errCount++;
    end
  end

Direct Programming Interface (DPI)
• Two separate layers: the SystemVerilog layer and the foreign-language layer
  • Both sides of the DPI are fully isolated
  • No language compiler is required to analyze the source code in the other language
  • The memory spaces owned and allocated by foreign code and SystemVerilog code are disjoint
• SystemVerilog defines a foreign language only for the C programming language (use 'extern "C"' to mix C++ code into C DPI)
• Pure and context functions
• Name resolution: global symbols and escape sequences

A Unified SystemVerilog Environment
• A unified SystemVerilog environment
  • Use SystemVerilog to do functional/reference modeling and RTL design
  • Use SystemVerilog to write testbenches
  • Use SystemVerilog assertions to check design intent
  • Use SystemVerilog to do coverage-driven verification
  • Use SystemVerilog DPI to integrate high-level models
• Trend
  • SystemVerilog is the trend in design verification
• Concerns
  • Not all features of SystemVerilog are supported by EDA tools
  • Need more licenses for SystemVerilog
  • Need SystemVerilog training
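A minimal sketch of the DPI import described in the DPI section above; the C function c_add and its body are assumed for illustration only:

```systemverilog
// SystemVerilog side of a DPI-C link. The C side is compiled separately
// and linked into the simulation, e.g.:
//   int c_add(int a, int b) { return a + b; }
import "DPI-C" function int c_add(int a, int b);

module tb_dpi;
  initial begin
    int sum;
    sum = c_add(2, 3);                 // call crosses into C transparently
    $display("c_add(2,3) = %0d", sum);
  end
endmodule
```

Note how the import declaration is the only thing the SystemVerilog compiler needs — consistent with the bullet above that no compiler analyzes the other language's source.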

Open Verification Methodology (OVM)
• Open, interoperable SystemVerilog verification methodology
• Transaction-level modeling (TLM) interfaces for reusability
• OVM library with built-in automation for productivity
  • TLM communication
  • Phasing and execution management
  • Testbench/environment customization
• Top-down configuration scheme for scalability
• Layered sequential stimulus and test-writing interface
• Common message reporting and formatting interface
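The top-down configuration scheme listed above can be sketched as follows, assuming the OVM class library; the component and field names (my_driver, burst_len, my_test) are illustrative:

```systemverilog
// Sketch: a registered field configured top-down via a wildcard path.
import ovm_pkg::*;
`include "ovm_macros.svh"

class my_driver extends ovm_driver;
  int burst_len = 1;
  `ovm_component_utils_begin(my_driver)
    `ovm_field_int(burst_len, OVM_ALL_ON)  // registration makes it configurable
  `ovm_component_utils_end
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
endclass

class my_test extends ovm_test;
  `ovm_component_utils(my_test)
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
  function void build();
    super.build();
    // top-down: every my_driver below this scope picks up burst_len = 8
    set_config_int("*", "burst_len", 8);
  endfunction
endclass
```

Because fields are registered with `ovm_field_* macros, the configuration is applied automatically during build, without each component writing its own lookup code.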

  [Figure: OVM stack — an open methodology: the OVM class library on SystemVerilog, underneath VIP & the verification environment]

SystemVerilog TLM
• TLM IP design and verification flow – moving verification up to where it is more efficient, optimizing effort with a metric-driven verification plan
• Comprehensive TLM-driven design and verification – enables the next leap in design productivity
  1. Based on industry standards – SystemC and OVM
  2. Moves the golden model up to the TLM level – easier to write/read/debug, links the virtual platform to the implementation flow
  3. Metric-driven verification reduces overall effort – verify more functionality at faster TLM speeds, reuse the testbench to regress the lower-level RTL
  4. Enables multi-language verification
• TLM design and verification advantages – as reported by customers
  1. ~3x more efficient/compact designs
  2. 5x-10x faster simulation/debug
  3. Equal/better QoR than manual approaches
  4. 10x+ productivity
• OVM enables multi-language verification – reduces risk and unifies SoC verification teams
  • Single methodology for e, SystemC, and SystemVerilog
  • Unifies VIP for reuse
  • Choose the best language for the application and group skills
  • Frees companies to choose the best verification solution
• Pros
  • The OVM provides a library of base classes (on top of SystemVerilog) that allows users to create modular, reusable verification environments in which components talk to each other via standard transaction-level modeling (TLM) interfaces
  • Multi-language VIP plug and play in SV, e, or SC
  • Built-in automation and a powerful configuration scheme – easy to integrate and configure verification components into a powerful testbench
  • Common message reporting and formatting interfaces for transaction automation
• Cons
  • Training to ease the learning curve
  • Design engineers' acceptance chance is moderate
  • Development time and resource requirements

Reusability
• Transaction-level modeling (TLM)
  • Blocking interfaces: put(), get(), peek()
  • Non-blocking interfaces: try_put(), try_get(), try_peek()
  • Port/export connection: one-to-one mapping
  • Analysis communication – write(): one-to-one or one-to-many mapping
  • SystemVerilog mailbox: one-to-one, one-to-many, or many-to-many mapping
• Top-down or hierarchical configuration
  • register data members using `ovm_*_utils macros
  • set_config_int(<inst>, <field>, <value>);
  • set_config_string(<inst>, <field>, <value>); e.g., set_config_string("<path>", "default_sequence", "<sequence name>");
  • set_config_object(<inst>, <field>, <object>);
  • wildcards can be used in <inst>
• factory
  • factory: an object-oriented creational design pattern
    • creating objects without specifying the exact class of object that will be created
    • a separate method for creating the objects, which subclasses can then override to specify the derived type of objects that will be created
  • create()
  • get_type_name()
  • set_type_override_by_type(<original type>, <override type>);
  • set_inst_override_by_type(<inst>, <original type>, <override type>);
• command-line arguments
  • +OVM_TESTNAME
  • run multiple tests w/o having to recompile or re-elaborate
• use grab() and ungrab() to achieve full control of the underlying drivers for a limited time; we can use them to handle interrupts
• Utility functions
  • Register with `ovm_object_utils and `ovm_field_*
  • Utility functions: randomize (by SV), create, print, clone, compare, copy, pack, get_type_name, record

(Partial) OVM Class Hierarchy
  [Figure: partial OVM class hierarchy]

Module-Based SystemVerilog/OVM Verification Environment
• wrap Verilog modules with a SystemVerilog wrapper
• take advantage of SystemVerilog features
• not much reusability

• `ovm_field_* flags (concatenated with +):
  • OVM_ALL_ON
  • OVM_COPY / OVM_NOCOPY
  • OVM_COMPARE / OVM_NOCOMPARE
  • OVM_PRINT / OVM_NOPRINT
  • OVM_DEEP / OVM_SHALLOW / OVM_REFERENCE
  • OVM_BIN, OVM_DEC, OVM_OCT, OVM_HEX (the default)
  • OVM_STRING
  • OVM_TIME
  • OVM_UNSIGNED
• `message(<verbosity>, <message>);
  • verbosity: OVM_NONE, OVM_LOW (the default), OVM_MEDIUM, OVM_HIGH, OVM_FULL
  • output: [

Class-Based SV/OVM Verification Environment

OVM Agents
• configurable – the is_active flag
  • master agent: is_active is 1; contains instances of the sequencer, driver (or BFM), and monitor
  • slave agent: is_active is 0; contains a monitor only
• monitor
  • collects information from the DUT
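The master/slave agent split described above can be sketched as follows, assuming the OVM library; the sub-component classes (my_monitor, my_sequencer, my_driver) are illustrative placeholders:

```systemverilog
// Sketch of a configurable agent: the monitor is always built, while the
// sequencer/driver pair exists only when the agent is active (master).
import ovm_pkg::*;
`include "ovm_macros.svh"

class my_agent extends ovm_agent;
  ovm_active_passive_enum is_active = OVM_ACTIVE;
  my_monitor   monitor;     // always present (also the whole slave agent)
  my_sequencer sequencer;   // master agent only
  my_driver    driver;      // master agent only
  `ovm_component_utils_begin(my_agent)
    `ovm_field_enum(ovm_active_passive_enum, is_active, OVM_ALL_ON)
  `ovm_component_utils_end
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
  function void build();
    super.build();
    monitor = new("monitor", this);
    if (is_active == OVM_ACTIVE) begin
      sequencer = new("sequencer", this);
      driver    = new("driver", this);
    end
  endfunction
  function void connect();
    if (is_active == OVM_ACTIVE)
      driver.seq_item_port.connect(sequencer.seq_item_export);
  endfunction
endclass
```

Since is_active is a registered field, an environment can flip an agent between master and slave purely through configuration, without editing the agent class.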
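The blocking TLM put() interface listed under Reusability can be sketched as a producer/consumer pair; the names (producer, consumer) and the int payload are illustrative, assuming the OVM library:

```systemverilog
// Sketch of OVM TLM blocking put communication: a port on the producer
// connects one-to-one to an imp on the consumer.
import ovm_pkg::*;

class producer extends ovm_component;
  ovm_blocking_put_port #(int) put_port;
  function new(string name, ovm_component parent);
    super.new(name, parent);
    put_port = new("put_port", this);
  endfunction
  task run();
    for (int i = 0; i < 3; i++)
      put_port.put(i);   // blocks until the consumer accepts the item
  endtask
endclass

class consumer extends ovm_component;
  ovm_blocking_put_imp #(int, consumer) put_export;
  function new(string name, ovm_component parent);
    super.new(name, parent);
    put_export = new("put_export", this);
  endfunction
  task put(int t);       // invoked for each transaction put by the producer
    ovm_report_info("consumer", $sformatf("got %0d", t));
  endtask
endclass

// one-to-one connection, typically made in an env's connect():
//   prod.put_port.connect(cons.put_export);
```

The same pattern generalizes: swap the port/imp pair for analysis ports when one-to-many write() broadcasting is needed.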