US009411007B2

(12) United States Patent
Bertacco et al.
(10) Patent No.: US 9,411,007 B2
(45) Date of Patent: Aug. 9, 2016

(54) SYSTEM AND METHOD FOR STATISTICAL POST-SILICON VALIDATION

(71) Applicant: THE REGENTS OF THE UNIVERSITY OF MICHIGAN, Ann Arbor, MI (US)

(72) Inventors: Valeria Bertacco, Ann Arbor, MI (US); Andrew DeOrio, Ann Arbor, MI (US); Daya Shanker Khudia, Ann Arbor, MI (US)

(73) Assignee: THE REGENTS OF THE UNIVERSITY OF MICHIGAN, Ann Arbor, MI (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 505 days.

(21) Appl. No.: 13/663,258

(22) Filed: Oct. 29, 2012

(65) Prior Publication Data: US 2015/0268293 A1, Sep. 24, 2015

(51) Int. Cl.:
G01R 31/26 (2014.01)
G01R 29/26 (2006.01)
G01R 27/28 (2006.01)
G06F 11/263 (2006.01)
G01R 31/3183 (2006.01)
G06F 11/26 (2006.01)
G01R 31/3185 (2006.01)

(52) U.S. Cl.: CPC G01R 31/2601 (2013.01); G01R 27/28 (2013.01); G01R 29/26 (2013.01); G01R 31/3183 (2013.01); G01R 31/318544 (2013.01); G06F 11/261 (2013.01); G06F 11/263 (2013.01)

(58) Field of Classification Search: None. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

6,438,664 B1    8/2002   McGrath et al.
6,701,477 B1*   3/2004   Segal ........ 714/732
7,269,756 B2    9/2007   Baartmans et al.
7,461,311 B2*  12/2008   Harter et al. ........ 714/732
8,086,926 B2*  12/2011   Kato ........ 714/741
2009/0206868 A1*   8/2009   Laisne et al. ........ 324/765
2010/0293422 A1*  11/2010   Huang et al. ........ 714/726

OTHER PUBLICATIONS

W. Li et al., "Scalable Specification Mining for Verification and Diagnosis," Proceedings of the 47th Design Automation Conference, ACM, Jun. 13-18, 2010, pp. 755-760.*
F. Wang et al., "Deterministic Diagnostic Pattern Generation (DDPG) for Compound Defects," IEEE International Test Conference, 2008, pp. 1-10.*
W. Navidi, "Principles of Statistics for Engineers and Scientists," McGraw-Hill, Boston, Massachusetts, 2010, pp. 268-272.*

(Continued)

Primary Examiner — Aniss Chad
Assistant Examiner — David M. Rogers
(74) Attorney, Agent, or Firm — Marshall, Gerstein & Borun LLP

(57) ABSTRACT

The system and method described herein relate to a bug positioning system for post-silicon validation of a prototype integrated circuit using statistical analysis. Specifically, the bug positioning system samples output and intermediate signals from a prototype chip to generate signatures. Signatures are grouped into passing and failing groups, modeled, and compared to identify patterns of acceptable behavior and unacceptable behavior and locate bugs in space and time.

20 Claims, 7 Drawing Sheets

[Representative drawing: plot 600 showing a failing band and a passing band.]

OTHER PUBLICATIONS (Continued)

M. Grosso et al., "Exploiting Embedded FPGA in On-line Software-based Test Strategies for Microprocessor Cores," IEEE International On-Line Testing Symposium, 2009, pp. 95-100.*
M. Neishaburi et al., "Hierarchical Embedded Logic Analyzer for Accurate Root-Cause Analysis," IEEE International Symposium on Defect and Fault Tolerance in VLSI and Nanotechnology Systems, 2011, pp. 120-128.*
Li et al., "Scalable Specification Mining for Verification and Diagnosis," in Proc. DAC, 2010.
Liblit, "Cooperative Bug Isolation," Ph.D. dissertation, Berkeley, CA, USA, 2004, AAI3183833.
McLaughlin et al., "Automated Debug of Speed Path Failures Using Functional Tests," in Proc. VTS, 2009.
Narayanasamy et al., "BugNet: Continuously Recording Program Execution for Deterministic Replay Debugging," in Proc. ISCA, 2005.
Abramovici et al., "A Reconfigurable Design-for-Debug Infrastructure for SoCs," in Proc. DAC, 2006.
Adir et al., "Threadmill: A Post-Silicon Exerciser for Multi-Threaded Processors," in Proc. DAC, 2011.
Dahlgren et al., "Latch Divergency in Microprocessor Failure Analysis," in Proc. ITC, 2003.
De Paula et al., "Backspace: Formal Analysis for Post-Silicon Debug," in Proc. FMCAD, 2008.
DeOrio et al., "Post-Silicon Bug Diagnosis with Inconsistent Executions," in Proc. ICCAD, Nov. 7-9, 2011.
Dong et al., "System-Level Cost Analysis and Design Exploration for Three-Dimensional Integrated Circuits," in Proc. ASPDAC, 2009.
Frohwerk, "Signature Analysis: A New Digital Field Service Method," Hewlett-Packard Journal, May 1977.
Gao et al., "A New Post-Silicon Debug Approach Based on Suspect Window," in Proc. VTS, 2009.
Jha et al., "Localizing Transient Faults Using Dynamic Bayesian Networks," in Proc. HLDVT, 2009.
Keng et al., "Bounded Model Debugging," IEEE Trans. CAD of ICs and Systems, vol. 29, no. 11, 2010.
Park et al., "BLoG: Post-Silicon Bug Localization in Processors Using Bug Localization Graphs," in Proc. DAC, 2010.
Quinton et al., "Programmable Logic Core Based Post-Silicon Debug for SoCs," in IEEE Silicon Debug and Diagnosis Workshop, 2007.
Safarpour et al., "Automated Design Debugging with Abstraction and Refinement," IEEE Trans. CAD of ICs and Systems, vol. 28, no. 10, 2009.
Sarangi et al., "CADRE: Cycle-Accurate Deterministic Replay for Hardware Debugging," in Proc. DSN, 2006.
Savir, "Syndrome-Testable Design of Combinational Circuits," IEEE Trans. Computers, vol. C-29, no. 6, 1980.
Wagner et al., "Reversi: Post-Silicon Validation System for Modern Microprocessors," in Proc. ICCD, 2008.
Whetsel, "An IEEE 1149.1 Based Logic/Signature Analyzer in a Chip," in Proc. ITC, 1991.
Yang et al., "Automated Data Analysis Solutions to Silicon Debug," in Proc. DATE, 2009.
Yang et al., "Expanding Trace Buffer Observation Window for In-System Silicon Debug through Selective Capture," in Proc. VLSI Test Symposium, 2008.

* cited by examiner

[FIG. 1 (Sheet 1 of 7): block diagram of a validation system 100, comprising a validation computer system with program memory, a database, and a processor, coupled to a validation platform hosting a prototype chip with BPS hardware.]

[FIG. 2 (Sheet 2 of 7): flowchart of a method 200: (202) determine which signals to observe and what kind of signatures to collect; (204) run one or more post-silicon tests on the prototype chip and log signatures for the target signals during each test window, whose length is set by a window length parameter (206); (208) send the signatures off-chip to a computer running analysis software and increment the window counter; (210) repeat until testing is complete; (212) sort signatures into a pass group and a fail group, and sort each group by signal and window; (214) make a pass model from the pass group, pass band = μ_pass + k_pass·σ_pass; (216) make a fail model from the fail group, fail band = μ_fail + k_fail·σ_fail; (218) compare the pass band to the fail band to determine a bug band; (220) rank signals according to the magnitude of the bug band and identify signals whose magnitude exceeds a bug band threshold parameter (222) as bugs.]

[FIGS. 3A and 3B (Sheet 3 of 7): distributions of signature values for passing testcases (solid line) and failing testcases (dashed line).]
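The flowchart of FIG. 2 reduces the analysis to a small per-signal, per-window statistic: model the passing and failing signature groups separately and flag signals whose failing-run behavior falls outside the passing-run behavior. The sketch below illustrates one plausible reading of that computation in Python; it is not taken from the patent, and the function names, the treatment of each band as the interval μ ± k·σ, and the definition of the bug band as the extent of the fail band lying outside the pass band are assumptions made only for illustration.

    # Illustrative sketch of the FIG. 2 band analysis (names and band
    # interpretation are assumptions, not taken from the patent).
    from statistics import mean, pstdev

    def band(values, k):
        """Model a signature group as the interval mean +/- k standard deviations."""
        mu, sigma = mean(values), pstdev(values)
        return mu - k * sigma, mu + k * sigma

    def bug_band(pass_values, fail_values, k_pass=2.0, k_fail=2.0):
        """Measure how far the fail band extends outside the pass band."""
        pass_lo, pass_hi = band(pass_values, k_pass)
        fail_lo, fail_hi = band(fail_values, k_fail)
        # Signature values seen only in failing runs hint at buggy behavior.
        return max(0.0, pass_lo - fail_lo) + max(0.0, fail_hi - pass_hi)

    def rank_suspects(signatures, threshold, k_pass=2.0, k_fail=2.0):
        """Rank (signal, window) pairs whose bug band exceeds the threshold.

        `signatures` maps (signal, window) to {"pass": [...], "fail": [...]},
        i.e. signature values logged over passing and failing test runs.
        """
        scores = {
            key: bug_band(groups["pass"], groups["fail"], k_pass, k_fail)
            for key, groups in signatures.items()
        }
        suspects = [(key, score) for key, score in scores.items() if score > threshold]
        return sorted(suspects, key=lambda item: item[1], reverse=True)

Under this reading, a signal and window whose failing signatures cluster well away from the passing band receives a large bug band value and rises to the top of the ranking once it clears the threshold parameter (222).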
SYSTEM AND METHOD FOR STATISTICAL POST-SILICON VALIDATION

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH

This invention was made with government support under HR0011-07-3-0062 awarded by the Defense Advanced Research Projects Agency. The government has certain rights in the invention.

FIELD OF TECHNOLOGY

This application relates generally to integrated circuit testing and, more particularly, to post-silicon testing of prototype integrated circuits.

BACKGROUND

Diagnosing and debugging failures in large, complex modern digital designs is a difficult task that spans the entirety of the design process. Recently, the role of post-silicon validation has increased, particularly in the microprocessor design industry, in light of the scaling problems of pre-silicon methodologies and tight time-to-market development schedules. Pre-silicon verification operates on an abstract design model and has the advantage of being fully deterministic and fully observable, but is limited by its slow speed and low fault coverage. Failing testcases can be reliably reproduced to diagnose functional bugs, which in turn manifest consistently ...

... can usually be observed) and periodically transfer data off chip. Traces are then examined by validation engineers to determine the root cause of the problem. This process may be time-consuming and engineering intensive, and may be further exacerbated by bugs with inconsistent outcomes. Additionally, off-chip data transfers are very slow, which further hinders observability due to limited transfer time.

The debugging process of non-deterministic failures can be aided by deterministic replay mechanisms. However, these solutions perturb system execution, which can prevent the bug from manifesting, and they often incur significant hardware and performance overheads. In addition, in an effort to automate the failure diagnosis process, methods based on formal verification techniques have been proposed. These solutions require deterministic execution and a complete golden (known-correct) model of the design for comparison. However, the fundamental scaling limitations of formal methods preclude these techniques from handling industrial-size designs. Accordingly, there is a need for a scalable post-silicon validation platform that localizes inconsistent bugs and minimizes off-chip transfers without a priori knowledge of the design or the failure.

SUMMARY

The bug positioning system ("BPS") leverages a statistical approach to address the most challenging post-silicon bugs, those that do not manifest consistently over multiple runs of the same test, by localizing them in space (design region) and time (of bug manifestation).
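Because each suspect in such an analysis is keyed by a signal and a test window, localization in space and time follows directly: the signal identifies the design region, and the window index, scaled by the window length, brackets the cycles in which the bug manifested. A short usage sketch, reusing the illustrative rank_suspects helper from the earlier example with made-up signal names, values, and an assumed window length:

    # Usage sketch: localize suspects in space (signal) and time (cycle window).
    # rank_suspects and the signature layout are the illustrative helpers defined above;
    # signal names, values, window length, and threshold are invented for the example.
    signatures = {
        ("alu.result", 3): {"pass": [0.41, 0.44, 0.39, 0.42], "fail": [0.72, 0.69, 0.75]},
        ("lsu.addr", 3):   {"pass": [0.10, 0.12, 0.11, 0.09], "fail": [0.11, 0.10, 0.12]},
    }
    WINDOW_LENGTH = 10_000  # cycles per test window (illustrative value)

    for (signal, window), score in rank_suspects(signatures, threshold=0.05):
        start, end = window * WINDOW_LENGTH, (window + 1) * WINDOW_LENGTH
        print(f"suspect signal {signal!r} (bug band {score:.3f}) in cycles {start}-{end}")
    # Prints roughly: suspect signal 'alu.result' (bug band 0.318) in cycles 30000-40000,
    # while lsu.addr, whose pass and fail bands overlap, is not reported.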