
Tackling the European challenges in verification

August 16, 2012 // Nick Flaherty


EETimes Europe asked Mike Bartley of leading verification house TVS to look at how European companies see the challenges and possible solutions.



The major silicon companies in Europe see rising design complexity as a key verification challenge, but favour different methods of tackling it. ARM, ST, Infineon and Ericsson each identified their top verification challenges at the Europe-wide Verification Futures conference. With Verification Futures running in the UK, Germany and France in November 2012, Mike Bartley looks at the challenges and possible solutions ahead.
Clemens Muller of Infineon saw complexity across the whole system, in both the hardware and the software, and wanted tools to help master that combined complexity through true hardware-software co-verification. Co-simulation is an obvious start, but engineers require a combined hardware-software view of the simulation: software engineers have a completely different view of the system and do not want to debug with waveforms. All four companies also pointed out that the solution starts earlier in the design process, with ESL (Electronic System Level) design, usually in SystemC. The TLM-2.0 standard should let users port their test benches seamlessly between SystemC and RTL. For example, Mentor Graphics highlighted its release of UVM Connect at DAC, a new open-source UVM-based library that provides TLM1 and TLM2 connectivity and object passing between SystemC and SystemVerilog.
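The transaction-level idea behind such connectivity can be sketched in a few lines of Python. The `Transaction` and `MemoryModel` classes below are hypothetical illustrations, not TLM-2.0's actual C++ interfaces: the point is that the initiator issues whole read/write transactions and the target services them, so both hardware and software engineers can debug at the level of operations rather than waveforms.

```python
from dataclasses import dataclass, field

# Minimal transaction-level sketch (not TLM-2.0's real API): the "software"
# side issues whole read/write transactions and the "hardware" model
# services them, so debug happens at the transaction level, not pin level.
@dataclass
class Transaction:
    cmd: str          # "read" or "write"
    addr: int
    data: int = 0

@dataclass
class MemoryModel:
    mem: dict = field(default_factory=dict)

    def transport(self, txn):
        """Blocking transport: service one transaction and return it."""
        if txn.cmd == "write":
            self.mem[txn.addr] = txn.data
            return txn
        txn.data = self.mem.get(txn.addr, 0)
        return txn

target = MemoryModel()
target.transport(Transaction("write", 0x10, 0xCAFE))
resp = target.transport(Transaction("read", 0x10))
print(hex(resp.data))   # 0xcafe
```

A real TLM-2.0 target would add timing annotation and a generic payload, but the transaction-in, response-out shape is the same.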
Olivier Haller of ST highlighted that heterogeneous multi-core systems introduce a new level of complexity. As a consequence, system tests become too complex to be hand-written, and new techniques such as graph-based testing become applicable. Tools such as TrekSoC from Breker Systems automatically generate self-verifying C-based test cases that run on the embedded processors. These test cases exercise the corner cases of the design faster and more thoroughly than hand-written tests, and trigger unusual conditions unlikely to occur even when running production code on the processors.
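The graph-based idea can be illustrated with a small sketch. The scenario graph and node names below are invented, not TrekSoC's actual model: legal test scenarios are treated as paths through a directed graph of operations, so a generator can enumerate or randomise paths instead of an engineer writing every sequence by hand.

```python
import random

# Hypothetical scenario graph: nodes are operations, edges are legal
# orderings.  A test case is any path from "init" to "check".
SCENARIO_GRAPH = {
    "init":       ["cfg_dma", "cfg_cache"],
    "cfg_dma":    ["start_xfer"],
    "cfg_cache":  ["start_xfer", "flush"],
    "start_xfer": ["irq_wait", "poll"],
    "flush":      ["start_xfer"],
    "irq_wait":   ["check"],
    "poll":       ["check"],
    "check":      [],
}

def all_tests(graph, node="init", path=None):
    """Enumerate every legal operation sequence (every path to 'check')."""
    path = (path or []) + [node]
    if not graph[node]:
        return [path]
    tests = []
    for nxt in graph[node]:
        tests.extend(all_tests(graph, nxt, path))
    return tests

def random_test(graph, rng):
    """Random walk: the kind of directed-random path a generator emits."""
    node, path = "init", ["init"]
    while graph[node]:
        node = rng.choice(graph[node])
        path.append(node)
    return path

tests = all_tests(SCENARIO_GRAPH)
print(len(tests), "distinct scenarios")   # 6 distinct scenarios
print(random_test(SCENARIO_GRAPH, random.Random(0)))
```

Each generated path would then be rendered as a self-checking C test; the graph captures the legal behaviour once, and coverage of it can be measured directly.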
One way to handle increasing complexity is through an improved design process. Design for verification was named as a key verification challenge by both Bryan Dickman of ARM and Hans Lunden of Ericsson. The idea is to help designers create RTL that has fewer bugs (bug avoidance) and in which bugs are easier to find (bug hunting). Verification tool vendors are now taking their tools into the design-entry space. For example, formal verification specialist Jasper DA has a patented Visualize feature that lets a designer automatically create waveforms for interesting properties of the design without writing a test bench. The designer may need to create constraints for the formal tool, but both the constraints and the properties can be re-used later, either in simulation or in a wider application of formal. Both Cadence and Jasper have also packaged a number of powerful tools into apps and put them in the hands of the designer, allowing issues such as X-propagation and clock domain crossing to be checked before the design is handed off to verification.
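The "waveform without a test bench" idea rests on reachability analysis. The toy counter below is a hypothetical stand-in for a real design and bears no relation to Jasper's actual engine; it simply shows how an exhaustive search over input choices can produce the shortest input sequence, in effect a waveform, that demonstrates a property.

```python
from collections import deque

# Hypothetical design under test: a 2-bit counter with an enable input.
def step(state, enable):
    return (state + 1) % 4 if enable else state

def trace_to(target, init=0):
    """Breadth-first search over input choices: returns the shortest
    input sequence driving the design from reset to a state satisfying
    the property (here, 'state == target'), or None if unreachable."""
    queue = deque([(init, [])])
    seen = {init}
    while queue:
        state, inputs = queue.popleft()
        if state == target:
            return inputs           # the "waveform": one input per cycle
        for enable in (0, 1):
            nxt = step(state, enable)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, inputs + [enable]))
    return None                     # property unreachable

print(trace_to(3))   # [1, 1, 1]
```

A production formal tool works symbolically rather than by explicit enumeration, but the output is the same in spirit: a concrete trace the designer can inspect without ever writing stimulus.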
All of these dynamic and static tools generate an enormous amount of data, which every company cited as a key challenge. But the goal of all this verification activity is to produce a level of confidence for a tape-out or IP-release decision, so verification managers need a way to turn all of that data into information that informs the release decision. Mentor Graphics and Cadence Design Systems both have tools to help here: QVRM and vManager respectively.
One key advance is version 1.0 of the Accellera Systems Initiative's Unified Coverage Interoperability Standard (UCIS). This defines a standard API for accessing coverage information, which will allow innovation both in the definition of coverage and in the use of coverage information. For example, it will help drive solutions in requirements-driven verification, such as asureSign from TVS. The rise of standards (such as ISO 26262 for automotive) will require verification engineers to provide more detailed audit trails from requirements to verification in the future.
"For years verification engineers have been generating valuable data without open access to it," said Bartley. "The UCIS standard will enable engineers to create tools to access and manipulate that data. I see a period of exciting innovation ahead."
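To see why standardised access to coverage data matters, consider the kind of computation it enables. The dictionary layout below is invented for illustration and is not UCIS's actual data model or C API; it merely shows the sort of merge-and-score analysis across regression runs that an open interface makes possible.

```python
# Hypothetical coverage records from two simulation runs, mapping
# coverage-bin name to hit count.  The schema is invented -- UCIS
# defines a real API and data model, not this layout.
run_a = {"fifo.full": 3, "fifo.empty": 0, "bus.retry": 1}
run_b = {"fifo.full": 0, "fifo.empty": 5, "bus.retry": 0}

def merge(*runs):
    """Sum hit counts per coverage bin across runs."""
    merged = {}
    for run in runs:
        for bin_name, hits in run.items():
            merged[bin_name] = merged.get(bin_name, 0) + hits
    return merged

def score(cov):
    """Fraction of bins hit at least once."""
    return sum(1 for hits in cov.values() if hits > 0) / len(cov)

total = merge(run_a, run_b)
print(f"{score(total):.0%} of bins covered")   # 100% of bins covered
```

With a standard API, the same analysis works regardless of which simulator or formal tool produced each database, which is precisely the interoperability the standard targets.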
Productivity remains a key issue. Olivier Haller summed this up nicely: "We must do more with the same budget." Harry Foster, Chief Verification Scientist for Mentor Graphics' Design Verification Technology Division, points to some interesting data on this from a comprehensive worldwide survey in which every statistic indicated rising verification effort (for example, a 58% increase in the number of verification engineers between 2007 and 2010). The survey also showed debug to be the largest verification activity, and all four companies highlighted debug as a key concern.
Many companies reduce debug time through assertion-based verification (ABV), and start-up companies demonstrated their ABV tools. Zocalo and NextOp Software both have tools (Zazz and BugScope) that spot patterns between signals and generate properties that are either candidate assertions or coverage holes. Vennsa Technologies claims its tool OnPoint can categorise failures in a regression and also help identify the lines of code causing them. Debug still seems to be an area ripe for innovation.
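The pattern-spotting idea behind such tools can be sketched simply. The trace and signal names below are hypothetical, and real assertion-mining tools are far more sophisticated, but the principle is the same: propose relationships that hold over every sampled cycle, then hand the survivors to an engineer as candidate assertions (or as hints that the stimulus never exercised the opposite case).

```python
from itertools import permutations

# Hypothetical waveform trace: one dict of signal values per clock cycle.
trace = [
    {"req": 0, "gnt": 0, "busy": 0},
    {"req": 1, "gnt": 1, "busy": 1},
    {"req": 0, "gnt": 0, "busy": 1},
    {"req": 1, "gnt": 1, "busy": 1},
]

def mine_implications(trace):
    """Propose 'a -> b' for every ordered signal pair where b is high in
    every sampled cycle in which a is high.  Each survivor is either a
    genuine design invariant or a coverage hole in the stimulus."""
    signals = trace[0].keys()
    candidates = []
    for a, b in permutations(signals, 2):
        if all(cycle[b] for cycle in trace if cycle[a]):
            candidates.append(f"{a} -> {b}")
    return candidates

print(mine_implications(trace))
# ['req -> gnt', 'req -> busy', 'gnt -> req', 'gnt -> busy']
```

Note that 'busy -> req' is correctly rejected: cycle 2 shows busy high with req low, so the data refutes that candidate.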
Access to tools via the cloud is another potential way to achieve more with the same budget, saving both time and money on hardware infrastructure and IT. For example, Aldec Cloud gives users access to a virtually unlimited number of high-performance servers. By running simulations in the cloud, engineers can cut their regression testing from days to hours or even minutes.
The design community has seen huge productivity gains through the use of design IP, which now allows complex blocks to be stitched together into extremely complex SoCs. The verification community's equivalent is Verification IP (VIP), which enables re-use of test bench components such as transactors, monitors, coverage and tests. A number of methodologies were invented to enable re-use through VIP, and these have now been combined into a single one, the Universal Verification Methodology (UVM). In February 2012 Synopsys announced its Discovery VIP family, and a number of companies now offer UVM-compliant VIP for both on-chip and off-chip standardised protocols.
testandverification.com/verification-futures

