
Hardware and Software Requirements Driven Verification Perspectives: Part II

In Part II of this story, Cadence, Synopsys and Imperas weigh in on the challenges that hardware and software designers will face in verifying system requirements.

by Hamilton Carter, Technology Editor

Figure: Verification environments change the state of designs by driving stimulus in the form of directed or constrained random inputs.  (Courtesy of Cadence)

Imperas

Imperas is uniquely perched on the border between software and hardware design engineering.  The company provides executable models (instruction set simulators) and associated verification environments for over 144 different cores.  Its tools help hardware/software co-design teams get up and running, enabling testing of requirements-driven use scenarios before all the parts of the system are available.  I spoke with Simon Davidmann, Imperas' CEO and founder.

Imperas' customers see requirements from the perspective of software code interacting with a hardware model.  The metrics their customers monitor have more of a software bent: Does the software fail safe when an instruction is mis-executed by the processor?  Has the software been tested in cases where the hardware resources it requires are denied?  What happens when the software under test finds itself under-scheduled amidst a wash of other pthreads?
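To make the first of those scenarios concrete, here is a minimal sketch in plain Python (not Imperas' actual tooling) of fault injection against a toy instruction-set model: one instruction's result is corrupted mid-run, and the test checks that the software's redundancy guard notices the bad value and drops into a safe state.  All names here (run_program, SAFE_STATE) are hypothetical.

```python
# A minimal fault-injection sketch: one ADD result is corrupted mid-run,
# and we check that the software's redundancy guard notices the bad value
# and falls back to a safe state instead of silently continuing.
import random

def run_program(inputs, corrupt_step=None):
    """Toy ISS loop: accumulate inputs; optionally mis-execute one step."""
    acc = 0
    for step, value in enumerate(inputs):
        result = acc + value                      # the intended ADD
        if step == corrupt_step:
            result ^= 1 << random.randrange(32)   # injected bit flip in the result
        # The fail-safe guard the software under test is supposed to implement:
        # recompute independently and compare (a duplicate-execution check).
        if result != acc + value:
            return "SAFE_STATE"
        acc = result
    return acc

inputs = [3, 5, 7, 11]
assert run_program(inputs) == 26                  # golden run, no fault
for step in range(len(inputs)):
    assert run_program(inputs, corrupt_step=step) == "SAFE_STATE"
print("software fails safe under every injected instruction fault")
```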

Imperas sees software engineers branching out from straight code coverage to …well… branch coverage, as well as functional coverage.  Software engineers have also begun using assertions, including temporal assertions, to provide coverage metrics that gauge the completeness of their software verification environments.
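The sketch below illustrates those two techniques in plain Python: a functional-coverage bin counter plus a simple temporal assertion ("every request must be granted within three steps").  The request/grant protocol and all names are invented for illustration and don't correspond to any particular framework's API.

```python
# An illustrative functional-coverage collector plus a temporal assertion.
from collections import Counter

coverage = Counter()          # functional coverage: which scenarios were hit

def check_trace(trace, max_latency=3):
    """trace is a list of (request, grant) booleans, one pair per timestep."""
    pending_since = None
    for t, (req, gnt) in enumerate(trace):
        if req and pending_since is None:
            pending_since = t
        if gnt and pending_since is not None:
            coverage[("latency", t - pending_since)] += 1   # cover observed latency
            pending_since = None
        # Temporal assertion: a request may not stay ungranted too long.
        assert pending_since is None or t - pending_since <= max_latency, \
            f"request at t={pending_since} not granted by t={t}"

check_trace([(True, False), (False, False), (False, True), (True, True)])
print(dict(coverage))         # inspect which latency bins the test actually hit
```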

In an interesting aside, Simon mentioned some of the auto and aero standards are beginning to place requirements on allowed compilers and allowed subsets of languages for given applications.  “They prescribe not only the language and the constructs but also the coverage they want to see.  DO-254 in avionics does this.  When they define coverage they want to see, they are trying to control how they want things developed.”


Synopsys

Michael Sanie, Senior Director of Verification Marketing for Synopsys, spoke with me about the growing adoption of metric-driven techniques, especially functional coverage.  Most engineering teams are now making use of verification planning and functional coverage collection.  In response to this increased usage, Synopsys is working on coverage debug technology: helping engineers determine which coverage groups are blocked from being hit by intrinsic properties of either the device under test or the verification environment itself.
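To illustrate the coverage-debug problem, here is a hedged sketch: given the constraints the stimulus actually obeys, some coverage bins are intrinsically unreachable, and it is cheaper to identify them up front than to keep simulating toward them.  The brute-force reachability check below stands in for what a real tool would do with formal analysis; the constraint and bins are invented for the example.

```python
# Detect coverage bins that the stimulus constraints make unreachable.
from itertools import product

# Hypothetical stimulus constraint: the environment never drives
# size > 4 when mode == "burst" (say, a testbench limitation).
def constraint(mode, size):
    return not (mode == "burst" and size > 4)

# The coverage model asks for every (mode, size) combination.
bins = [(m, s) for m in ("burst", "single") for s in range(1, 9)]

reachable = {(m, s) for m, s in product(("burst", "single"), range(1, 9))
             if constraint(m, s)}

for b in bins:
    if b not in reachable:
        print(f"bin {b} is blocked by the stimulus constraints: waive it or fix the env")
```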

Michael also pointed out, “The other thing that’s coming up more and more is power coverage.  How many of your power states have you hit doing your verification?  You can cycle through more power verification scenarios on emulation or FPGA.  Engineers want to check designs with up to 100,000 power states.  The verification of some of these state transitions can be covered with static tools, other transitions have to be covered with simulation or emulation, and the question becomes how do you best complement one with the other?”
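A minimal sketch of power-state transition coverage follows: define the legal transitions, drive random walks through the state machine as a stand-in for simulation, and report which transitions were never exercised (candidates for static analysis or directed tests).  The state machine itself is illustrative, not any real power specification.

```python
# Track which legal power-state transitions simulation actually exercised.
import random

legal = {
    "OFF":    ["BOOT"],
    "BOOT":   ["ACTIVE"],
    "ACTIVE": ["IDLE", "SLEEP", "OFF"],
    "IDLE":   ["ACTIVE", "SLEEP"],
    "SLEEP":  ["ACTIVE", "OFF"],
}
all_transitions = {(s, d) for s, dsts in legal.items() for d in dsts}
covered = set()

state = "OFF"
for _ in range(200):                      # random walk as a stand-in for simulation
    nxt = random.choice(legal[state])
    covered.add((state, nxt))
    state = nxt

missed = all_transitions - covered
print(f"{len(covered)}/{len(all_transitions)} transitions covered; missed: {missed}")
```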

Finally, Michael brought up an interesting point about meta-verification.  A ‘Who’s Watching the Watchmen?’ scenario is cropping up as a result of safety standards.  The standards ask the question, ‘Can you prove your verification environment can catch bugs in the design?’  This has led to a class of tools and techniques that insert bugs into the design in order to gauge the quality of the verification environment.  Some examples are forcing bits within the DUT, or commenting out segments of the DUT code.
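The sketch below shows this testbench-qualification idea in the style of mutation testing: deliberately break a toy design model, rerun the checks, and flag any mutant the environment fails to catch as a hole in the testbench.  The DUT, mutants, and vectors are all invented for illustration.

```python
# Qualify a testbench by injecting bugs ("mutants") into the design model.
def dut(a, b, mutate=None):
    """Toy DUT: saturating 4-bit adder. `mutate` injects a bug on demand."""
    s = a - b if mutate == "flip_op" else a + b        # injected bug: + becomes -
    if mutate == "drop_saturation":                    # injected bug: remove the clamp
        return s & 0xF
    return min(s, 0xF)

def testbench(mutate=None):
    vectors = [(1, 2, 3), (7, 8, 15), (15, 15, 15)]    # (a, b, expected)
    return all(dut(a, b, mutate) == exp for a, b, exp in vectors)

assert testbench()                                      # the clean DUT passes
for bug in ("flip_op", "drop_saturation"):
    caught = not testbench(mutate=bug)
    print(f"mutant {bug!r}: {'caught' if caught else 'ESCAPED: weak testbench'}")
```

Note that only the saturating vector (15, 15, 15) catches the second mutant; drop that vector and the mutant escapes, which is exactly the kind of testbench hole this technique exposes.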


Cadence

I interviewed John Brennan of Cadence Design Systems along with Fernando Valera and Moustapha Tadlaoui of Visure Solutions, the requirements life-cycle management company.  Visure is a Cadence partner that provides requirements tracking tools.  They pointed out that the explosion of software being included in everyday objects, tennis shoes for example, is creating an increased demand for verification.  Embedded solutions must now work consistently across a variety of software variants.

Moustapha:  We’ve seen an incredible increase of software in systems.  Our customers combine hardware and software, and they’re getting more and more software in their systems.  You know, now even with tennis shoes, you can put in a tube with some software.  Who would have ever thought that a tennis shoe would ever have software in it?  It’s pretty interesting that more and more customers are actually using the software to create variants of systems as well.  Even though the hardware and the software are combined together, the software is helping to create those variants.

So basically, what we see is that they have shorter release cycles in the software, with lots of changes that affect not only the software part but to some extent the hardware part, so the teams need to be collaborating more closely.  What some time ago was a completely isolated environment, with the software guys in one corner and the hardware guys somewhere else using completely different processes, has now become a really tight environment, and what they’re trying to do is homogenize their methods and the tools they are using.  The requirements gathering (or requirements elicitation) phase plays a key role in helping those teams collaborate with each other.

They’re streaming from product requirements down to the system requirements and down to the software and hardware requirements, and they need to be able to trace the software requirements to the hardware requirements, and the hardware requirements all the way down to the verification activities (the test cases), establishing end-to-end requirements traceability across the system development life-cycle.  Then, as John was mentioning, they need to feed the results of those verification activities back to the hardware to be able to ensure that the software will work in that environment, and finally trace back to the system requirements.
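As a rough picture of the end-to-end traceability Moustapha describes, the sketch below links product requirements down through system and software/hardware requirements to test cases, then rolls test results back up the tree.  The IDs and results are made up; a real flow would pull them from a requirements tool such as Visure plus a results database.

```python
# A toy requirements traceability tree with result roll-up.
trace = {
    "PRD-001": ["SYS-010", "SYS-011"],                 # product -> system reqs
    "SYS-010": ["SW-100", "HW-200"],                   # system -> sw/hw reqs
    "SYS-011": ["SW-101"],
    "SW-100":  ["TC-1", "TC-2"],                       # requirements -> test cases
    "SW-101":  ["TC-3"],
    "HW-200":  ["TC-4"],
}
results = {"TC-1": "pass", "TC-2": "pass", "TC-3": "fail", "TC-4": "pass"}

def status(item):
    """Roll test results up the tree: any failure below fails the requirement."""
    if item in results:
        return results[item]
    children = trace.get(item, [])
    if not children:
        return "untested"                              # a requirement with no tests
    return "pass" if all(status(c) == "pass" for c in children) else "fail"

print("PRD-001:", status("PRD-001"))                   # fails via SYS-011 -> TC-3
```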

So you said the software was actually starting to affect the hardware parts of the team.  In what ways is it affecting them? 

Fernando: There is new functionality today where what used to be in the hardware is now being taken over by software.  This creates more challenges for the hardware, in that the hardware has to support various variants of the same product with little change.  Another aspect of all this is that customers are reusing their existing software from one project to another.  They take one piece of functionality that persists between programs, tweak it a little bit, and create a new variant of the software, and the hardware has to support all these different variants of functionality that used to exist in the hardware.

John: So in addition to the hardware-driven verification that sits under the metric-driven verification umbrella, we’re seeing software-driven verification under that same umbrella.  The extension to use-cases is the critical driver of that.  The set of use-cases is part of the requirements, and results need to tie back to the requirements system, whereas RTL verification tends to sit at a lower level.  But you need both to satisfy yourself that A) you have the right test cases to exercise the lowest-level features and functions, and B) you have the right use-cases.  Both pieces of information are important, but they are typically produced by different teams.  So while the information is correlated, we are not seeing the impact on the hardware engineers.  Requirements-driven verification is incomplete without those use-cases being tied in, which is typically driven by the software team.

What metrics do you see your customers using most?  What’s popular now and why?

John:  You’ll be happy to hear that at the IP level just about everyone is doing some combination of code and functional coverage in a very well-defined process.  Virtually everyone has moved to functional metrics as the primary metrics for verification.  When you move up the stack to the integration level, and again to the chip level and then the SoC level, the metrics change.  For example, at the integration level, where you primarily do integration testing of a number of IPs, your metrics center much more on your bus performance, your connectivity test results, and your ability to manage transactions and the rate of transactions on your interconnect bus.  The generic concept of functional metrics is still the same, but what you actually measure changes, and this data may no longer live within a UVM covergroup, for example.  So the key care-abouts at each level of integration become the primary drivers of the metrics you use.  Going from the integration level to the chip level, you start to measure all of the low-power functionality and things like cache coherency.  You start to worry about memory performance and other kinds of performance.  And, like I mentioned before, when you start to get software running on the chip, you’re measuring use-cases, where a use-case is “does the system do this?”  Metric-driven verification for Cadence is about the ability to easily tie those metrics, at any level of integration, to the requirement, and to be able to represent the totality of tests against those requirements.
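To make the shift in metrics concrete, here is a hedged sketch of an integration-level measurement: instead of covergroup bins, it computes interconnect throughput and worst-case transaction latency from a made-up log of bus transactions and judges them against hypothetical requirement thresholds.

```python
# Integration-level metrics from a (made-up) log of bus transactions.
transactions = [                                       # (start_cycle, end_cycle)
    (0, 4), (2, 9), (5, 11), (12, 15), (13, 22), (20, 25),
]
window = max(end for _, end in transactions)           # total cycles observed

throughput = len(transactions) / window                # transactions per cycle
worst_latency = max(end - start for start, end in transactions)

REQ_MIN_THROUGHPUT = 0.2                               # hypothetical requirement
REQ_MAX_LATENCY = 10                                   # hypothetical requirement

print(f"throughput {throughput:.2f} txn/cycle, worst latency {worst_latency} cycles")
assert throughput >= REQ_MIN_THROUGHPUT, "bus throughput requirement violated"
assert worst_latency <= REQ_MAX_LATENCY, "bus latency requirement violated"
```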

Are people using IP to cover some of the things like bus transactions and performance?  Is there a person who just develops coverage?  How are they doing this?

John: Mostly we see this being done through tools rather than through generic verification IP.  We have an Interconnect Workbench tool, for example, that allows you to measure and analyze bus performance and take a look at bus transactions.  We have formal tools that manage bus connectivity, and a whole process around interconnect verification, which is really critical at the integration level.  What I’ve seen is not IP per se in the traditional form, like VIP, to solve the problems you mention, but more in terms of tools that enable that functionality to be measured and verified.

Bios

John Brennan is Product Director for Incisive vManager at Cadence Design Systems, Inc.  He is responsible for the overall metric-driven verification (MDV) methodology embedded within vManager and utilized in semiconductor functional verification.  John has been with Cadence since 1999 and has worked in many aspects of functional verification, including services, development and deployment of the Incisive Verification Kit, and Incisive business development.  Prior to EDA, John worked in several areas of industrial automation after graduating from Northeastern University with a BSME.


Fernando Valera, Chief Technical Officer at Visure Solutions, Inc.

Fernando Valera completed his degree in Computer Engineering at the Complutense University of Madrid in 2003 and has ever since been dedicated to the field of Requirements Management and Requirements Engineering.  He has participated in the deployment of Requirements Engineering methodologies, processes and tools at companies in North America (US and Canada), Central America (Mexico and Costa Rica), South America (Colombia and Argentina), Europe (Germany, UK, France, Scandinavia, Italy and Spain) and Asia (China and India), in sectors such as automotive, medical devices, banking and finance, aerospace and defense, and IT, training over 500 people in total.  He is an advocate of requirements best practices, which leads him to participate regularly in international symposiums and to organize the “Quality Days: A Strive for Good Requirements Engineering” events in Europe.  In 2012 he relocated to Visure Solutions, Inc. headquarters in San Francisco to take on the CTO role, where he currently leads the company’s effort to bring state-of-the-art Requirements Management solutions to customers, helping them address compliance needs and deliver high-quality products on time and on budget.


Moustapha Tadlaoui, Senior Vice President & COO at Visure Solutions, Inc.

As Chief Operating Officer, Dr. Tadlaoui has set the corporate vision and mission for Visure Solutions, the requirements lifecycle management company, focusing on driving strategic business growth through the expansion of comprehensive requirements management solutions, the development of strategic partnerships, and the growth of business operations within North America.

Prior to joining Visure Solutions, Inc., Dr. Tadlaoui spent nearly a decade with LDRA Technologies, Inc., a leading provider of software test and verification toolsets.  As vice president of sales, Moustapha established the LDRA brand first in the US and then in South America, in the heavily regulated markets of avionics, defense, medical, and automotive, where software testing is critical to proving software reliability.

Dr. Tadlaoui has a proven track record of sales and revenue growth, having served as the head of sales and marketing.  In these roles, he was responsible for overseeing multiple lines of management, leading market expansion into new industry sectors and geographies, developing partnerships, and introducing new product lines into the organization.

Before that, Moustapha co-founded ATTOL Testware in France, a software testing company for embedded and real-time systems.  He went on to launch and develop ATTOL’s European business, building its revenue and customer base and leading the company through its acquisition by Rational Software Corporation (now IBM Rational) in 2001.  After successfully transitioning ATTOL technology to Rational offices in the US, Australia, and Japan, Moustapha, an entrepreneur at heart, created and set up US offices for LDRA.

Within these organizations he has managed a significant number of operational staff in the areas of corporate business development, sales and marketing.


Michael Sanie is senior director of marketing in the Verification Group at Synopsys. He has more than 25 years of experience in semiconductor design and design software. Prior to Synopsys, Michael held executive and senior marketing positions at Calypto, Cadence, and Numerical Technologies. Michael started his career as a design engineer at VLSI Technology and holds four patents in design software. He holds BSCEE and MSEE degrees from Purdue University and an MBA from Santa Clara University.




Simon Davidmann has been working on simulators and EDA products since 1978.  He is founder and CEO of Imperas and initiator of Open Virtual Platforms (www.OVPworld.org), the place for Fast Processor Models.  Prior to founding Imperas, Simon was a VP at Synopsys following its successful acquisition of Co-Design Automation, the developer of SystemVerilog.  Prior to founding Co-Design Automation, Simon was an executive or European GM with five US-based EDA startups, including Chronologic Simulation, which pioneered the compiled-code simulator VCS, and Ambit, which was acquired by Cadence for $280M.  Simon was one of the original developers of the HILO logic simulation system, co-authored the definitive book on SystemVerilog, and is a visiting Professor of Digital Systems at Queen Mary, University of London.
