
Hardware and Software Requirements Driven Verification Perspectives

Embedded engineers and managers from these two different disciplines are being forced into closer interactions to meet IoT cost and time constraints.

By Hamilton Carter, Technology Editor

Requirements driven verification is the (ideally automatic) abstraction layer that links project goals to the data and metrics collected by metric driven verification. The broad applicability of this definition, however, raises many questions. Who actually defines the requirements, and who should? Who is responsible for verifying that the defined requirements are met? How should requirements driven verification processes be managed? What changes when you move from hardware to software requirements? Can hardware and software verification teams learn lessons from one another? Do they have unique perspectives that might enlighten their counterparts?

Figure: Verification environments change the state of designs by driving stimulus in the form of directed or constrained random inputs. (Courtesy of Cadence)
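To make the figure's distinction concrete, here is a minimal Python sketch of the two stimulus styles. The packet fields, constraints, and ranges are hypothetical illustrations of my own, not drawn from any particular verification environment:

```python
import random

# A directed test fixes every field of the stimulus explicitly.
directed_packet = {"addr": 0x1000, "length": 64, "write": True}

def constrained_random_packet(rng: random.Random) -> dict:
    """Draw a packet at random, but only from the legal input space.

    The constraints below (word-aligned addresses, legal burst
    lengths) stand in for the rules a real testbench would enforce.
    """
    return {
        "addr": rng.randrange(0x0000, 0x10000, 4),  # constraint: word-aligned
        "length": rng.choice([16, 32, 64, 128]),    # constraint: legal bursts
        "write": rng.random() < 0.5,
    }

rng = random.Random(42)  # seeded, so a failing run can be reproduced
for _ in range(3):
    print(constrained_random_packet(rng))
```

The seed matters in practice: constrained random flows depend on being able to replay any stimulus sequence that exposes a bug.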

Recently, I was able to sit down with a number of experts from a variety of software and hardware verification companies to talk about these issues. We discussed the above questions from the perspectives of hardware and software teams. We also discussed how these two disciplines are being forced to interweave with one another as products become increasingly embedded and time-to-market windows shrink as the IoT becomes commoditized. I've included my conversations with Jama Software, Dassault Systèmes, and Coverity in this first installment of a two-part series.

Jama Software

I spoke with Eric Nguyen of Jama Software about a class of traceable metrics that is enabling faster decisions on requirements changes, allowing for an overall speedup in product delivery. Eric emphasized that the flexible interface between what winds up being implemented as hardware or software is putting more pressure on system engineers and architects to properly translate business requirements into product design requirements. He mentioned that, ironically, the very ease, flexibility, and speed with which software can handle requirement changes create some of the most daunting ripples within system-level designs. Finally, Eric pointed out that there are productivity gains to be had by tracking metrics pertaining to the network of blocks within a design, the people working on these blocks, and the parameters that went into each block's associated design decisions.

Some examples of these high-level metrics, which track each design block and the decisions influencing its construction, are listed here (a data-model sketch follows the list):

  • The individuals who are responsible for, and work with, each block in the system
  • Which stakeholders made, or participated in, each design decision for each block
  • The network of other design blocks that either influenced the decision or were influenced by it
  • Documentation describing why the decision was made, as well as the considerations that applied to each block in the impacted/impacting network
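One way to picture these records is as a small graph of design blocks, owners, and decisions. The sketch below is a hypothetical data model of my own devising, not Jama Software's schema; names such as DesignBlock and Decision are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    """A recorded design decision and the context around it."""
    summary: str               # why the decision was made
    stakeholders: list[str]    # who made or participated in it
    considerations: list[str]  # factors weighed for each affected block

@dataclass
class DesignBlock:
    name: str
    owners: list[str]  # individuals responsible for the block
    decisions: list[Decision] = field(default_factory=list)
    influences: list["DesignBlock"] = field(default_factory=list)

def impacted_blocks(root: "DesignBlock") -> set[str]:
    """Walk the influence network to find every block a change touches."""
    seen: set[str] = set()
    stack = [root]
    while stack:
        block = stack.pop()
        if block.name not in seen:
            seen.add(block.name)
            stack.extend(block.influences)
    return seen

# Usage: a change to the bus arbiter ripples into the DMA engine and cache.
dma = DesignBlock("dma_engine", owners=["asha"])
cache = DesignBlock("l2_cache", owners=["wei"])
arbiter = DesignBlock("bus_arbiter", owners=["kim"], influences=[dma, cache])
print(impacted_blocks(arbiter))  # {'bus_arbiter', 'dma_engine', 'l2_cache'}
```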

These metrics lead to faster, more productive, less error-prone decision-making by lending themselves to ad hoc collaboration. When a requirement change is necessary, the affected parties can all discuss the change and view the associated impacts immediately. Frequently, in Eric's experience, this leads to decisions being made on the spot, reducing the need for large meetings. When large meetings are required, they're typically used to hash out the finer details; the need to communicate context to the team at large is obviated by the collaboration and availability of the metrics mentioned above.

Dassault Systèmes

Michael Munsey, Director of ENOVIA Semiconductor Strategy at Dassault Systèmes, opened our conversation with a brief enumeration of the different requirements at every level of a design project: "… for example, you have requirements defined by your technology, like design rule checks. You also have requirements defined by the marketing team that define what a given design must do. These requirements lead to design specifications that are used to generate verification requirements." He emphasized that it's crucial to be able to communicate all of these requirements, and their status, to everyone on the project.

Communication has to move forward and backward along the chain between marketing and engineering.  With an effective requirements and metrics integration, not only can the business team see that their requirements are being designed, verified, and tested; they can also see which aspects of the project will be impacted by a proposed change in the specification of a product.  For example, the project’s stakeholders can determine how many modules will have to be changed based on a new requirement, as well as how many man-hours and compute resources will go into making and verifying the change.
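As a rough illustration of that kind of roll-up, the sketch below counts impacted modules and sums estimated effort from a requirement-to-module trace. The trace table, requirement IDs, and hour figures are invented for the example; they are not ENOVIA data or APIs:

```python
# Hypothetical trace: which modules implement each requirement, plus a
# per-module estimate of the effort to change and re-verify it.
trace = {
    "REQ-101": ["uart_ctrl", "dma_engine"],
    "REQ-102": ["dma_engine", "l2_cache", "bus_arbiter"],
}
effort_hours = {"uart_ctrl": 40, "dma_engine": 120, "l2_cache": 80, "bus_arbiter": 60}

def change_cost(requirement: str) -> tuple[int, int]:
    """Return (number of impacted modules, total estimated hours)."""
    modules = trace.get(requirement, [])
    return len(modules), sum(effort_hours[m] for m in modules)

count, hours = change_cost("REQ-102")
print(f"REQ-102 touches {count} modules, roughly {hours} engineering hours")
```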

When I asked Mike whether he'd encountered any cultural adoption issues while working with customers to integrate their communication and requirements tracking solutions, the answer was surprising: "No." The business people welcomed the visibility into the chip design process, integrating the requirements and metrics into their existing business-intelligence systems. Engineering, driven by ever-increasing design complexity, was happy to have an objective picture of its progress readily available for management, and to be able to gauge what management's new requirements would cost in terms of schedule and resources. Integrating the engineering metrics provided by Dassault's tools into existing business-intelligence systems wasn't much of an issue either. Businesses are used to aggregating information from disparate automated sources, so much so that an entire field of systems-integration consulting has sprung up around the task.

To wrap up, I asked Mike what value requirements-driven verification could bring to the arena of software/hardware co-design, where the partitioning of hardware and software resources has become very fluid. Mike mentioned that the software world has its own metric driven tools. Designers on both sides of the HW/SW fence see value in being able to tie the metrics from hardware and software tools together at a higher level of abstraction. With all the metrics clearly visible, more intelligent decisions can be made, decisions that more easily factor in the impact of changes to the hardware/software requirements and the partition between them.
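One plausible way to stitch the two domains together, sketched below under my own assumptions rather than as a description of any vendor's tooling, is to key both hardware and software metrics to a shared requirement ID and report the weaker of the two views:

```python
# Hypothetical per-requirement coverage exported by the two flows.
hw_coverage = {"REQ-102": 0.91, "REQ-103": 0.67}  # e.g., functional coverage
sw_coverage = {"REQ-102": 0.88, "REQ-103": 0.95}  # e.g., test/line coverage

def rollup(req: str) -> float:
    """A requirement only counts as verified to the degree that
    both the hardware and software views agree it is."""
    return min(hw_coverage.get(req, 0.0), sw_coverage.get(req, 0.0))

for req in sorted(set(hw_coverage) | set(sw_coverage)):
    print(f"{req}: {rollup(req):.0%} verified across HW and SW")
```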

Coverity

James Croall of Coverity pointed out that many of his customers' requirements are driven by industry standards. For example, coding standards are common in the mil/aero, automotive, and medical device industries.

Some examples of standards-driven requirements are listed below (a test-prioritization sketch follows the list):

  • Checking for code that will inadvertently cause system failures or resource leaks
  • Verifying that code implementing new product requirements has actually been executed
  • Ensuring that when code is revised, the necessary testcases are run both on the changed code and on any other modules impacted by the change
  • Analyzing what code needs to be tested immediately, prioritizing complex code over, for example, a file of code-instrumentation routines used to decorate objects for hands-on debug
  • Removing all unnecessary code (e.g., code not associated with a feature, never executed, or unreachable) from the final release
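To illustrate the prioritization item above, here is a minimal sketch that ranks source files for immediate testing using a crude complexity proxy (counting branch points). A commercial static-analysis tool computes far richer measures; the file layout and scoring here are assumptions for illustration only:

```python
import re
from pathlib import Path

# Crude stand-in for cyclomatic complexity: count branching constructs.
BRANCH_PATTERN = re.compile(r"\b(?:if|for|while|case)\b|&&|\|\|")

def complexity_proxy(path: Path) -> int:
    text = path.read_text(errors="ignore")
    return 1 + len(BRANCH_PATTERN.findall(text))

def test_priority(paths: list[Path]) -> list[tuple[str, int]]:
    """Most complex files first: test these before simple helpers such
    as debug-instrumentation routines."""
    scored = [(p.name, complexity_proxy(p)) for p in paths]
    return sorted(scored, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    sources = list(Path("src").glob("**/*.c"))  # assumed project layout
    for name, score in test_priority(sources):
        print(f"{score:4d}  {name}")
```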

When I asked James what metrics should be associated with top-level requirements when changes are mandated by the management team, he explained that, as software requirements change, it's very useful to automatically detect the associated changes in the code base: a requirement change that necessitates either new code or modifications to existing code. Starting from this inception point, metrics should be collected on the new or modified code modules' execution status, on any other code modules impacted by the change, and on the testcases associated with all of these impacts. These automated processes provide immediate feedback on the new requirement's implementation status, as well as a gauge of its impact on the project.
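A bare-bones version of that detection loop might look like the sketch below, which asks git for the files changed since a tagged requirements baseline and maps them onto requirements. The requirement map, file paths, and tag name are hypothetical; a production flow would pull them from a traceability database:

```python
import subprocess

# Hypothetical mapping from source files to the requirements they implement.
requirement_map = {
    "src/dma.c": ["REQ-102"],
    "src/uart.c": ["REQ-101"],
}

def changed_files(baseline: str = "req-baseline") -> list[str]:
    """Files touched since the last approved requirements baseline."""
    result = subprocess.run(
        ["git", "diff", "--name-only", baseline, "HEAD"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.split()

def impacted_requirements() -> set[str]:
    reqs: set[str] = set()
    for path in changed_files():
        reqs.update(requirement_map.get(path, []))
    return reqs

if __name__ == "__main__":
    for req in sorted(impacted_requirements()):
        print(f"{req}: re-run associated testcases and refresh execution metrics")
```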

Parting thoughts for Part I

In closing, we've seen that requirements mean a wide variety of things to different stakeholders, even within a given project. However, with a steadily growing stable of metric collection and analysis tools in both the software and hardware industries, the time is ripe to harness up and reap the benefits of requirements driven verification.

There are many metaphors for requirements driven verification. One of the more colorful ones poses it as the harness that hitches the horse teams of metric driven verification to the carts carrying business-level requirements.

James Croall, Product Marketing Director at Coverity

James Croall is a Product Marketing Director at Coverity and over the last 8 years has helped a wide range of customers incorporate static analysis into their software development lifecycles. Before joining Coverity, Mr. Croall spent 10 years in the computer and network security industry, including research and development positions at MITRE, McAfee, and Symantec.

Michael Munsey, Director of ENOVIA Semiconductor Strategy at Dassault Systèmes

Michael Munsey brings 25 years of semiconductor and EDA experience to Dassault Systèmes, where he is responsible for semiconductor strategy. Michael has been working in the semiconductor, EDA, and electronics industries since obtaining his BSEE from Tufts University in 1989. Starting directly from university as a senior associate engineer at IBM, he went on to senior positions in applications engineering, sales, and product marketing at Viewlogic Systems, Sente, Sequence Design, Tanner EDA, and Cadence. Most recently, Michael was Vice President of Marketing at Silicon Dimensions and Senior Director for Enterprise Solutions at Cadence.

Eric Nguyen, Director of Business Intelligence at Jama Software

Eric Nguyen is Director of Business Intelligence at Jama Software. You can catch up with Eric's latest Jama Software blog posts at http://www.jamasoftware.com/blog/author/enguyen/. Prior to joining Jama, Eric served in a number of executive roles at Admax Network, WebMD Health Services, GE Healthcare, and MedicalLogic.
