Aug. 01, 2018 – Tool providers have continually improved the performance, capacity, and memory footprint of functional verification engines over the past decade. Today, although the core anchors are still formal verification, simulation, emulation, and FPGA-based prototyping, a new frontier focused on the verification fabric itself aims to make better use of these engines through planning, resource allocation, and metrics tracking.
At the same time, artificial intelligence (AI), big data, and machine learning are top of mind for every design team asking, "How do we make verification even more efficient, given that all the core engines have improved and continue to improve? What's the next level? When am I done verifying?"
A panel I moderated at the recent Design Automation Conference in San Francisco explored these and other topics. The panelists were Jeff Ohshima, member of the technology executive team at Toshiba Memory; Paul Cunningham, corporate vice president and general manager for R&D at Cadence; David Lacey, verification scientist at Hewlett Packard Enterprise (HPE); and Jim Hogan, managing partner at Vista Ventures.