Siemens Digital Industries Software tied two themes of the Design Automation Conference (DAC) together in one new product when it announced Solido Design Environment, an artificial intelligence (AI)-powered IC design and verification software tool that can accelerate the design cycle.
Amit Gupta, vice-president and general manager of Siemens EDA’s custom IC verification division, founded Solido Design Automation in 2005, using machine learning and AI algorithms to improve the power, performance and area (PPA) of electronic chips. The company was acquired in 2017 and its products were brought into the Siemens portfolio.
The need for speed in IC design is driven by applications such as HPC (high performance computing) and automotive, says Gupta. There are more variables and parasitics as designs become larger, denser and more complex, he says. “AI can close the productivity gap, but optimising engines to improve performance help the human operator with software that provides analytics and can provide new insights for debug, diagnosis and yield,” he adds.
The Solido Design Environment software is designed for the increased complexity of IC designs driven by HPC, the IoT, wireless and automotive applications, says Siemens. It features AI technology and delivers cloud-ready signoff variation analysis, seamlessly integrated into an intelligent design environment, says the company.
To serve the multiple disciplines of the IoT, HPC, automotive and wireless markets, it provides a single, comprehensive cockpit that handles nominal and variation-aware analysis, including SPICE-level circuit simulation setup, measurements and regressions, as well as waveform and statistical results analysis.
Adding AI
AI is used to accelerate the design cycle and to close the productivity gap, explains Gupta.
The initial software introduces additive learning technology, bringing AI decisions and analysis to design and verification. Gupta estimates that it can accelerate the process by a factor of 10 to 100, achieving verification accuracy up to Six Sigma, and higher yield, faster than brute-force Monte Carlo, while helping to significantly improve coverage and accuracy.
Machine learning enables design teams to measure performance and power consumption and achieve far more coverage in fewer simulations than conventional software would require. To verify a design to Six Sigma, that is, to ensure a failure rate of only about one in a billion, would conventionally take six to seven billion simulations, running over weeks or even months.
The same level of coverage can be achieved in a few thousand simulations using the software’s machine learning technology, he says. This avoids both over-designing and under-designing, and therefore optimises PPA and yield with full coverage of the design.
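The arithmetic behind those figures is worth spelling out: at a one-in-a-billion failure rate, brute-force Monte Carlo needs on the order of ten billion samples just to observe a handful of failures, while variance-reduction techniques can estimate the same tail probability from a few thousand. The Python sketch below uses importance sampling, one classic such technique, purely to illustrate the statistics; it is not Siemens’ additive learning algorithm, and the single-parameter pass/fail criterion is hypothetical.

```python
# A minimal sketch (not Siemens' proprietary additive learning) of why
# brute-force Monte Carlo is impractical at Six Sigma, and how importance
# sampling, a classic variance-reduction technique, recovers the same tail
# estimate from only thousands of "simulations".
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical pass/fail criterion: the design fails when a single
# standard-normal process parameter drifts beyond 6 sigma.
SPEC = 6.0
p_true = norm.sf(SPEC)  # one-sided tail ~9.87e-10, i.e. ~1 in a billion

# Brute force: observing ~10 failures needs roughly 10 / p_true samples.
print(f"true failure rate       : {p_true:.3e}")
print(f"brute-force sims needed : {10 / p_true:.2e}")  # ~1e10 simulations

# Importance sampling: draw from a distribution shifted into the failure
# region, then reweight each sample by the likelihood ratio p(x) / q(x).
N = 5_000                          # a few thousand "simulations"
x = rng.normal(loc=SPEC, size=N)   # biased sampling distribution q
weights = norm.pdf(x) / norm.pdf(x, loc=SPEC)
p_est = np.mean(weights * (x > SPEC))
print(f"importance-sampling estimate from {N} sims: {p_est:.3e}")
```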
Gupta outlines a roadmap of tools built on the additive learning engine: one to determine whether additive learning can be applied, and an assistive AI that identifies which transistors are responsible for failures and suggests ways to make the design meet specifications.
It also suggests how the design can run faster and use less power in a smaller area, using automation and user-friendly visualisation to speed up iterations.
Generative AI for RTL
AI is also helping to accelerate RTL (register transfer level) design and implementation. Cadence’s Joules RTL Design Studio uses actionable intelligence, says the company, to speed both RTL design and implementation.
It provides digital design analysis and debugging capabilities from a single, unified cockpit, enabling fully optimised RTL design before implementation handoff. Used with Cadence’s AI portfolio, it allows users to leverage generative AI for RTL design exploration and big data analytics. The company says its accelerated physical estimates, speed and accuracy can increase productivity by 500% and deliver up to 25% QoR (quality of results) improvements in the RTL.
An intelligent RTL debugging assistant provides early PPA and congestion metrics, as well as actionable debugging information throughout the design cycle, across logical, physical and production implementation, allowing designers to explore ‘what-if’ scenarios and potential resolutions to minimise iterations.
The software uses the same engines as Cadence’s Innovus implementation system, Genus synthesis solution and Joules RTL Power solution, giving users access to all analysis and design exploration features from a single GUI for optimal QoR.
Integrating Joules RTL Design Studio with Cadence’s generative AI tool, Cerebrus Intelligent Chip Explorer, opens up design space scenarios such as floorplan optimisation and frequency-versus-voltage tradeoffs. The company’s Joint Enterprise Data and AI platform allows trend and insight analysis across different versions of the RTL or across previous project generations.
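As a rough illustration of the frequency-versus-voltage tradeoff that such design-space exploration weighs, the sketch below applies the classic CMOS dynamic-power approximation P ≈ C·V²·f. The capacitance value and operating points are hypothetical, and this is generic textbook physics rather than anything specific to Cerebrus.

```python
# A generic back-of-the-envelope sketch of the frequency-versus-voltage
# tradeoff; not Cerebrus's algorithm. Dynamic power scales roughly as
# P = C_eff * Vdd^2 * f, so chasing frequency (which typically needs more
# voltage) raises power faster than performance.
C_EFF = 1.2e-9  # hypothetical effective switched capacitance, in farads

def dynamic_power(v_dd: float, f_hz: float, c_eff: float = C_EFF) -> float:
    """Classic CMOS dynamic-power approximation: P = C * V^2 * f."""
    return c_eff * v_dd ** 2 * f_hz

# Hypothetical candidate operating points for the tradeoff study.
for v, f in [(0.7, 1.0e9), (0.8, 1.5e9), (0.9, 2.0e9)]:
    print(f"Vdd={v:.1f} V, f={f / 1e9:.1f} GHz -> P={dynamic_power(v, f):.3f} W")
```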
Other features in Joules RTL Design Studio include lint checker integration, to run lint checks incrementally and rule out data and setup issues early. The unified cockpit offers physical design feedback, localisation and categorisation of violations, bottleneck analysis and cross-probing between RTL, schematic and layout, adds Cadence.
Dr Chin-Chi Teng, senior vice-president and general manager of the Digital & Signoff Group at Cadence, estimates that giving designers all the physical information needed for PPAC debug, without having to wait for implementation, shortens tasks that previously took days or weeks. “Joules RTL Design Studio gives designers visibility into the challenges when they can still be addressed easily, ultimately speeding time to market,” he says.
Digital bridge
Ansys and Altium have announced they are joining forces to create a digital bridge connecting Altium’s electronic CAD (ECAD) tools and Ansys’ Electronics Desktop, a platform that provides access to the company’s electromagnetics simulation tools (for example, HFSS, Maxwell and Q3D Extractor).
The two partners expect the bi-directional integration to accelerate design, reduce errors and replace import/export translation and manual communication. It was demonstrated at DAC in San Francisco and will be available in the second half of this year.
The connection will streamline the exchange of design data and enable engineers to work together more effectively within a fully integrated workflow, say the companies. By removing the need for import/export translations and replacing manual, ad-hoc communication methods, the integration supports predictive accuracy, synchronisation and productivity while reducing the risk of errors. The digital bridge is therefore expected to minimise the potential for respins and accelerate the development cycle.
PCBs are used in multiple industries and applications, such as automotive, consumer electronics and the IoT, explains Ansys. As connectivity advances from wearable technology to autonomous vehicles, electronics designs increasingly involve fragile components such as sensors and ICs, which need predictively accurate modelling and simulation for design success, continues the company.
Comprehensive electronic design requires an evaluation of signal and power integrity, electromagnetic compatibility, thermal mechanical stresses and electronics reliability. Ansys offers end-to-end simulation solutions for PCBs, ICs, and IC packages to evaluate an entire system, says the company.
John Lee, vice-president of the electronics, semiconductor and optics business unit at Ansys, is enthusiastic about the digital bridge engineers can use for a high degree of connectivity during design and development. “With a bi-directional link between Ansys and Altium solutions… engineers will no longer be slowed down or interrupted by data communication and can focus on design, innovation, and collaboration.”
Best Paper Award
Two other partners, Synopsys and Georgia Tech, celebrated winning the DAC 2023 Best Paper Award.
A team of engineers from the Synopsys EDA Group collaborated with Dr Yi-Chen Lu, then a PhD student and former Synopsys intern, and Dr Sung-Kyu Lim, a professor at the university’s School of Electrical and Computer Engineering and director of the Georgia Tech Computer-Aided Design Laboratory, to present ‘RL-CCD: Concurrent Clock and Data Optimization Using Attention-Based Self-Supervised Reinforcement Learning’.
It was the culmination of six months’ collaborative research into methods to drive concurrent clock and datapath (CCD) optimisation in physical design, and was recognised by the judges for its “innovative approach in applying reinforcement learning techniques to push the boundaries on [PPA] in digital designs”.
Based on the research, RL-CCD emerged as a method to improve CCD quality by prioritising endpoints for useful skew optimisation using a self-supervised attention mechanism. The team’s experimental results on 18 industrial designs at 5nm-12nm process technologies demonstrated that RL-CCD can deliver up to 64% better total negative slack compared to the best-in-class production solution for CCD.
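The paper defines the full attention-based reinforcement-learning architecture; as a loose, hypothetical sketch of the prioritisation idea alone, the Python below scores timing endpoints with a softmax “attention” weighting and hands the top-ranked ones to useful-skew optimisation first. The feature matrix and query vector are random stand-ins for what the real system would learn.

```python
# A loose, hypothetical illustration of endpoint prioritisation only; it is
# not the RL-CCD architecture from the paper. Endpoints that score highest
# under a softmax "attention" weighting are optimised first.
import numpy as np

rng = np.random.default_rng(1)

n_endpoints, n_features = 1000, 8
feats = rng.normal(size=(n_endpoints, n_features))  # per-endpoint timing features
w_query = rng.normal(size=n_features)               # learned in the real system

# Softmax attention over endpoints: a higher weight means higher priority.
logits = feats @ w_query
attn = np.exp(logits - logits.max())
attn /= attn.sum()

# Hand the top-k endpoints to useful-skew optimisation first.
k = 50
priority = np.argsort(attn)[::-1][:k]
print("endpoints to optimise first:", priority[:10])
```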