From left, Professor Heein Yoon of the Department of Electrical Engineering at Ulsan National Institute of Science and Technology (UNIST) and researcher Sungjin Kim. Courtesy of UNIST
A new technology has been unveiled that can reduce the time required for communication semiconductor circuit design by 76% by optimizing the process with artificial intelligence (AI).
Ulsan National Institute of Science and Technology (UNIST) announced on the 5th that a team led by Professor Heein Yoon of the Department of Electrical Engineering, in collaboration with a team led by Professor Dae-Geun Song of Kyungpook National University, has developed an AI model that automates the entire process from LC voltage-controlled oscillator (VCO) circuit design to its physical layout on a chip. The research results were published in the international journal 'IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems' on the 3rd of last month.
An LC-VCO is a semiconductor circuit that generates frequency signals in high-speed communication systems such as 5G. To reduce noise and power consumption, the circuit must be designed by combining variables such as inductor and transistor sizes.
During the layout stage, when the circuit is transferred onto the actual chip, 'parasitic effects' (unintended side effects such as stray resistance and added noise) can arise depending on wiring thickness and component placement. These effects often undermine the originally designed parameter combination.
The research team developed an AI model that jointly optimizes both the circuit design and the layout design. In the circuit design stage, it applies reinforcement learning to adjust design variables, searching for a combination that satisfies the target frequency and performance. In the layout stage, where the chip structure is determined, it uses gradient descent to iteratively correct the wiring width and other physical design variables.
Reinforcement learning is a technique in which an AI learns through trial and error to maximize rewards, while gradient descent is an optimization method that minimizes error (loss) by incrementally adjusting values from their current state.
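The gradient-descent idea described above can be illustrated with a minimal sketch. The quadratic loss, the target wire width of 2.0, and the learning rate below are all illustrative assumptions, not values from the study; a real layout optimizer would minimize a parasitic-aware cost over many variables.

```python
# Minimal gradient-descent sketch for one layout variable (hypothetical toy loss).

def loss(width):
    """Toy cost: penalize deviation from an assumed optimal wire width of 2.0."""
    return (width - 2.0) ** 2

def grad(width):
    """Analytic derivative of the toy loss."""
    return 2.0 * (width - 2.0)

width = 5.0          # initial wire-width guess
lr = 0.1             # learning rate (step size)
for _ in range(100): # iteratively correct the variable, as in the layout stage
    width -= lr * grad(width)

print(round(width, 3))  # converges to the assumed optimum, 2.0
```

Each step moves the variable a small distance opposite the loss gradient, which is exactly the "modify values from their current state to reduce error" behavior the article describes.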
The research team explained, "This technology allows an AI to integrally manage the circuit schematic design and physical layout, which were previously optimized separately."
For the inductor, a component that induces voltage in response to changes in current, a deep learning-based prediction model significantly reduced the overall design time. Notably, the AI completed tasks that previously required repeated electromagnetic simulations in just a few milliseconds (ms, one-thousandth of a second).
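The surrogate idea here is that a neural network maps inductor geometry directly to electrical parameters, replacing a slow electromagnetic (EM) solver at inference time. The sketch below uses a tiny, randomly initialized network; the feature names, network shape, and outputs are hypothetical stand-ins for the trained model in the study.

```python
# Sketch of a neural surrogate for inductor modeling (all weights untrained).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical geometry features: [outer diameter (um), trace width (um), turns]
geometry = np.array([[120.0, 6.0, 3.0]])

# A tiny MLP standing in for a trained surrogate.
W1 = rng.normal(size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(size=(16, 2)); b2 = np.zeros(2)  # outputs: [L (nH), Q]

def surrogate(x):
    """One forward pass: geometry -> predicted (inductance, quality factor)."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

pred = surrogate(geometry)  # a single matrix-multiply pass: sub-millisecond,
                            # versus minutes for a full EM simulation
print(pred.shape)
```

The speedup comes from the fact that inference is a handful of matrix multiplications, so a design loop can query the surrogate thousands of times at negligible cost.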
Test results showed that the developed AI model completed in just 28.5 hours a task that took 119 hours with conventional automated design methods. The figure of merit of the final product was also superior to that of previous studies.
The model can continue designing based on its learned knowledge even when process characteristics change. An AI trained on a 65-nanometer (nm, one-billionth of a meter) process can perform designs on 40nm and 28nm processes using only about 10% of the data originally required for training. The team plans to extend the technology to the automation of analog and radio-frequency (RF) circuit design.
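The transfer described above can be sketched as warm-started fine-tuning: a model fitted on abundant source-node data is re-fitted on a small target-node dataset. The linear model, synthetic data, and the exact 10% split below are illustrative assumptions, not the study's actual setup.

```python
# Sketch of cross-process transfer: fine-tune a warm-started model on 10% data.
import numpy as np

rng = np.random.default_rng(1)

def make_data(n, slope):
    """Hypothetical node-specific relation between a design variable and a target."""
    x = rng.uniform(0, 1, size=(n, 1))
    return x, slope * x

def fit(x, y, w=0.0, steps=200, lr=0.5):
    """Fit y = w*x by gradient descent on squared error, from a warm start w."""
    for _ in range(steps):
        w -= lr * np.mean(2 * x * (w * x - y))
    return w

# "Source node": plenty of data, trained from scratch.
x_src, y_src = make_data(1000, slope=3.0)
w_src = fit(x_src, y_src)

# "Target node": only 10% as much data, warm-started from the source model.
x_tgt, y_tgt = make_data(100, slope=3.3)
w_tgt = fit(x_tgt, y_tgt, w=w_src)

print(round(float(w_src), 1), round(float(w_tgt), 1))  # 3.0 3.3
```

Because the target-node fit starts near a good solution, it needs far fewer samples and iterations than training from scratch, which is the practical benefit the article attributes to the model.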
The research team stated, "This can improve the performance of frequency generation circuits, key components in 5G/6G communication and AI chips, while significantly lowering design costs," adding, "In the mid-to-long term, it is a tool that can help solve the shortage of semiconductor design engineers and dramatically accelerate the transition to next-generation processes."
– doi.org/10.1109/TCAD.2026.3680789
Copyright ⓒ DongA Science. All rights reserved.