
Semiconductors have a supply problem — AI can help

22 August 2022
Smart robots powered by artificial intelligence can help with Industry 4.0 as well as smart semiconductor manufacturing. Source: AdobeStock/Xiaoliangge

Supply chain disruptions wreaked havoc during the COVID-19 pandemic, not least the global semiconductor shortage that resulted. Many consumers may not understand what semiconductors are, but few get through a day without them: the chips are needed for everything from smartphones to car features.

While lockdowns and other health and safety complications made supply chain issues worse, the problem was brewing well before the virus hit, because semiconductors are notoriously difficult and time-consuming to produce. Fortunately, there is a potential solution: the use of artificial intelligence (AI).

Challenges of semiconductor production

The supply chain and production problems during the pandemic only exacerbated existing trends. Geopolitical tensions have made it harder to source materials from China, while demand has been rising due to the 5G rollout and increasing digitization. In fact, global chip sales are forecast to grow by 10% in 2022.

It typically takes around three months to make a semiconductor chip, and the process involves 700 steps, not to mention the complexity of building factories and equipment capable of producing them. To make matters worse, chips have become increasingly complex, requiring significant research and design work that in turn drives up costs. While a 65 nanometer node costs about $28 million to develop, a 5 nanometer node costs a whopping $540 million.

To keep meeting rising demand while holding costs down, manufacturers need to use every tool at their disposal, above all technological advances. However, the semiconductor industry has been partly left behind in the shift to digitalization. A KPMG report found an 89% lag behind the rest of the tech sector, with the industry continuing to favor more traditional technologies and leaving simple tasks to people rather than handing them off to technology.

But this may be about to change.

The role of AI

While AI has typically attracted attention for its applications in exciting areas such as robotics and self-driving cars, its most powerful contributions may well come in places most people would overlook, such as semiconductor manufacturing.

The term “artificial intelligence” encompasses all kinds of solutions, but one of the most revolutionary has been the development of machine learning. With machine learning, developers do not have to teach a computer how to do something (play chess, for instance) by programming explicit instructions for how it should react in every situation. Instead, an algorithm works through data on its own and keeps learning continuously.
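As a rough illustration of the difference, consider a minimal sketch in Python using scikit-learn. The process readings, labels and values below are entirely hypothetical; the point is only that the model infers its own rules from examples rather than being given them.

from sklearn.tree import DecisionTreeClassifier

# Hypothetical process readings (temperature, pressure) with pass/fail labels
X = [[210, 1.1], [215, 1.0], [190, 0.8], [185, 0.9], [220, 1.2], [180, 0.7]]
y = ["pass", "pass", "fail", "fail", "pass", "fail"]

model = DecisionTreeClassifier()
model.fit(X, y)                      # the model infers its own rules from the examples
print(model.predict([[205, 1.05]]))  # classify a new, unseen reading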

This may involve a developer feeding software information in the form of videos or articles, or letting a machine learn through trial and error in real experience or simulations. For instance, a machine could model the impact of various government responses to a pandemic to figure out which one would reduce deaths the most, or “read” multiple medical texts to power a healthcare advice chatbot.

Even more excitingly, machine learning can be combined with other technologies such as the internet of things (IoT), in which devices including sensors collect data continuously, as many devices already do. The combination of constant access to real-time data and the ability to crunch that data into conclusions and decisions can be a powerful one.
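As a hedged sketch of what that combination might look like in code, the snippet below checks a simulated stream of sensor readings against the recent trend and flags sharp deviations; the readings and the three-sigma threshold are made up purely for illustration.

from collections import deque
from statistics import mean, stdev

window = deque(maxlen=50)            # recent readings from one (hypothetical) IoT sensor

def check_reading(value):
    """Flag a reading that deviates sharply from the recent trend."""
    if len(window) >= 5:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) > 3 * sigma:
            return "alert"           # e.g. trigger an inspection before more material is lost
    window.append(value)
    return "ok"

for reading in [1.01, 1.02, 0.99, 1.00, 1.03, 2.40]:   # simulated real-time feed
    print(reading, check_reading(reading))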

AI and semiconductors

How does this fit into the semiconductor industry? AI could be employed at various stages of production to improve operations, from research and design to manufacturing to sales.

Yet McKinsey research suggests that currently only 30% of semiconductor manufacturers are using AI in their operations, with the remainder researching AI to some extent but not yet pushing it into practice. For those firms willing to be pioneers, here are some of the areas where AI could be most useful.

AI in research and design

Some of the biggest improvements can take place before manufacturing even starts, with McKinsey predicting that machine learning could cut the costs associated with research and chip design by 28% to 32%. That may be an even larger saving than the cost reductions available at later stages.

Since machine learning can run simulations without the expense of real labor and materials, it offers a low-stakes way to figure out how to lower costs and improve efficiency. AI may even be able to handle some of the repetitive steps involved in creating initial designs, making it easier to identify potential problems ahead of time and determine which designs will work best.
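One way to picture this, purely as an assumption-laden sketch, is a surrogate model trained on past simulation results that scores new design candidates cheaply before any expensive simulation is run. The parameter names and numbers below are invented for illustration and do not come from any real design flow.

from sklearn.ensemble import RandomForestRegressor

# Hypothetical past results: (gate_length_nm, supply_voltage_v) -> simulated power in mW
past_designs = [[65, 1.2], [45, 1.1], [28, 1.0], [14, 0.9], [7, 0.8]]
simulated_power_mw = [120.0, 95.0, 70.0, 52.0, 38.0]

surrogate = RandomForestRegressor(n_estimators=50, random_state=0)
surrogate.fit(past_designs, simulated_power_mw)

# Score candidate designs instantly, then send only the most promising
# ones through the full (expensive) simulation flow
candidates = [[10, 0.85], [12, 0.9], [20, 0.95]]
for params, est in zip(candidates, surrogate.predict(candidates)):
    print(params, round(float(est), 1), "mW (estimated)")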

AI in manufacturing

If AI devices were placed at different points in the manufacturing process, they could account for and analyze materials at each stage, reducing losses and delays while cutting costs.

Sensors could also help identify defects or problems early in the production process, removing the need for humans to carry out inspections. That would reduce both resource wastage and the need for manual labor. Over time, AI could also find better ways to carry out processes and boost efficiency.
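As an illustrative sketch only (the measurements are hypothetical), an unsupervised model can learn what normal in-line measurements look like and flag outliers for review, so people only need to inspect the flagged runs.

from sklearn.ensemble import IsolationForest

# Hypothetical in-line measurements: (film_thickness_nm, overlay_error_nm)
normal_runs = [[50.1, 2.0], [49.8, 2.1], [50.3, 1.9], [50.0, 2.2],
               [49.9, 2.0], [50.2, 2.1], [50.1, 1.8], [49.7, 2.3]]

detector = IsolationForest(contamination=0.1, random_state=0).fit(normal_runs)

new_runs = [[50.0, 2.0], [55.4, 6.8]]          # the second run looks suspect
for run, label in zip(new_runs, detector.predict(new_runs)):
    print(run, "flag for inspection" if label == -1 else "ok")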

AI in inventory management

Perhaps one of the easiest places to apply AI is inventory management, where it can eliminate the time-consuming work of hunting down stock and keeping track of where it is.

IoT devices can also track the state of inventory and the location of items, and machine learning can then work out what managers should do with that information, such as making sourcing decisions based on supply and demand or surfacing other insights.
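A minimal sketch of that kind of decision support, assuming hypothetical usage figures reported by IoT-tracked bins, might forecast demand from a simple trend and compare it with the stock on hand.

import numpy as np

# Hypothetical weekly usage of one component, reported by IoT-tracked bins
weekly_usage = [120, 130, 125, 140, 150, 155]
weeks = np.arange(len(weekly_usage))

# Fit a simple linear trend as a stand-in for a fuller forecasting model
slope, intercept = np.polyfit(weeks, weekly_usage, 1)
forecast_next_week = slope * len(weekly_usage) + intercept

on_hand = 400                        # current stock level
lead_time_weeks = 3                  # how long a new order takes to arrive
reorder_point = forecast_next_week * lead_time_weeks

if on_hand < reorder_point:
    print(f"Reorder now: projected need {reorder_point:.0f} exceeds stock on hand ({on_hand})")
else:
    print("Stock covers the forecast lead time")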

The problems the semiconductor industry faces may not be well-known, but the impact of solving them will be felt across the globe, allowing everyone to access the technology they need and want in plentiful quantities and at reasonable prices. AI and machine learning may be the key to reducing shortages and production times.

About the author

David Cotriss is an award-winning writer of over 500 news and feature articles on business and technology. His website is davidcotriss.com.

To contact the author of this article, email GlobalSpecEditors@globalspec.com

