Is AI the best news for semiconductors since microprocessor days?

The microprocessor acts as the brain of any computer system, whether the device serves the Internet of Things (IoT), consumers, or the cloud. In fact, the semiconductor industry owes much of its success to Intel's introduction of the 4004 microprocessor in the early 1970s, which later evolved into the 808X series and led to the widespread adoption of chips.

Even today, microprocessors receive more media coverage than any other type of chipset. They remain the single most expensive semiconductor component in the bill of materials (BOM) of many devices, and server processors can fetch as much as $1,000 for a single chip.

Why are microprocessors able to command such high premiums? One could answer that question with an analogy to human anatomy. Given the advances in science, we can fix the heart, limbs, and other parts of the body, but there is still no easy fix for the brain. Microprocessors have generally acted as the powerful brain of computer systems, one that only a select few companies are able to manufacture and advance in the market. It is therefore not surprising that microprocessors command higher margins and premiums. Even today, Intel's margins are close to 60%.

In 2017, we are on the cusp of the artificial intelligence (AI) revolution. The need for faster hardware for AI applications has long been identified in academia, and the industry has only just begun creating new architectures. Central processing unit (CPU), graphics processing unit (GPU), field programmable gate array (FPGA), and application-specific integrated circuit (ASIC) companies are investing heavily in the area and introducing new products at a rapid pace.

These AI chipsets will act as the brain of these advanced systems, providing intelligence in conjunction with the microprocessor. While the "thinking" was previously done by the processor, the vast majority of it will be done by AI chipsets in the future. This means that deep learning chipsets will be able to command similar, if not higher, premium pricing. This is great news for the semiconductor industry, which has suffered from low growth rates due to eroding average selling prices (ASPs).

Another potential direction in which AI chipsets could take the semiconductor industry is the introduction of new business models. For example, the semiconductor industry could sell boxes or cloud services. NVIDIA is already offering its DGX-1 box, optimized for deep learning, for $129,000, which is far above a standard chip's ASP. Nervana, a company building ASICs for deep learning (recently acquired by Intel), planned to offer a cloud service built on its chipset. In the future, the cloud could be enhanced to include intelligent compute, storage, and network systems. Such a system could adapt based on the demand and nature of the application, scaling up or down as needed, while providing maximum performance at any given time.

This could also lead to a few other interesting evolutions of service-based business models, such as an "intelligence as a service" model. This model would enable companies to periodically update the neural networks on their customers' devices, ensuring that the devices are always up for any task and equipped with the latest data. Services like "vision as a service" or "emotions as a service" could also emerge, with devices adding such capabilities as and when needed.
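To make the "intelligence as a service" idea concrete, here is a minimal sketch of the update flow it implies: a vendor publishes new versions of a neural network, and deployed devices periodically pull the latest one. All class and model names here are illustrative assumptions, not any vendor's actual API.

```python
# Hypothetical "intelligence as a service" update flow (illustrative only):
# a vendor-side registry of published network versions, and devices that
# sync their on-board network against it.

class ModelRegistry:
    """Vendor-side store of published network versions."""

    def __init__(self):
        self._models = {}  # model name -> (version, weights)

    def publish(self, name, version, weights):
        # Keep only the newest version of each named network.
        current = self._models.get(name)
        if current is None or version > current[0]:
            self._models[name] = (version, weights)

    def latest(self, name):
        return self._models.get(name)


class Device:
    """Customer device that syncs its on-board network with the registry."""

    def __init__(self, model_name):
        self.model_name = model_name
        self.version = 0
        self.weights = None

    def sync(self, registry):
        # "Periodic update": pull the latest network if it is newer than ours.
        latest = registry.latest(self.model_name)
        if latest and latest[0] > self.version:
            self.version, self.weights = latest
            return True   # network was updated
        return False      # already current


registry = ModelRegistry()
registry.publish("vision-net", 1, weights=b"v1-weights")

camera = Device("vision-net")
camera.sync(registry)       # first sync pulls version 1
registry.publish("vision-net", 2, weights=b"v2-weights")
camera.sync(registry)       # a later periodic sync picks up version 2
print(camera.version)       # 2
```

In a real deployment the registry would sit behind a network service with authentication, and devices would poll or receive push notifications; the sketch only captures the version-comparison logic at the heart of the model.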

AI will impact the semiconductor industry through higher premiums and new business models. The AI revolution is just beginning, and we will see many innovations in the very near future.

Note: This blog post was originally published on the Tractica website.
