The exciting world of AI

AI keeps evolving… on a weekly basis! Ever since I started following this industry, not a single week has passed when I haven’t heard something interesting. While this is fantastic for the world and for consumers, I sometimes worry that things are getting unnecessarily hyped (just as they’ve always been. Remember the dot-com boom?). I mean, what is the big deal if someone was able to generate an artsy image using an autoencoder in a research paper? It is just a paper – it will take years of work before human art loses its meaning and AI art takes over. (And I would even argue that AI art will not come anywhere close to competing with human-created art. That is why we have those pictures you can buy at Bed Bath & Beyond for twenty bucks, and then there’s Picasso!) …

Who will go into mass production first for AI ASICs?

Artificial intelligence (AI) and deep learning have generated a lot of excitement over the past few years. Many semiconductor startups have emerged to build chipsets optimized for AI. They are tackling the compute, communication, and memory problems specific to accelerating AI algorithms, and building highly optimized architectures that promise low power and high performance. Nervana, which got started in 2014, was perhaps the first company to build a chipset specifically for AI. Nervana wanted to sell cloud services based on its chipsets and bypass selling the application-specific integrated circuits (ASICs) altogether. When Intel bought the company for $350+ million, the move got everyone’s attention and suggested that exciting times were ahead for the AI chipset industry. At the time, Intel announced that the silicon would be available in 2H 2017.

Is AI the best news for semiconductors since microprocessor days?

The microprocessor chip acts as the brain of any computer system, whether the device serves the Internet of Things (IoT), consumer applications, or the cloud. In fact, the semiconductor industry owes much of its success to Intel’s introduction of the 4004 microprocessor back in the 1970s. That part later evolved into the 808X series, which led to the widespread adoption of chips.

Even today, microprocessors receive more media coverage than any other type of chipset. They remain the single most expensive semiconductor component in the bill of materials (BOM) of many devices, and server processors can fetch as much as $1,000 for a single chip.

ASICs for deep learning are not exactly ASICs

The term “application-specific integrated circuit” (ASIC) became popular in the 1990s, when such chips promised to bring down the cost per chip for a given application, such as mobile phones or Ethernet cards. An ASIC, by definition, meant developing hardware to solve a problem by building gates that implement the logic directly. These chips offered little programmability but provided maximum performance within a given power and cost budget.

It is hard to trace the origins of the word ASIC as a label for deep learning chipsets. Perhaps the deep learning chipset industry started using the word to differentiate itself from the rest of the chipset market. The ASIC movement for deep learning is driven entirely by startups and is making headlines of …