New Artificial Intelligence Chips Lean Toward the Edge
What a difference a year makes. Few companies had enjoyed the sort of bull run AI chipmaker Nvidia (NVDA) had been on, returning more than +1,200% between June 2015 and June 2018 and eventually hitting a market cap of about $175 billion by September 2018. Then everything went south as the market took a historic plunge in the fourth quarter, taking Nvidia with it. However, while many companies have bounced back, Nvidia has continued to languish, sitting at a valuation of about $88 billion, pretty much where it was circa May 2017 when we compared its AI chip technology against AMD (AMD). Over the last five years, the two chip manufacturers have returned almost identical value to investors, while a number of upstarts have risen to challenge Nvidia’s supremacy with new artificial intelligence chips.
In fact, it was exactly three years ago that we first introduced you to five startups building artificial intelligence chips, and then followed that up with 12 new AI chip makers in 2017. Last year, we noted that the Chinese are also gunning for Nvidia with their own homegrown artificial intelligence chips, as China seeks to dominate everything to do with AI and other emerging technologies. Now there are at least 40 different startups that have raised capital to develop AI hardware, according to research firm CB Insights, within an ecosystem of some 2,000 companies that have raised $19 billion in equity for AI-related software or services. Here’s a look at just some of the startups working on new AI chips (data taken from Crunchbase).
| Company | Location | Total Funding | Last Funding Date | Last Funding Amount |
| --- | --- | --- | --- | --- |
| Horizon Robotics | Beijing, China | $700M | Feb-2019 | $600M |
| SambaNova Systems | Silicon Valley | $206M | Apr-2019 | $150M |
| Wave Computing | Campbell, California | $203.3M | Nov-2018 | $86M |
| Cambricon Technologies | Beijing, China | $200M | Jun-2018 | $100M |
| Habana Labs | San Jose, California | $120M | Nov-2018 | $75M |
| ThinkForce Electronic Technology | Shanghai, China | $68M | Dec-2017 | $68M |
| ThinCI | El Dorado Hills, California | $65M | Sep-2018 | $65M |
| Esperanto Technologies | Silicon Valley | $63M | Nov-2018 | $58M |
| DeePhi Tech | Beijing, China | $40M | Oct-2017 | $40M |
| Hailo | Tel Aviv, Israel | $24.5M | Jan-2019 | $8.5M |
| Nervana Systems | San Diego, California | $24.4M | Jun-2015 | $20.5M |
| Tachyum | San Jose, California | $17M | Mar-2019 | $17M |
| NovuMind | Santa Clara, California | $15.2M | Dec-2016 | $15.2M |
| Flex Logix Technologies | Silicon Valley | $12.4M | May-2017 | $5M |
| Untether AI | Toronto, Ontario | $10M | Apr-2019 | $10M |
| Cornami | Santa Clara, California | $6.5M | Nov-2017 | $1.7M |
| krtkl | San Francisco, California | $160,000 | Oct-2015 | $160,000 |
While few if any of these AI chip startups pose a direct threat to Nvidia at this point (its current struggles stem from the cryptocurrency crash and supply chain problems), it can’t be long before some of these new companies start carving out market share with potentially better and cheaper hardware.
Why Does AI Need Special Chip Hardware?
The computing needs of AI applications like facial recognition differ from those of your basic laptop or smartphone, let alone the huge servers that power cloud services or supercomputers. Nvidia’s graphics processing units (GPUs) were originally designed for the intense computing demands of games and other complex software. It was a stroke of luck that they also worked well for neural networks, AI systems that mimic the human brain in order to do things like detect patterns in data that might help stop cyberattacks or even discover new drugs.
If AI software is the new oil, then artificial intelligence chips are the engines that run on it. You might say we’re about to move from the Model T phase right into the Tesla phase, as startups begin customizing hardware for different applications. Many of the new companies are targeting what’s called edge computing, in which some of the heavy AI computation takes place on the device itself, a market where many of the Chinese startups are particularly active.
The $100 Million AI Chip Club
It’s really only in the last two or three years that investors have started to put serious money into some of the startups developing artificial intelligence chips. We count at least eight AI chip startups that have raised $100 million or more so far, three of which are in Beijing. While the top two AI chip companies by funding are Chinese, the third most funded AI chip company is a U.K. startup called Graphcore.
With $310 million in capital and a $1.7 billion valuation in only three short years of existence, Graphcore is by far the best-funded non-Chinese firm of the bunch. It has an impressive roster of investors, including Microsoft (MSFT), Sequoia Capital, and BMW, all of which chipped in for the $200 million Series D last December. The company is developing an artificial intelligence chip it calls an intelligence processing unit (IPU), which it claims is a better approximation of the human brain. In effect, its design sacrifices a certain amount of number-crunching precision so the machine can tackle more math more quickly with less energy. Or, as a recent deep dive in Bloomberg explained: “It’s sort of the equivalent of a human brain shifting from calculating the exact GPS coordinates of a restaurant to just remembering its name and neighborhood.”
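To make that precision trade-off concrete, here’s a minimal NumPy sketch (purely illustrative; it has nothing to do with Graphcore’s actual silicon or software) showing how half-precision arithmetic halves the memory per value while nudging a matrix product slightly off the full-precision answer:

```python
import numpy as np

# Illustrative only: compare a matrix product at single precision (fp32)
# against the same product at half precision (fp16).
rng = np.random.default_rng(0)
a32 = rng.standard_normal((256, 256)).astype(np.float32)
b32 = rng.standard_normal((256, 256)).astype(np.float32)

# Half-precision copies use half the memory per element...
a16, b16 = a32.astype(np.float16), b32.astype(np.float16)
print(a32.nbytes, a16.nbytes)  # the fp16 arrays are half the size

# ...and the product lands close to, but not exactly on, the fp32 answer.
exact = a32 @ b32
approx = (a16 @ b16).astype(np.float32)
max_err = float(np.max(np.abs(exact - approx)))
print(f"max absolute error at half precision: {max_err:.4f}")
```

In hardware, that tolerance for small errors is what lets a chip designer pack more arithmetic units into the same silicon and power budget, which is the bet Graphcore and others are making.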
In the list of AI chip startups with more than $100 million in funding, there are a couple of high-rolling AI chip designers that we haven’t covered yet.
Silicon Valley startup SambaNova Systems, founded in 2017, has suddenly jumped into the fray with a $150 million Series B in April, bringing total funding to about $206 million and putting it on par with Wave Computing and its war chest of about $203.3 million. SambaNova also boasts investors like Google (GOOGL) and Intel, and its founders include a pair of Stanford professors and a former C-suite executive who did turns at Sun Microsystems and Oracle. While details are still sparse, SambaNova describes its approach as software-defined hardware, in which the software and algorithms define the processing power and dataflow requirements of the hardware. In other words, SambaNova is building a complete AI ecosystem from scratch.
Another startup that recently broke into the $100 million club is Habana Labs out of Israel, which raised a $75 million Series B last November, bringing total funding to $120 million. Apparently, Intel is hedging its bets, as it was the lead investor in the latest round, along with VC firm Bessemer Venture Partners, among others. Habana recently unveiled the first AI processor in its Goya line, focused on what is referred to as deep learning inference.
Habana Labs is designing chips that can run pre-trained algorithms. What does that mean exactly? Much as a picture can be compressed, so can a trained model (more technically, layers of the neural network can be merged together), which allows for a faster and less power-hungry processor than those needed for training neural networks.
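Here’s a toy sketch of that layer-merging idea in NumPy, with invented weights (an illustration of the general technique, not Habana’s actual method): two consecutive linear layers with no activation between them collapse into a single layer, so the deployed model does one matrix multiply instead of two.

```python
import numpy as np

# Two consecutive linear layers: y = W2 @ (W1 @ x + b1) + b2.
# With no nonlinearity between them, they fold into one layer:
# y = (W2 @ W1) @ x + (W2 @ b1 + b2).
rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((8, 4)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((3, 8)), rng.standard_normal(3)

def two_layers(x):
    return W2 @ (W1 @ x + b1) + b2

# Fold the weights once, offline, before shipping the model.
W_fused = W2 @ W1        # shape (3, 4)
b_fused = W2 @ b1 + b2   # shape (3,)

def fused_layer(x):
    return W_fused @ x + b_fused

x = rng.standard_normal(4)
print(np.allclose(two_layers(x), fused_layer(x)))  # same output, half the work
```

Real inference compilers do fancier versions of this (folding batch-norm into convolutions, for instance), but the principle is the same: do the bookkeeping once at compile time so the chip does less work per prediction.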
The company says its processors offer one to three orders of magnitude better performance than its competitors. In the graph above, Habana’s artificial intelligence chip can breeze through 15,000 images per second, while consuming only 100 watts of power.
Update 12/16/2019: Intel Corporation has acquired Habana Labs for approximately $2 billion to strengthen Intel’s artificial intelligence (AI) portfolio and accelerate its efforts in the nascent, fast-growing AI silicon market, which Intel expects to be greater than $25 billion by 2024.
More Startups Developing Artificial Intelligence Chips
Continuing down the list, we see eight more AI chip startups that we haven’t looked at yet, many of which are designing AI chips around inference applications, especially for edge computing devices, from self-driving cars to smartphones.
Founded in 2016, Silicon Valley-based Groq has raised a not-so-measly $62.3 million, including a $52.3 million round last September. There’s not much information on the startup except that it is reputedly developing a tensor processing unit (TPU), another type of AI-specific chip and the hardware Google designed in-house. Not coincidentally, some of Groq’s founders are former Google executives.
Founded way back in 2014, Silicon Valley-based Esperanto Technologies has raised a total of $63 million after a $58 million Series B last November. One of its primary investors is Western Digital (WDC), the data storage company. The startup is trying to do what most of its competitors are also seeking to accomplish (build more energy-efficient chips customized for specific applications) by adopting open standards for developing hardware and software.
Founded in 2017, Hailo hails from Tel Aviv, Israel, and has raised $24.5 million in disclosed capital, including an extended $8.5 million Series A in January, only about six months after taking in $12.5 million. The startup is focused on artificial intelligence chips for connected edge devices (aka the Internet of Things), recently rolling out its first processors, which it claims put the capability of a data center on your device.
Founded in 2016, Tachyum is nominally a Silicon Valley startup but is mainly based in the Slovak Republic, which loaned the company $17 million in March to build an R&D facility there. The company is developing what it calls the Prodigy Universal Processor Chip, which it claims is the smallest and fastest general-purpose, 64-core processor for data centers. It says the new hardware, which will be used in a new European supercomputer, requires a tenth of the power of other processors, reducing costs by a third.
Founded in 2015, NovuMind is yet another Silicon Valley startup developing artificial intelligence chips. It has raised $15.2 million. Its NovuTensor hardware, like Habana Labs’ chips, is designed with AI inference applications, such as object detection and classification, in mind. It also claims higher throughput than its competitors with low latency and power requirements.
Founded in 2017, Toronto-based Untether AI raised about $13 million in April from Intel. It’s working on hardware that MIT Technology Review says works at “warp speed” by “transferring data between different parts of the chip 1,000 times more quickly than a conventional AI chip.” Untether manages that while lowering power consumption by shrinking the physical distance between memory and processing, an approach known as near-memory computing.
Update 11/07/19: Untether AI has raised $7 million in Series A funding to lay runway for their next stage of growth. This brings the company’s total funding to $20 million to date.
Founded in 2014, Flex Logix Technologies has raised $12.4 million in disclosed funds. And, yes, it’s based in Silicon Valley. Like Hailo, Flex Logix is focused on AI chips for high-end edge devices like robots. It claims its new hardware, the InferX X1, offers nearly the throughput of a data center at just a fraction of the power and cost of competing hardware.
Founded in 2012, Cornami is based (you guessed it) in Silicon Valley, pulling together about $6.5 million to date, though it actually raised more than that prior to undergoing a name change. With a very experienced leadership team at the helm, Cornami is designing chips with elements specialized for both training neural networks and inference applications. While the company operated in stealth mode for a while, it began disclosing its approach late last year in a ZDNet article about its unique take on systolic arrays, a design concept dating back to the 1970s and ’80s. The company explained to us in simple terms how it differentiates this technology:
Systolic arrays are used by many AI processor architectures in the market, but they are deployed as a fixed square block accelerating a specific function. In other words, a fixed systolic array is customized to address a single problem, and running a slightly different workload on it results in inefficiencies in terms of power, performance, and latency. Cornami has patented “dynamic reconfigurable systolic arrays,” which can be dynamically reconfigured under software control to fit the algorithm and can take any shape. This delivers enormous advantages in matching compute resources to workload tasks, meaning higher performance at lower power.
“We are keeping the company stealth rather than spending investor funds in promoting unsubstantiated market claims,” the company told us, which is surely something its investors appreciate. With more than 75 issued patents thus far, it appears to be spending that money wisely.
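For the curious, the basic systolic-array idea (streaming operands through a grid of multiply-accumulate cells) can be simulated in a few lines of NumPy. This is a generic illustration of the classic technique, not Cornami’s patented reconfigurable design:

```python
import numpy as np

# Simulation of an output-stationary systolic array computing C = A @ B.
# Each processing element (i, j) owns one output cell; on the k-th "beat"
# of the array, it accumulates a[i, k] * b[k, j] as operands stream past.
def systolic_matmul(A, B):
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    for step in range(k):            # one beat of the array per step
        a_col = A[:, step]           # operands entering from the left
        b_row = B[step, :]           # operands entering from the top
        C += np.outer(a_col, b_row)  # every PE does one multiply-accumulate
    return C

A = np.arange(6).reshape(2, 3).astype(float)
B = np.arange(12).reshape(3, 4).astype(float)
print(np.allclose(systolic_matmul(A, B), A @ B))  # True
```

A fixed array hard-wires the grid shape, which is efficient only for workloads that match it; the pitch of a reconfigurable design is that the grid can be reshaped in software to fit whatever algorithm is running.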
You’re probably picking up on a few themes by now: Startups designing new artificial intelligence chips are looking at customizing the hardware for more granular applications, which leads to innovations in design and function. The end result is faster, leaner hardware that’s less power-hungry, which reduces costs in both cloud and edge computing. The question will be which of these many new startups can win customers and market share before they burn through all of their cash. No doubt some of the big fish will eat the little ones, with companies like Intel prowling the waters for a tasty acquisition.
If you enjoyed this article, then sign up for our free newsletter - Nanalyze Weekly. About every week, we'll send you a simple summary of all our new articles. If you didn't enjoy this article, share it on Twitter and tell everyone how much you hated it.