Nvidia expands in artificial intelligence, but Intel has taken root

Tinuku - Nvidia Corp dominates chips for training computers to think like humans, but it faces an entrenched competitor in a major avenue for expansion in the artificial intelligence chip market: Intel Corp.

Nvidia chips dominate the AI training chip market, where huge amounts of data help algorithms “learn” a task, such as how to recognize a human voice. But one of the biggest growth areas in the field will be deploying computers that implement the “learned” tasks, and Intel dominates the data centers where such tasks are likely to be carried out.


Intel processors are already widely used for taking a trained artificial intelligence algorithm and putting it to use, for example by scanning incoming audio and translating it into text-based requests, a process called “inference.” Intel’s chips can still work just fine there, especially when paired with large amounts of memory.
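The training/inference split the article describes can be illustrated with a toy sketch. This is a deliberately simplified, hypothetical example (a one-weight linear model, not anything Nvidia or Intel actually runs): training repeatedly adjusts parameters against example data, while inference is a single cheap pass with the already-learned parameters.

```python
def train(data, steps=1000, lr=0.01):
    """Training phase: repeatedly adjust a weight to fit (x, y) examples.
    This is the compute-heavy loop where Nvidia's chips dominate."""
    w = 0.0
    for _ in range(steps):
        for x, y in data:
            w -= lr * (w * x - y) * x  # gradient step on squared error
    return w

def infer(w, x):
    """Inference phase: apply the frozen, learned weight to new input.
    Comparatively lightweight, which is why ordinary CPUs remain viable."""
    return w * x

examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # data following y = 2x
w = train(examples)
prediction = infer(w, 10.0)  # close to 20.0
```

The asymmetry in the sketch mirrors the market split: training loops over the data thousands of times, while serving a prediction is a single multiply, so the hardware trade-offs differ for each phase.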

That market could be bigger than the training market: one forecast puts the inference market at $11.8 billion by 2021, versus $8.2 billion for training. Intel estimates that the current market for AI chips is about $2.5 billion, evenly split between inference and training.

Nvidia, which posted an 89 percent rise in profit Thursday, hasn’t given a specific estimate for the inference chip market, but CEO Jensen Huang told analysts on an earnings call that he believes it “is going to be a very large market for us.”

Nvidia sales of inference chips are rising. In May, the company said it had doubled its shipments of them year-over-year to big data center customers, though it didn’t give a baseline. Earlier this month, Alphabet Inc’s Google Cloud unit said it had adopted Nvidia’s inference chips and would rent them out to customers.

But Nvidia faces a headwind selling inference chips because the data center market is blanketed with the CPUs Intel has been selling for 20 years. Intel is working to persuade customers that for both technical and business reasons, they should stick with what they have.

Take Taboola, a New York-based company that helps web publishers recommend related content to readers and that Intel has touted as an example of how its chips remain competitive.

The company uses Nvidia’s training chips to teach its algorithm to learn what to recommend and considered Nvidia’s inference chips to make the recommendations. Speed matters because users leave slow-loading pages. But Taboola ended up sticking with Intel for reasons of speed and cost.

For one thing, Nvidia’s chip was far faster, but the time spent shuffling data to and from the chip negated the gains. For another, Intel dispatched engineers to help Taboola tweak its computer code so that the same servers could handle more than twice as many requests.

Nvidia has been working to solve those challenges. The company has rolled out software this year that makes its inference chips far faster and helps overcome the overhead of moving data back and forth. And earlier this week it announced a new family of chips based on a technology called Turing that it says will be 10 times faster still.

“We are actively working with just about every single Internet service provider in the world to incorporate inference acceleration into their stack. Voice recognition is only useful if it responds in a relatively short period of time. And our platform is just really, really excellent for that,” Huang said.