(Reuters) – Nvidia Corp (NVDA.O) dominates chips for training computers to think like humans, but it faces an entrenched competitor in a major avenue for expansion in the artificial intelligence chip market: Intel Corp (INTC.O). Nvidia chips dominate the AI training chip market, where huge amounts of data help algorithms “learn” a task such as how to recognize a human voice, but one of the biggest growth areas in the field will be deploying computers that implement the “learned” tasks.
Intel dominates data centers where such tasks are likely to be carried out. “For the next 18 to 24 months, it’s very hard to envision anyone challenging Nvidia on training,” said Jon Bathgate, analyst and tech sector co-lead at Janus Henderson Investors.
But Intel processors are already widely used for taking a trained artificial intelligence algorithm and putting it to use, for example by scanning incoming audio and translating that into text-based requests, a process called “inference.” Intel’s chips can still work just fine there, especially when paired with huge amounts of memory, said Bruno Fernandez-Ruiz, chief technology officer of Nexar Inc, an Israeli startup using smartphone cameras to try to prevent car collisions. That market could be bigger than the training market, said Abhinav Davuluri, an analyst at Morningstar, who sees an inference market of $11.8 billion by 2021, versus $8.2 billion for training.
Intel estimates that the current market for AI chips is about $2.5 billion, evenly split between inference and training. Nvidia, which posted an 89 percent rise in profit Thursday, hasn’t given a specific estimate for the inference chip market, but CEO Jensen Huang said on an earnings call with analysts on Thursday that he believes it “is going to be a very large market for us.” Nvidia’s sales of inference chips are rising.
In May, the company said it had doubled its shipments of them year-over-year to big data center customers, though it didn’t give a baseline. Earlier this month, Alphabet Inc’s (GOOGL.O) Google Cloud unit said it had adopted Nvidia’s inference chips and would rent them out to customers.