Nvidia's CEO defends his moat as AI labs change how they improve their AI models
“Foundation model pretraining scaling is intact and it’s continuing,” said Nvidia CEO Jensen Huang on Wednesday. “As you know, this is an empirical law, not a fundamental physical law, but the evidence is that it continues to scale. What we’re learning, however, is that it’s not enough.”
That's also what I would say if the growth of my business depended on people buying my chips, which are the gold standard for pretraining.
Source: TechCrunch