Why would they? They were first with CLAUDE.md. Others could have adapted to that if they wanted. I don't see a reason for Claude to change their approach.
Being a good citizen of the commons means not hard-coding product-specific things into a standard. ChatGPT or Gemini using a file called "CLAUDE" doesn't make sense. The first mover doesn't just automatically win.
It's similar to or better than Opus 4.5 per benchmarks while being 2x-3x cheaper; definitely worth it over Opus 4.6 if cost/tokens is the concern.
... attributionism for such a trivial thing is a waste of time. Multiple people can come up with a term like this independently because it's not that creative. People have been doing this with the ";dr" suffix for as long as it has been popular.
He sounds rattled; you don't respond in this manner from a position of power. They didn't need to respond at all. All they did by opening their mouth was bring more eyes to Anthropic.
> He [Jensen Huang] has also privately criticized what he has described as a lack of discipline in OpenAI’s business approach and expressed concern about the competition it faces from the likes of Google and Anthropic, some of the people said.
People talk about an AI bubble. What we actually have is a GPU bubble. NVidia makes really expensive GPUs for AI. Others also make GPUs.
Companies like Google produce and operate AI models largely using their own TPUs rather than NVidia's GPUs. We've seen the Chinese produce pretty competitive open models with either older NVidia GPUs or alternative GPUs because they are not allowed to buy the newer ones. And AMD, Intel and other chip makers are also eager to get in on the action. Companies like Microsoft, Amazon, etc. have their own chips as well (similar to Google). All the hyperscalers are moving away from NVidia.
And then Apple ships a non-Intel, non-NVidia range of workstations and laptops that are pretty popular with AI researchers because the M series CPU/GPU/NPU is pretty decent value for running AI models. You see similar movement with ARM chips from Qualcomm and others. They all want to run AI models on phones, tablets, laptops. But without NVidia.
NVidia's bubble is about vastly overcharging for a thing that only they can provide. Their GPU chips have enormous margins relative to CPU chips coming out of the same/similar machines. That's a bubble. As soon as you introduce competition, the companies with the best price/performance win. NVidia is still pretty good at what they do. But not enough to justify an order of magnitude price/cost difference.
NVidia's success has been predicated on its proprietary software and instruction set (CUDA). That's a moat that won't last. The reason Google can use its own TPUs rather than CUDA is that it worked hard to get rid of its CUDA dependence. Same for the other hyperscalers. At this point they can do training and inference without CUDA/NVidia, and it's more cost effective.
The reason that this 100B deal is apparently being reconsidered is that it is a bad deal for OpenAI. It was going to overpay for a solution that it can get cheaper elsewhere. It's bad news for NVidia, good news for OpenAI. This deal started out with just NVidia, but at this point there are also deals with AMD, MS, and others. OpenAI, like the other hyperscalers, is not betting the company on NVidia/CUDA. Good for them.
> People talk about an AI bubble. What we actually have is a GPU bubble. NVidia makes really expensive GPUs for AI. Others also make GPUs.
Yes it is, and I think for multiple reasons. Competition in that space not sleeping is one, but there's also a huge overestimation of demand, combined with the questionable belief that those GPUs and the datacenters housing them can actually be built and put into operation as fast as envisioned.
> The reason that this 100B deal is apparently being reconsidered is that it is a bad deal for OpenAI. It was going to overpay for a solution that they can get cheaper elsewhere. It's bad news for NVidia, good news for OpenAI. This deal started out with just NVidia. But at this point there are also deals with AMD, MS, and others. OpenAI like the other hyperscalers is not betting the company on NVidia/CUDA. Good for them.
I think in the case of OpenAI both may be true. While what you are saying makes sense (NV's first-mover advantage obviously can't last forever), OpenAI currently has little to no competitive advantage over other players. Combine this with the fact that some competitors (esp. Google) sit on a huge pile of cash. For OpenAI, in contrast, the party is pretty much over as soon as investors stop throwing money into the oven, so they might need to cut back a bit.
Over, and over, and over again, until ICE is disbanded and those involved are held accountable. When that happens (and how high the casualty number gets) is up to the American people.