Nvidia's stake in Intel could have terrible consequences. First, it is in Nvidia's interest to kill Intel's Arc graphics, and that would be very bad because it is the only thing bringing GPU prices down for consumers. Second, the death of Intel graphics / Arc would be extremely bad for Linux, because Intel's approach to GPU drivers is the best for compatibility, whereas Nvidia is actively hostile to drivers on Linux. Third, Intel is the only company marketing consumer-grade graphics virtualization (SR-IOV), and the loss of that would make Nvidia's enterprise chips the only game in town, meaning the average consumer gets less performance, less flexibility, and less security on their computers.
Conclusion: Buy AMD. Excellent Linux support with in-tree drivers, and has been for 15 years! And there, a bug is something that will actually get fixed.
Nvidia's GPUs are theoretically fast in initial benchmarks, but that's mostly because everyone else optimizes for Nvidia. That's it.
Everything Nvidia has done is a pain. Closed-source drivers (old pain), out-of-tree drivers (new pain), ignoring (or actively harming) Wayland (everyone handles implicit sync well, except Nvidia, which required explicit sync[1]), and awkward driver bugs declared as "it is not a bug, it is a feature". The infamous bug:
This extension provides a way for applications to discover when video
memory content has been lost, so that the application can re-populate
the video memory content as necessary.
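In practice, detection on the application side looks roughly like this (a minimal sketch, assuming a context created with robustness / reset notification so glGetGraphicsResetStatus is available; the re-upload helper is a hypothetical placeholder):

    #include <GL/gl.h>

    /* From NV_robustness_video_memory_purge; guard in case the header
       predates the extension. */
    #ifndef GL_PURGED_CONTEXT_RESET_NV
    #define GL_PURGED_CONTEXT_RESET_NV 0x92BB
    #endif

    void reupload_all_gpu_resources(void);  /* hypothetical app-side helper */

    void poll_vram_purge(void)
    {
        /* Core in GL 4.5; otherwise via the ARB/KHR robustness entry points. */
        GLenum status = glGetGraphicsResetStatus();
        if (status == GL_PURGED_CONTEXT_RESET_NV) {
            /* Video memory was purged (suspend/resume, VT switch, ...):
               texture and buffer contents may be garbage and must be
               re-uploaded by the application. */
            reupload_all_gpu_resources();
        }
    }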
This extension will soon be ten years old. At least they intend to fix it? They just didn't in the past 9 years! Basically, video memory could be gone after Suspend/Resume, VT-Switch and so on. The good news is, after years someone figured that out and implemented a workaround. For X11 with GNOME:
> Basically, video memory could be gone after Suspend/Resume, VT-Switch and so on.
This actually makes sense: for example, a new task has swapped out the previous task's data, or host and guest are sharing the GPU and pushing each other's data away. I don't understand why this is not part of GPU-related standards.
As for a solution, wouldn't discarding all the GPU data after resume help? Or keeping a copy of the data in system RAM.
Apparently it is 5% ownership. Does that give them enough leverage to tank Intel’s iGPUs?
That would seem weird to me. Intel's iGPUs are an incredibly good solution for their (non-glamorous) niche.
Intel’s dGPUs might be in a risky spot, though. (So… what’s new?)
Messing up Intel’s iGPUs would be a huge practical loss for, like, everyday desktop Linux folks. Tossing out their dGPUs, I don’t know if it is such a huge loss.
Intel's iGPUs don't seem very at risk because the market for low-power GPUs isn't very profitable to begin with. As long as Nvidia is able to sell basically any chip they want, why waste engineering hours and fab time on low-margin chips? The GT 1030 (Pascal) never got a successor, so that line is as good as dead.
Even before the Pascal GTs, most of the GT 7xx cards, which you would assume were Maxwell or Kepler from the numbering, were rebadged Fermi cards (4xx and 5xx)! That generation was just a dumping ground for all the old chips they had lying around, and given the prominence of halfway-decent iGPUs by that point, I can't say I blame them for investing so little in the lineup.
That said, the dGPUs are definitely somewhat at risk, but I think the risk is only slightly elevated by this investment, given that it isn't exactly a cash cow and Intel has been doing all sorts of cost-cutting lately.
Agree. Not only would there be no money in it to try to replace Iris graphics or whatever they call them now -- it would be ultra pointless because the only people buying integrated graphics are those where gaming, on-device AI, and cryptocurrency aren't even part of the equation. Now, that is like 80%+ of the PC market, but it's perfectly well served already.
I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.
Calling BS on "gaming not part of the equation". Several of my friends and I game exclusively on integrated graphics. Sure we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.
RDR2 is quite optimized. We spent a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is absolutely highly optimized, as exhibited by the large number of benchmarks on the web.
Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.
(And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)
> Sure we don't play the most abusively unoptimized AAA games like RDR2.
Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). Or with the Borderlands 4 controversy, I thought there'd be way bigger fish to fry.
"Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.
It would be amusing to see nVidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.
> Tossing out their dGPUs, I don’t know if it is such a huge loss
It would be an enormous loss to the consumer/enthusiast GPU buyer, as a third major competitor was finally improving a market stuck in what feels like years and years of dreadful price/perf ratios.
amd is slow and steady. they were behind many times, and many times they surprised everyone with amazing innovations, overtaking intel. they will do it again, for both CPU and GPU.
I would guess Nvidia doesn't care at all about the iGPUs, so I agree they are probably not at risk. dGPUs, though, I absolutely agree are in a risky spot. Perhaps Intel was planning to kill their more ambitious GPU goals anyway, but that seems extremely unhealthy for pretty much everyone except Nvidia.
We'd have to see their cap table approximation, but I've seen functional control over a company with just a hair over 10% ownership, given the voting patterns of the other stockholders.
5% by about any accounting makes you a very, very influential stockholder in a publicly traded company with a widely distributed set of owners.
- The datacenter GPU market is 10x larger than the consumer GPU market for Nvidia (and it's still growing). Winning an extra few percentage points in consumer is not a priority anymore.
- Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.
- Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshoring, which significantly improves Nvidia's supplier negotiation position and hedges geopolitical risk.
Nvidia's options are: fund your competition to keep the market dynamic, or let the government do it by breaking you apart.
So yes. That's how American competition works.
It isn't a zero sum game. We try to create a market environment that is competitive and dynamic.
Monopolies are a threat to both the company and a free, open, dynamic market. If Nvidia feels it could face an antitrust suit, which is reasonable, it is in its best interest to fund the future of Intel.
Will Nvidia continue to exist beyond the current administration? If yes, then would it be prudent to consider the future beyond the current administration?
One interesting parallel is the Intel and AMD x86 dispute back in 1991, which is the reason AMD is today allowed to produce x86 at all without massive patent royalties to Intel. [Asianometry](https://youtu.be/5oOk_KXbw6c) had a nice summary of it.
Nvidia is leaning more into data centres, but lacks a CPU architecture and expertise. Intel is struggling financially, but has knowledge in iGPUs and a vast number of patents.
They could have a lot to give one another, and it's a massive win if it keeps Intel afloat.
Microsoft wasn't funding a bankrupt Apple; Microsoft was settling a lawsuit with Jobs just on the cusp of the DOJ monopoly lawsuit. Microsoft had been stealing and shipping Apple QuickTime source code.
> handwritten note by Fred Anderson, Apple's CFO, in which Anderson wrote that "the [QuickTime] patent dispute was resolved with cross-licence and significant payment to Apple." The payment was $150 million
This seems like it could be a long term existential threat for AMD. AMD CPU + GPU combos are finally coming out strong, both with MI300+ series in the supercomputing space, Strix Halo in laptops, etc. They have the advantage of being able to run code already optimized for x86 (important for gamers and HPC code), which NVIDIA doesn't have. Imagine if Grace Blackwell had x86 chips instead of Arm. If NVIDIA can get Intel CPUs with its chip offerings, it could be poised to completely take over so many new portions of the market/consolidate its current position by using its already existing mindshare and market dominance.
The article hints at it, but my guess would be this investment is intended towards Intel foundry and getting it to a place where NVIDIA can eventually rely on them over TSMC — and the ownership is largely to give them upside if/when Intel stock goes up on news of an NVIDIA contract, etc. It isn't that uncommon an arrangement for enterprise deals of such a potential magnitude. Long-term, however, and without NVIDIA making the call, that could definitely have the effect of leading Intel to divest from directly competing in as many markets, i.e. Arc.
For context, I highly recommend the old Stratechery articles on the history of Intel foundry.
My first thought was also that this relates to Intel's foundry business. Even if only to be able to use it in price negotiations with TSMC (it's hard to threaten to go elsewhere when there is no elsewhere left).
This seems more like the deal where Microsoft invested in Apple. It’s basically charity and they will flip it in a few years when Intel gets back on their feet.
Something about this reminds me of other industry-gobbling purchases. None of them ever turned out better for the product, the price, or the general well-being of society.
As an Apple user (and even an Apple investor), I'd rather Apple had gone out of business back then. If we could re-roll the invention of the (mainstream) smartphone, maybe we'd get something other than two monopolistic companies controlling everything.
For instance, maybe if there were 4 strong vendors making the devices with diverse operating systems, native apps wouldn't have ever become important, and the Web platform would have gotten better sooner to fill that gap.
Or maybe it'd have ended up the same or worse. But I just don't think Apple being this dominant has been good for the world.
Or... we could still be using blackberry-like devices without much in the way of active/touch interface development at all. Or worse, the Windows CE or Palm with the pen things.
Well, AMD isn't going away yet, and they do seem to have finally realized the advantage of open-source drivers. But that's still very bad for competition and prices.
This is a death blow to the Intel GPU+AI efforts and should not be allowed by the regulators. It is clear that Intel needs the downstream, low-cost GPU market segment to have a portfolio of AI chips based on chiplets, where the partially defective dies end up in consumer-grade GPUs according to manufacturing yield. Nvidia's interest is now for Intel to enter neither the GPU market nor the AI market, which Intel was preparing for with its GPU efforts in recent years.
The US government is itself a major shareholder in Intel, and has every incentive to push Intel stock over its competitors. It's almost a certainty that Nvidia was forced into this deal by the government as well. We are way beyond regulation here.
They want a source of chips for the wars they want to conduct that is not controlled by the party they want to go to war with, nor way, way closer to that party than to them. Buying a chunk of Intel is a way of making sure Intel does the things the government wants that will lead to that outcome. Or at least so the theory goes; I've got my own cynicism on this matter and wouldn't dream of tamping down on anyone else's.
Right now if the US wants to go to war with China, or anyone China really really likes, they can expect with high probability to very quickly encounter major problems getting the best chips. AIUI the world has other fab capacity that isn't in Taiwan, and some of it is even in the US, but they're all on much older processes. Some things it's not a problem that maybe you end up with an older 500MHz processor, but some things it's just a non-starter, like high-end AI.
Sibling commenters discussing profits are on the wrong track. Intel's 2024 revenue, not profits, was $53.1 billion. The Federal Government in 2024 spent $6,800 billion. No entity doing $1.8 trillion in 2024 in deficit spending gives a rat's ass about "profits". The US Federal government just spends what it wants to spend, it doesn't have any need to generate any sort of "profits" first. Thinking the Federal government cares about profits is being nowhere near cynical enough.
This is generally true even setting aside the "war with China" angle. Intel is a large domestic company employing hundreds of thousands in a very critical sector, and the government has every incentive to prevent it from failing. In the last two decades we've bailed out auto companies and banks and US Steel (kinda) for the same reason.
> Right now if the US wants to go to war with China
The US is desperate to not have that war, because they spent so long in denial about how sophisticated China has become that it would be a total humiliation. What you see as the US wanting war is them simply playing catch up.
Concisely put. This is exactly the reasoning. The US is preparing for a potential war with China in 2026 or 2027, and this is how it is beginning preparations.
Sure, but this is an interest independent of the government holding Intel stock.
The US government always ought to have the interests of US companies in mind; their job is to work in the interest of the voters, and a lot of us work for US companies.
They can buy enough stock to shift the price, then use that as a lever to control their own investments' prices (and thence profits). Like they've done with tariffs.
That sounds more like an abuse of government powers for individual gain than any legitimate government interest. If that was the plan it would make just as much sense to short a company and then announce a plan to put them under greater regulatory scrutiny.
Shouldn't be, yes. Isn't? Have you seen the rhetoric around tariffs? A lot of people thought they wanted the government run like a business, so welcome to the for-profit government society.
Well, the AI bubble will eventually pop, since none of the major AI chatbots are remotely profitable, even on OpenAI's eye-watering $200/month plan, which very few have been willing to pay for, and on which OpenAI is still losing money. And when it pops, so will Nvidia's stock; it's only a matter of time.
The AI hype train was built on the premise that AI will keep progressing linearly and eventually end up replacing a lot of well-paid white-collar work, but it has failed to deliver on that promise so far, and progress has flatlined or sometimes even gone backwards (see GPT-5 vs 4o).
FAANG companies can only absorb these losses for so long before shareholders pull out.
The AI bubble pop is probably not something NVIDIA is super looking forward to, but of anybody near the bubble they are the least likely to really get hurt by it.
They don't really make AI chips, they make the best high-throughput, high-latency chips. When the AI bubble pops, there'll be a next thing (unless we're really screwed). They've got as good a chance of owning that next thing as anybody else does. Even better odds if there are a bunch of unemployed CUDA programmers to work on it.
When you follow the progress in the last 12 months, it really isn't. Big AI companies spent "hella' stacks" of cash, but delivered next to no progress.
Progress has flatlined. The "rocket to the moon" phase has already passed us by now.
The white collar worker doesn't need to be replaced for the bots to be profitable. They just need to become dependent on the bots to increase their productivity to the point where they feel they cannot do their job without the chatbot's help. Then the white collar worker will be happy to fork over cash. We may already be there.
Also never forget that in technology moreso than any other industry showing a loss while actually secretly making a profit is a high art form. There is a lot of land grabbing happening right now, but even so it would be a bit silly to take the profit/loss public figures at face value.
Numbers prove we aren't. Sales figures show very few customers are willing to pay $200 per month for the top AI chatbots, and even at $200/month OpenAI is taking a loss on that plan, so they're still losing money even with top-dollar customers.
I think you're unaware of just how unprofitable the big AI products are. This can only go on for so long. We're not in the ZIRP era anymore, where SV VC-funded unicorns could be unprofitable indefinitely and endlessly burn cash on the idea that once they eventually beat all competitors in the race to the bottom and become monopolies, they could finally turn a profit by squeezing users with higher real-world prices. That ship has sailed.
I don't think you can confidently say how it will pan out. Maybe OpenAI is only unprofitable at the 200/month tier because those users are using 20x more compute than the 20/month users. OpenAI claims that they would be profitable if they weren't spending on R&D [1], so they clearly can't be hemorrhaging money that badly on the service side if you take that statement as truthful.
"OpenAI claims that they would be profitable if they weren't spending on R&D "
Ermmm dude they are competing with Google. They have to keep reinvesting otherwise Google captures the users OAI currently has.
Free cash flows matter, not accounting earnings. On an FCFF basis they are largely in the red, which means they have to keep raising money; at some point somebody will turn around and ask the difficult questions. This cannot go on forever.
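For reference, the standard free-cash-flow-to-firm identity behind that claim:

    FCFF = EBIT × (1 − t) + D&A − capex − Δ(working capital)

If reinvestment (capex plus the change in working capital) keeps exceeding EBIT(1 − t) plus depreciation, FCFF stays negative and the company must keep raising outside capital, which is exactly the Amazon comparison below.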
And before someone mentions Amazon... Amazon raised enough money to sustain their reinvestment before they eventually got to the place where their EBIT(1-t) was greater than reinvestment.
He didn’t say anything violent. Have you watched the monologue?
Even if he did (which he didn’t), I don’t see Fox shutting down anything when one of their presenters recently stated, on air, that we should euthanize our homeless population.
To be clear (not that I agree with this situation):
Fox News (where that presenter works) is a cable network, beholden to the cable providers but not a broadcaster. The FCC has relatively little leverage to regulate it, because it does not rely on broadcast licenses.
ABC is a broadcast network. It relies on a network of affiliates (largely owned by a few big companies) who selectively broadcast its programming both over the airwaves and to cable providers. Those affiliates have individual licenses for their radio broadcasting bandwidth which the FCC does have leverage over (and whose content the FCC has a long history of regulating, but not usually directly over politics, e.g. public interest requirements, profanity, and obscenity laws).
Of course I watched it, many times. I didn't say he said anything directly violent, but he spread hateful disinformation about someone's death, entirely against the FBI's findings and common sense, during a time of the highest temperatures in a while. Just to try to win the attention of people that'd rather not look in the mirror.
This is exactly the kind of disingenuous, dehumanizing behavior that radicalizes people like Tyler. And saying that right now would be like if Reagan had gotten into a spat about something personal during the Cold War.
Intel had an opportunity to differentiate themselves by offering more VRAM than Nvidia is willing to put in their consumer cards. It seemed like that was where Battlemage was going.
But now, are they really going to undermine this partnership for that? Their GPUs probably aren't going to become a cash cow anytime soon, but this thing probably will. The mindset among American business leaders of the past two decades has been to prioritize short-term profits above all else.
It may be that Nvidia doesn't really see Intel as a competitor. Intel serves a part of the GPU market that Nvidia has no interest in. This reminds me a bit of Microsoft's investment in Apple: Microsoft avoided the scorn of regulators by keeping Apple around as a competitor, and if Apple succeeded, great, they'd make money off the deal.
I remember when I was studying for an MBA.. a professor was talking about the intangible value of a brand .. and finance.. and how they would reflect on each other ..
At some point we were decomposing the parts of a balance sheet and they asked if one could sell the goodwill to invest in something else .. and the answer was of course .. no… well.. America has proven us wrong .. the way you sell the goodwill is to basically enshittification.. you quickly burn all your brand reputation by lowering your costs with shittier products .. your goodwill goes to 0 but your income increases so stock go up .. the CEO gets a fat bonus for it .. even tho the company itself is destroyed .. then the CEO quickly abandons ship and does the same on their next company .. rinse and repeat… infinite money!
We always called this "monetizing the brand", and it's been annoying me since at least when Sperry went private equity and the shoes stopped being multi-year daily drivers.
I don't follow how it's a death knell for Intel AI chips. Nvidia bought shares, not a board seat. Maybe that's the plan, but if you take the example of Microsoft buying Apple shares, that only gave Apple a lifeline to build better. I do understand Nvidia wants the whole GPU market to themselves, but how will they do it?
I think the assumption there is that the strategic partnership that is part of the deal would in effect preclude Intel from aggressively competing with NVIDIA in that market, perhaps with the belief that the US government's financial stake in Intel would also lead to reduced antitrust scrutiny of such an agreement not to compete.
They literally bought board seats - not today, but shares entitle you to vote on board members at the next shareholder meeting. And $5bn of shares buys you a lot of votes.
$5bn may not buy a huge amount of voting power, but if there are close votes on important things then it could be enough to affect the company. Keeping one's enemies closer, regardless of voting, can also help overall.
The likelihood that Intel AI was going to catch up with efforts like AWS Trainium, let alone Nvidia, was already vanishingly small. This gives Intel a chance at maintaining leading-edge fab technologies.
I feel bad for gamers - I’ve been considering buying a B580 - but honestly the consumer welfare of that market is a complete sidenote.
I don't agree. oneAPI gets a lot of things right that ROCm doesn't, simply because ROCm is a 1:1 rip of what nvidia provides (warts and historical baggage included), whereas oneAPI was thoughtfully designed and did away with all of that. Intel has a strong history in networking, much stronger than Xilinx/AMD, and really was the best hope we had for an open standard to replace nvidia's hellscape.
> This gives Intel a chance at maintaining leading-edge fab technologies.
I don't think so:
> The chip giant hasn’t disclosed whether it will use Intel Foundry to produce any of these products yet.
It seems pretty likely this is an x86 licensing strategy for nvidia. I doubt they're going to be manufacturing anything on intel fabs. I even wonder if this is a play to get an in with Trump by "supporting" his nationalizing intel strategy.
nvidia doesn’t need x86, they’re moving forward on aarch64 and won’t look back. For example, one of the headlines from CUDA 13 is that sbsa can be targeted from all toolkits, not as a separate download, which is important for making it easy to target grace. They have c2c silicon on grace for native host side nvlink. They’re not looking back.
They're clearly looking back though, investing in Intel and announcing quite substantial partnerships. Maybe they're not looking back for technical reasons, but they are looking back.
> The likelihood that Intel AI was going to catch up with efforts like AWS Trainium, let alone Nvidia
...and yet Nvidia is not gambling with the odds. Intel could have challenged Nvidia on performance-per-dollar or per watt, even if they failed to match performance in absolute terms (see AMD's Zen 1 vs Intel)
That was quite a long time ago! Intel going down the chutes now isn’t an effective punishment for how it behaved under Andy Grove and won’t deter others from Grove-like behaviour. Instead it’ll just mean even less restraint on any of the big players with market power now, like nVidia, AMD and TSMC.
Consumer gpus are totally different products from the high end gpus now. Intel has failed on the gpu market and has effectively zero market share, so it is not actually clear there is an antitrust issue in that market. It would be nice if there was more competition but there are other players like AMD and a long tail of smaller ones
>Consumer gpus are totally different products from the high end gpus now. Intel has failed on the gpu market and has effectively zero market share, so it is not actually clear there is an antitrust issue in that market. It would be nice if there was more competition but there are other players like AMD and a long tail of smaller ones
I'm sorry that's just not correct. Intel is literally just getting started in the GPU market, and their last several releases have been nearly exactly what people are asking for. Saying "they've lost" when the newest cards have been on the market for less than a month is ridiculous.
If they are even mediocre at marketing, the Arc Pro B50 has a chance to be an absolute game changer for devs who don't have a large budget:
The latest Arc GPUs were doing well, and were absolutely an option for entry/mid-level gamers. I think lack of maturity was one of the main things keeping sales down.
Intel has been making GPUs for over 25 years. Claiming they are just getting started is absurd.
To that point, they've been "just getting started" in practically every chip market other than x86/x64 CPUs for over 20 years now, and have failed miserably every time.
If you think Nvidia is doing this because they're afraid of losing market share, you're way off base.
Sure, but claiming they have literally just started is completely inaccurate.
They've been making discrete GPUs on and off since the 80s, and this is at least their 3rd major attempt at it as a company, depending on how you define "major".
They haven't even just started on this iteration, as the Arc line has been out since 2022.
The main thing I learned from this submission is how much people hate Nvidia.
I love GPU differentiation, but this is one of those areas where Nvidia is justified in shipping less VRAM. With less VRAM, you can afford more memory controllers and a wider bus, pushing more bandwidth out of the same memory!
For instance, both the B50 and the RTX 2060 use GDDR6 memory. But the 2060 has a 192-bit memory bus, and enjoys ~336 GB/s bandwidth because of it.
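The arithmetic, assuming the 2060's stock 14 Gbps GDDR6 (its published spec): bandwidth = (bus width / 8) × per-pin data rate.

    192 bit / 8 × 14 Gbps = 24 bytes × 14 GT/s = 336 GB/s
    128 bit / 8 × 14 Gbps = 16 bytes × 14 GT/s = 224 GB/s

Same memory chips, but the wider bus buys 50% more bandwidth.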
I don't know what anybody would do with such a weak card.
My RTX 5090 is about 10x faster (measured by FP32 TFLOPS) and I still don't find it to be fast enough. I can't imagine using something so slow for AI/ML. Only 2.2 tokens/sec on an 8B parameter Llama model? That's slower than someone typing.
I get that it's a budget card, but budget cards are supposed to at least win on pure price/performance, even with a lower performance baseline. The 5090 is 10x faster but only 6-8x the price, depending on where in the $2,000-3,000 range you can find one.
I feel as though you are measuring tokens/s wrong, or have a serious bottleneck somewhere. On my i5-10210u (no dedicated graphics, at standard clock speeds), I get ~6 tokens/s on phi4-mini, a 4b model. That means my laptop CPU with a power draw of 15 watts, that was released 6 years ago, is performing better than a 5090.
> The 5090 is 10x faster but only 6-8x the price
I don't buy into this argument. A B580 can be bought at MSRP for $250. An RTX 5090 from my local Microcenter is around $3,250. That puts the B580 at around 1/13th the price.
Power costs can also be a significant factor if you choose to self-host, and I wouldn't want to risk system integrity for 3x the power draw, 13x the price, a melting connector, and Nvidia's terrible driver support.
EDIT: You can get an RTX 5090 for around $2,500. I doubt it will ever reach MSRP though.
I've been using Mistral 7B, and I can get 45 tokens/sec, which is PLENTY fast, but to save VRAM so I can game while doing inference (I run an IRC bot that allows people to talk to Mistral), I quantize to 8 bits, which then brings my inference speed down to ~8 tokens/sec.
For gaming, I absolutely love this card. I can play Cyberpunk 2077 with all the graphics settings set to the maximum and get 120+ fps. Though when playing a much more graphically intense game like that, I certainly need to kill the bot to free up the VRAM. But I can play something simpler like League of Legends and have inference happening while I play with zero impact on game performance.
I also have 128 GB of system RAM. I've thought about loading the model in both 8-bit and 16-bit into system RAM and just swap which one is in VRAM based on if I'm playing a game so that if I'm not playing something, the bot runs significantly faster.
Hold on, you're only getting 45 tokens/sec with Mistral 7B on a 5090 of all things? That card gets ~240 tokens/sec with Llama 7B quantized to 4 bits on llama.cpp [1], and those models should be pretty similar architecturally.
I don't know exactly how the scaling works here, but considering that LLM inference is memory-bandwidth limited, you should get beyond 100 tokens/sec with the same model at an 8-bit quantization.
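A rough sanity check, assuming the 5090's published ~1.8 TB/s memory bandwidth and that generating each token streams every weight once (ignoring KV-cache and activation traffic):

    8-bit 7B model: ~7 GB of weights  → 1800 / 7  ≈ 257 tokens/s ceiling
    fp16 7B model: ~14 GB of weights → 1800 / 14 ≈ 128 tokens/s ceiling

So 45 tokens/s at fp16, let alone 8 at int8, points at a software bottleneck rather than the card.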
My understanding is that quantizing lowers memory usage but increases compute usage because it still needs to convert the weights to fp16 on the fly at inference time.
Clearly I'm doing something wrong if it's a net loss in performance for me. I might have to look more into this.
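For intuition, here's a toy sketch of on-the-fly dequantization (illustrative only, not llama.cpp's actual kernel layout): the conversion is roughly one extra multiply per weight, which tuned kernels amortize per block, so the halved memory traffic should normally win on a bandwidth-bound GPU.

    #include <stdint.h>

    /* Toy int8 dot product: weights stored quantized, dequantized on the fly.
       Real kernels use per-block scales and SIMD/tensor cores, but the idea
       is the same: int8 reads move half the bytes of fp16. */
    float dot_q8(const int8_t *w, const float *x, float scale, int n)
    {
        float acc = 0.0f;
        for (int i = 0; i < n; i++)
            acc += ((float)w[i] * scale) * x[i];  /* dequantize, multiply, add */
        return acc;
    }

If quantization makes things slower, the backend is probably falling back to an unoptimized path rather than paying some inherent cost.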
> My RTX 5090 is about 10x faster (measured by FP32 TFLOPS) and I still don't find it to be fast enough. I can't imagine using something so slow for AI/ML. Only 2.2 tokens/sec on an 8B parameter Llama model? That's slower than someone typing.
It's also orders of magnitude slower than what I normally see cited by people using 5090s; heck, it's even much slower than I see on my own 3080 Ti laptop card for 8B models, though I usually won't use more than an 8bpw quant for that size of model.
Other than the market segmentation over RAM amounts, I don't see very much difference. There's some but there's been some for a long time. Isn't AMD re-unifying their architectures?
> There's some but there's been some for a long time. Isn't AMD re-unifying their architectures?
Yes.
> Other than the market segmentation over RAM amounts, I don't see very much difference.
The difference between CDNA and RDNA is pretty much how fast they crunch FP64, plus SR-IOV. Prior to RDNA, AMD GPUs were jacks of all trades with a compute bias, which made them bad for gaming unless the game was specifically written around async compute. The Vega 64 has more FP64 compute than the 4080, for context.
I think if AMD had been able to get a solid share of the datacenter GPU market, they wouldn't have unified. This feels like the CDNA team couldn't justify its existence.
The alternative currently looks like cutting Intel up piecemeal to make a quick buck just to stay afloat. The GPU division is not profitable and may be destroyed if the overall financials don't improve.
Right now, for the US national interests, our biggest concern is that Intel continues to exist. Intel has been making crappy GPUs for 25 years. They weren’t going to start making great GPUs now.
Besides, who would actually use them if they don’t support CUDA?
Everyone designs better GPUs than Intel; even Apple's ARM GPUs had been outpacing Intel's for a decade before the M series.
Why does it matter if Intel exists if they can't compete? AMD exists. The only point of hoping they remain is to create an environment of competition as that drives development and progress.
Though fair and free markets are not at all what the current regime in the US believes in; instead it will be consolidation, leading to waste and little innovation or progress.
So you don’t see the difference in the threat level of China bombing and invading Taiwan - which they already claim they own - and China attacking the US directly?
So it's just an imagined subtext that China, which has been rabble-rousing about taking over Taiwan, is more likely to attack a tiny island nation right next to it than to attack the US?
Wait, what? Intel GPU+AI efforts? People had to come together to fund the abandoned Intel SW development team. Intel GPUs are great at what they do, but they are no Nvidia; I don't even think that was on the roadmap. Also, you don't know what Nvidia wants. Maybe they want to flood the low end to destroy AMD, benefiting consumers. We just don't know.
The reason Nvidia is buying now doesn't have anything to do with Arc or GPU competition. There are mainly two reasons.
1) This year, Intel, TSMC, and Samsung announced their latest factories' yields. Intel was the earliest, with 18A, while Samsung was the most recent. TSMC yielded above 60%, Intel below 60%, and Samsung around 50% (but Samsung's tech is basically a generation ahead and technically more precise), and Samsung could improve its yields the most due to the way it set up its processes, where 70% is the target. Until last year, Samsung was in second place; with Intel catching up so fast and taking Samsung's position, at least for this year, Nvidia bought Intel stock, which has been getting cheaper since COVID.
2) It's just generally good to diversify into your competitors. Every company does this, especially when the price is cheap.
I am curious where you get your information about Samsung being more “precise”.
I was recently looking into 2nm myself, and based on the Wikipedia article on 2nm, TSMC 2nm is about 50% more dense than the Samsung and Intel equivalents. They aren't remotely the same thing. Samsung 2nm and Intel 18A are about as dense as TSMC 3nm, which has been in production for years.
> I was recently looking into 2nm myself, and based on the Wikipedia article on 2nm, TSMC 2nm is about 50% more dense than the Samsung and Intel equivalents.
I did the math on TSMC N2 vs Intel 18A, and the former is 30% denser according to TSMC
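For the curious, the commonly cited high-density logic estimates (third-party analyst figures, so treat as approximate): TSMC N2 at roughly 313 MTr/mm² versus Intel 18A at roughly 238 MTr/mm², and 313 / 238 ≈ 1.32, i.e. about 30% denser.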
> For personal computing, Intel will build and offer to the market x86 system-on-chips (SOCs) that integrate NVIDIA RTX GPU chiplets. These new x86 RTX SOCs will power a wide range of PCs that demand integration of world-class CPUs and GPUs.
I don't think this is Intel trying to save itself, it's nVidia. Intel GPUs have been in 3rd place for a long time, but their integrated graphics are widely available and come in 2nd place because nVidia can't compete in the x86 space. Intel graphics have been closing the gap with AMD and are now within, what, a factor of 2 or less (1.5?).
IMHO we will soon see more small/quiet PCs without a slot for a graphics card, relying on integrated graphics. nVidia has no place in that future. But now, by dropping $5B on Intel they can get into some of these SoCs and not become irrelevant.
The nice thing for Intel is that they might be able to claim graphics superiority in SoC land since they are currently lagging in CPU.
Way back in the mid-to-late 2000s, Intel CPUs could be used with third-party chipsets not manufactured by Intel. This had been going on forever, but the space was particularly wild, with Nvidia being the most popular chipset manufacturer for AMD and also making inroads for Intel CPUs. It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, Nvidia promptly bought the company and spun down operations.
This was all for naught, as AMD purchased ATi, shutting out all other chipsets, and Intel did the same. Things actually looked pretty grim for Nvidia at this point in time. AMD was making moves that suggested APUs were the future, and Intel started releasing platforms with very little PCIe connectivity, prompting Nvidia to build things like the Ion platform that could operate over an anemic PCIe 1x link. These really were the beginnings of strategic moves to lock Nvidia out of their own market.
Fortunately, Nvidia won a lawsuit against Intel that required Intel to provide PCIe x16 connectivity on their main platforms for 10 years or so, and AMD put out non-competitive offerings in the CPU space, so the APU takeoff never happened. If Intel had actually developed their integrated GPUs or won that lawsuit, or if AMD had actually executed, Nvidia might well be an also-ran right around now.
To their credit, Nvidia really took advantage of their competitors' inability to press their huge strategic advantage during that time. I think we're in a different landscape at the moment. Neither AMD nor Intel can afford to boot Nvidia, since consumers would likely abandon them for whoever could still slot in an Nvidia card. High-performance graphics is the domain of add-in boards now and will be for a while. Process node shrinks aren't as easy, and cooling solutions are getting crazy.
But Nvidia has been shut out of the new handheld market and hasn't been a good total package for consoles, as SoCs rule the day in both of those spaces, so I'm not super surprised at the desire for this pairing. But I did think Nvidia had given up these ambitions and was planning to build an adjacent ARM-based platform as a potential escape hatch.
> It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, Nvidia promptly bought the company and spun down operations.
This feels like a 'brand new sentence' to me because I've never met an ALi chipset that I liked. Every one I ever used had some shitty quirk that made VIA or SiS somehow more palatable [0] [1].
> Intel started releasing platforms with very little PCIe connectivity,
This is also a semi-weird statement to me, in that it was nothing new; Intel already had an established history of chipsets like the i810, 845GV, 865GV, etc which all lacked AGP. [2]
[0] - Aladdin V with its AGP instabilities; MAGiK 1 with its poor handling of more than 2 or 3 'rows' of DDR (i.e. two double-sided sticks of DDR turned it into a shitshow no matter what you did to timings; 3 was usually 'ok-ish' and 2 was stable).
[1] - The SiS 730 and 735 were great chipsets for the money and TBH the closest to the AMD 760 for stability.
[2] - If I had a dollar for every time I got to break the news to someone that there was no real way to put a Geforce or 'Radon' [3] in their eMachine, I could have had a then-decent down payment for a car.
[3] - Although, in an odd sort of foreshadowing, most people who called it a 'Radon', would specifically call it an AMD Radon... and now here we are. Oddly prescient.
ALi was indeed pretty much on the avoid list for me for most of their history. It was only when they came out with the ULi M1695, made famous by the ASRock 939Dual-SATA2, that they were a contender for best out of nowhere. One of the coolest boards I ever owned, and it was rock solid for me even with all of the weird configs I ran on it. I kind of wish I hadn't sold it, even today!
I remember a lot of disappointed people on forums who couldn't upgrade their cheap PCs as well, but there were still motherboards available with AGP to slot Intel's best products into. Intel couldn't just remove it from the landscape altogether (assuming they wanted to) because they weren't the only company making Intel-supporting chipsets. IIRC Intel/AMD/Nvidia were not interested in making AGP+PCIe chipsets at all, but VIA/ALi and maybe SiS made them instead, because it was still a free-for-all space. Once that went away, Nvidia couldn't control their own destiny.
nvidia does build SoCs already, the AGXs and other offerings. I'm curious why they want Intel despite having the technical capability to build SoCs.
I realize the AGX is more of a low-power solution, and it's possible that nvidia is still technically limited when building SoCs, but this is just speculation.
Does anybody know the actual ground-truth reasoning for why nvidia is buying into Intel despite the fact that nvidia can make its own SoCs?
Sometimes HN users appear to have absolutely zero sense of scale. Lifetime sales numbers of those are like the equivalent of hours to days of Switch 2 sales.
I think it is bad news for the GPU market (AMD has had a beachhead with their integrated solution here as they've lost out elsewhere) but good for x86 which I've worried would be greatly diminished as Intel became less competitive.
That was targeted at supporting more tightly integrated and performant MacBooks. It flopped because Apple came up with the M1, not because it was bad per se.
Remember when Microsoft invested in Apple when Apple was down in the dumps? This is giving similar vibes. That deal was arguably what saved Apple near its nadir. I’m not a fan of Intel’s past monopolistic practices, but for the sake of sustaining competition in the CPU/GPU market, I hope this deal works out for them even half as well as the MS deal did for Apple.
>Remember when Microsoft invested in Apple when Apple was down in the dumps? This is giving similar vibes.
Doesn't feel the same because the 1997 investment was arranged by Apple co-founder Steve Jobs. He had a long personal relationship with Bill Gates, so he could just call him to drop the outstanding lawsuits and get a commitment for future Office versions on the Mac. Basically, Steve Jobs, at the relatively young age of 42, was back at Apple in "founder mode" and made bold moves that the prior CEO, Gil Amelio, couldn't do.
Intel doesn't have the same type of leadership. Their new CEO is a career finance/investor instead of a "new products new innovation" type of leader. This $5 billion investment feels more like the result of back-channel discussions with the US government where they "politely" ask NVIDIA to help out Intel in exchange for less restrictions selling chips to China.
> This $5 billion investment feels more like the result of back-channel discussions with the US government where they "politely" ask NVIDIA to help out Intel in exchange for less restrictions selling chips to China.
This style of classical fascism or economic fascism (or whatever the term is that differentiates it from the modern, unrelated usage of "fascism") being used in the US is a bit unnerving, and it's crazy that it usually comes from the Republican party, who claim to espouse free markets.
It also happened under G. W. Bush with banks and auto manufacturers, but the worst offense was under Nixon with his nationalization of passenger rail.
At least with the bank and car manufacturer bailouts the government eventually sold off their stocks, and with the Intel investment the government has non-voting shares, but the government completely controls the National Railroad Passenger Corporation, (the NRPC aka Amtrak) with the board members being appointed by the president of the United States.
We lost 20 independent railroads overnight, and created a conglomerate that can barely function.
That's how post-WW2 France was actually rebuilt. You could also see big hints of that in the US WW2 economic effort, which couldn't have been done without the Government taking a direct hold of things and instituting central-ish planning.
You're speaking of what is referred to as neo-corporatism [0] and it's a tripartite, democratic process, not the fascist sort where everything is within and for the benefit of the state [1].
There was not that much democracy in the French post-WW2 technocratic establishment, but I agree that they were not technically fascist (nor otherwise).
There's a big difference between the government allocating taxpayer dollars by passing a bill and a president using their influence to force dealings between corporate entities that benefit the ruling party.
The parent comment is speculation. But yes, speculatively, a legislative act of investment would be less authoritarian than the whims of an executive that puts tariffs on your product constantly unless you do what he says.
Is the method by which it’s communicated what gives you negative feelings? Because this is an approach to handling the labor dumping that’s been allowed in nearly every industry since the 1980s, and it’s been used numerous times in the US and abroad. They typically only offer temporary relief, while domestic industries should be adjusting and better trade deals get negotiated. The last I checked, that’s been happening to some degree… but it also probably needs to be supported by the ability for companies to borrow money, which the Fed (until recently) seemed hell bent on preventing, while we continued to watch the job market burn to the ground. So cash flush businesses investing in each other to keep competition alive seems like a positive here. Maybe that’s just me?
Most regulation is effectively coercion. The difference is regulation isn’t easily rolled back, whereas the current approach to modifying behavior is (as we’ve seen, numerous times in the last few months even). One is more tolerant of failure than the other.
There is an extreme where policy cannot be modified, and there is an extreme where the whims of one person, and the precedent of having the US government defined as the whims and whiplashes of one person, is immensely harmful to our national credibility. It fucks with investment, immigration and education.
I don't think that's an apt comparison, given that Microsoft and Apple were more direct competitors than Intel and Nvidia; the latter have a more symbiotic relationship. I think the rationale is closer to the competitor of my competitor is my friend -- they face two threats by AMD growing larger in the CPU market:
- a bigger R&D budget for their main competitor in the GPU market
- since Nvidia doesn't have their own CPUs, they risk becoming more dependent on their main competitor for total system performance.
Required in that Nvidia would like to sell them to you. But customers seem to be hesitant and prefer x86-based DGX and similar systems. At least from what I've heard and seen.
This is a big ask for a shrinking market - with the pressure the Chinese government is putting on its domestic companies not to buy H20s, I'm not sure how big this is going to be going forward. $5 billion (plus whatever it costs to build these products) is a lot for a market that is probably going to be closed soon.
That Microsoft-Apple deal was part lifeline, part strategic insurance. Intel clearly needs a win, and Nvidia needs more control over its ecosystem without being chained to TSMC forever.
> Remember when Microsoft invested in Apple when Apple was down in the dumps?
Had Apple failed, Microsoft would probably have been found to have a clear monopolistic position. And Microsoft was already in hot water due to Internet Explorer, IIRC.
> Nvidia will also have Intel build custom x86 data center CPUs for its AI products for hyperscale and enterprise customers.
Hell has frozen over at Intel. Actually listening to people who want to buy your stuff, whatever next? Presumably someone over there doesn't want the AI wave to turn into a repeat of their famous success with mobile.
In the event Intel ever does get US-based fabrication semi-competitive again (and the national security motivation for doing so is intense), nVidia will likely have to be a major customer, so this does make sense. I remain doubtful that Intel can pull it off, though; it may have to come from someone else.
If you were a big enough customer, you could get your own SKU too. E.g. hyperscalers have Xeons which are not available to any other customers at any price.
But what they've completely resisted so far is any non trivial modification.
They turned down Acorn about the 286, which led to Acorn creating the ARM; they have turned down various console makers; they turned down Apple on the iPhone; and so on. In all cases they thought the opportunities were beneath them.
Intel has always been too much about what they want to sell you, not what you need. That worked for them while the two aligned over backwards compatibility.
Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
Intel’s test for new business ideas has always been: will it make $1B in the first year?
It leads to mistakes like you mention, where a new market segment or new entrant is not a sure thing. And then it leads to mistakes like Larrabee and Optane where they talk themselves into overconfidence (“obviously this is a great product, we wouldn’t be doing it if it wasn’t guaranteed to make $1B in the first year”).
It is very hard to grow a business with zero risk appetite. You can’t take risky high return bets, and you can’t acknowledge the real risk in “safe” bets.
Larrabee could have grown into something very cool if they had not dropped it, and had instead made it available on the open market, donated it to universities, and so on. Transputer vibes.
I think Larrabee was Intel experimenting to find other markets for their Atom cores, and if there was a market for it, they needed the tenacity to cultivate it, similar to how Nvidia took huge amounts of time establishing GPGPU, CUDA, then machine learning, through to reaping the rewards over the past few years.
2010-2011 was also the time that AMD was starting to moan a bit about DX11 and the higher-level APIs not being sufficient to get the most out of GPUs, which led to Mantle/Vulkan/DX12 a few years down the road. Intel did a bit regarding massively parallel software rendering, with the flexibility to run on anything x86 and implement features as you liked; there were also AMD's efforts toward 'fusion' (APU+GPU, after recently acquiring ATi) and HSA, which I seem to recall was about dispatching different types of computing to the best-suited processor(s) in the system. However, I got the impression a lot of development effort was more interested in progressing on what they already had instead of starting in a new direction, and game studios want to ship finished and stable/predictable products, which is where support from Intel would have helped.
If Intel had a server SKU with fully integrated, competitive performance GPU cores that work with CUDA + unified memory, they’d sell billions worth in a day to the CSPs alone.
Sounds like they will someday soon.
There will always be giant, faraway GPU supercomputer clusters to train models. But the future of inference (where the model fits) is local to the CPU.
Console makers only get trivial modifications. ASRock sold a cryptocurrency miner, the BC-250, with the PS5 APU, and it works just like any of their other APUs, albeit with limited driver support.
The BC250 does not use a PS5 APU, it uses another APU which has the same CPU core. By that measure the Cell in the PS3 and the Xenon of the XBox 360 were the same, or any AMD Jaguar device is a PS4.
This relates to the Intel problem because they see the world the way you just described, and completely failed to grasp the importance of SoC development where you are suddenly free to consider the world without the preexisting buses and peripherals of the PC universe and to imagine something better. CPU cores are a means to an end, and represent an ever shrinking part of modern systems.
There's almost no chance it isn't using rejected PS5 APU dies. It has fused off two of the eight CPU cores, as well as 12 of the 36 GPU compute units, but otherwise has the exact same specifications. The one customization Sony did get, the use of GDDR6 RAM, is still present. It also exhibits the same very short-lived mix of Zen 2 with RDNA 2 and has the same die size and aspect ratio.
The problem is, console manufacturers know precisely how much of their product they anticipate selling, and it's usually a lot. The PlayStation 5 is at 80 million units so far.
And at that scale, the console manufacturers want to squeeze every vendor as hard as they can... and Intel didn't see the need to engage in a bidding war with AMD that would have given them a sizable revenue but very little profit margin compared to selling Xeon CPUs to hyperscalers where Intel has much more leverage to command higher prices and thus higher margins.
> they turned down Apple on the iPhone
Intel just was (and frankly, still is) unable to compete with ARM on the power envelope; that's why you never saw x86 take off on Android either, despite quite a few attempts at it.
Apple only chose to go with Intel for its MacBook line because PowerPC was practically dead and offered no way to extract more performance, and they dropped Intel as soon as their own CPUs were competitive. Getting Intel CPUs to the same level of power efficiency that M-series CPUs have would require a full rework of the entire CPU infrastructure and external stack, which would require money that even Intel at its best frankly did not have. And getting x86 power-efficient enough for a phone? Just forget it.
> Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
Actually, that is surprising to me as well. NVIDIA's Tegra should easily be powerful enough to run the OS for training or inference workloads. If I were to guess, NVIDIA wants to avoid getting caught too hard on the "selling AI shovels" train.
> And at that scale, the console manufacturers want to squeeze every vendor as hard as they can... and Intel didn't see the need to engage in a bidding war with AMD that would have given them a sizable revenue but very little profit margin compared to selling Xeon CPUs to hyperscalers where Intel has much more leverage to command higher prices and thus higher margins.
And so that gave AMD an opening, and with that opening they got to experiment with designs, tailor a product, and gain experience and industrial market share, and they were able to continue to offer more and better products. Intel didn't just miss a mediocre business opportunity; they missed out on becoming a trusted partner for multiple generations, and they handed AMD market share that AMD used to become a better competitor.
> and they handed AMD market share that AMD used to become a better competitor
AMD isn't precisely a market competitor. The server and business compute market is still firmly Intel and there isn't much evidence of that changing unless Apple drops M-series SoCs onto the wide-open market, which Apple won't do. Intel could probably release a raging dumpster fire and still go strong; oh wait, that's what they've been doing the last few years.
AMD is only a competitor in the lower end of the market, a market Intel has zero issue handing to AMD outright - partially because a viable AMD keeps the antitrust enforcers from breathing down their neck, but more because it drags down per-unit profit margins to engage in consoles and the lower rungs and niches.
> The server and business compute market is still firmly Intel and there isn't much evidence of that changing
This is not true anymore; it IS changing, and very rapidly. AMD has shot up to 27.3% of server market share, which they haven't had since the Opteron days 20 years ago. Five years ago their server market share was in the very low single digits. They're half of desktops, too. https://www.pcguide.com/news/no-amd-and-intel-arent-50-50-in...
Apple did not want their x86 chips; they wanted their XScale stuff. Apple went to Intel to get chips; the power envelope was appealing to Apple. Intel was the one to say no.
Right. But of course, Intel was busy spinning off their XScale business to Marvell. If they had seriously invested in it, they could have owned the coming mobile revolution.
They did push their UMPC x86 SoCs (Poulsbo and derivatives) hard to Sony, Nokia, etc. These were never competitive on heat or battery life.
Estimates are at 1M Xeons a month [1], so more PS5 units (and thus CPUs) were sold to a single customer in the same timeframe than Xeon CPUs across all customers.
NVDA sold 153 million Tegra units to Nintendo over 8 years, so roughly 1.5M units a month. That's a comparable scale.
Spot on about the AI/mobile parallel. Intel sat out the smartphone wave while pretending it didn't matter, and now they're scrambling not to miss the AI train.
It was Intel culture at one time. When I started, everyone got a card to wear with their badge listing the Intel values; there were only six, and "customer orientation" was one of them. It definitely influenced my personal development, but it was clearly not adopted equally across the company.
Intel is a strategically important company for the United States. This smells like a token investment to appease the US government. Not saying it’s bad, but very much looks like that.
Why would any of these three be interested in buying a fab? The only other large player with its own fabs is Samsung, and Samsung has the same problem Intel has, namely a fab that is nowhere near close to TSMC.
I agree that Intel would be better served by spinning off its fab division; a potential buyer could be the US government, for military and national-security-relevant projects.
Someone could be interested. It could also be GlobalFoundries. A high-risk, big-reward bet, and the government is willing to help mitigate some of the risk with funding.
Not an expert in the area, but I think the highest of the high-end chips is a big market, yet not the biggest market by revenue for fabs. It is just the most profitable part of the market.
Maybe this changed with the AI race but there are plenty of people buying older chips by the millions for all sorts of products.
The key to getting (financial) value out of fabs is their time after they are overtaken by the next node. The ability to keep the order book full after you have a better node is what pays off the fab. So it's all the other chips, the chips for cars, for low-power internet-connected devices, etc., that make the fab profitable. That is where TSMC's ability to work with different customers enables them to extract value from a fab in a way that pure-play CPU makers struggle with.
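To make that concrete, here's a toy payback model. Every number in it is invented purely for illustration (the fab cost, the yearly revenue figures, the three-year leading-edge window); the point is just that payback tends to land well after the node has been overtaken:

```python
# Toy fab-payback model. All figures are invented for illustration only.
FAB_COST = 20_000          # $M to build the fab
LEADING_EDGE_YEARS = 3     # years before the next node overtakes it

def payback_year():
    total = 0
    for year in range(1, 16):
        # Premium pricing while leading-edge; lower-margin trailing-edge
        # orders (cars, IoT, etc.) afterwards.
        total += 4_000 if year <= LEADING_EDGE_YEARS else 2_500
        if total >= FAB_COST:
            return year, total
    return None, total

year, total = payback_year()
print(f"Fab cost recovered in year {year} (cumulative ${total:,}M)")
# With these numbers the fab only pays for itself in year 7, i.e. years
# after the node was overtaken. That tail of trailing-edge orders is
# exactly what pays off the fab.
```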
Being fabless is a huge strategic advantage for chip designers. Intel's biggest problem has been that they're stuck on shitty fabs. Nvidia, AMD, and Qualcomm do not want to be in that position.
I wonder what this means for Intel's Arc lineup. It would be a bit crazy to have privileged access to a competitor's roadmap through just owning a chunk of them. I also have to admit I really hope they don't cancel them. A triopoly is at least better than a duopoly (or realistically, a monopoly, as AMD's competitiveness in GPUs is pretty questionable).
It probably kills any prospect of Intel releasing a market disrupter card that many were calling for - a 64GB or 92GB card with even middling performance for under $1k.
It's pretty clear AMD and Nvidia are gatekeeping memory so they can iterate over time and protect their datacenter cards.
That's what I think of, along with favour from their new investment sibling, the US government. AMD doesn't want to be super competitive; they like their margins and being the second choice in a hypetastic market. Even though Arc has very low adoption, it was showing signs of doing scrappy things, like enabling two 24GB GPUs on one card from third-party vendors, which got the hobby/upstart community pretty excited. Ultimately it's not a real market giving people what they want via competition; it's all contrived by politics and the biggest players.
Which is arguably kind of weird because where is it actually competing with NVIDIA? A hypothetical future, I guess?
But also, does this amount of ownership even give them the ability to kill anything on Intel's roadmap without broad shareholder consensus (not that that's even how roadmaps are handled anyway)?
Nvidia sees the forest for the trees. The consequence of the US government buying a stake in Intel is that there will be Federal requirements for US companies to use Intel. This is entirely about the foundry business. Nvidia is at risk when 100% of the production of its intellectual property occurs in Taiwan. They're more interested than anyone else in diversifying their foundry options. Intel has just been a terrible partner and totally disregards its customers. It's only because of the new strategic need for the US to have a foundry business that the government is saving Intel. NVIDIA is understandably supportive of this.
Might rather see it the other way around - Nvidia getting license to create products with x86(_64) CPUs integrated in the silicon. Nvidia are the big boy in this transaction and they'll get what they want out of it. But I can see the attraction for Intel.
I don't think they can, as AFAIK the agreement for x86_64 is that Intel and AMD cannot change hands. AMD will surely fight this tooth and nail in the courts
But with the state of the courts today... who knows..
Yes indeed. It's still a step in that direction that opens up a bunch of communication channels between the execs of the two companies. Things move slowly.
Intel was well on its way to becoming a considerable threat to NVIDIA with its Arc line of GPUs, which are getting better and cheaper with each generation. Perhaps not in the enterprise and AI markets yet, but certainly on the consumer side.
This news muddies that approach, and I see it as a misstep both for Intel and for consumers. Intel is only helping NVIDIA, which puts Intel further away from unseating them than before.
Competition is always a net positive for consumers, while mergers are always a net negative. This news will only benefit shareholders of both companies, and Intel shareholders only in the short-term. In the long-term, it's making NVIDIA more powerful.
I'm not convinced. The latest Battlemage benchmarks I've seen put the B580 at the same performance as the RTX 4060 (a two-year-old entry-level card) but with roughly 50% more power consumption (80W vs 125W average). It's good to have more than one graphics vendor that supports open source, but I don't think Nvidia is losing any sleep over Intel's GPU offerings.
Battlemage has the best perf/$, and most of the driver issues from Alchemist have been ironed out. Another generation or two of steady progress and Intel will have a big winner on their hands.
Intel's foundry costs are probably competitive with nvidia too - nvidia has too much opportunity cost if nothing else.
This is very short-sighted. The cards are improving, which can't really be said about AMD, the only other potential threat to Nvidia. It's also well known that Nvidia purposefully handicaps their consumer cards to avoid cannibalizing their enterprise cards. That means the consumer market at least is not as efficient/optimal as it could be, so a competitor actually trying to compete (unlike AMD, apparently) should be able to do that without even having to out-innovate Nvidia or anything like that. Just get close on compute performance, but offer more VRAM or cheaper multi-GPU setups.
Nah, nobody cares about that. Even in their heyday, SLI and CrossFire barely made sense technologically. That market is basically non-existent. There's more people now wanting to run multiple GPUs for inference than there ever were who were interested in SLI, and those people can mix and match GPUs as they like.
The B580 was released in December 2024, and the 4060 in May 2023. So not quite a two year difference.
While it doesn't quite compete at performance and power consumption, it does at price/performance and overall value. It is a $250 card, compared to the $300 of the 4060 at launch. You can still get it at that price, if there's stock, while the 4060 hovers around $400 now. It's also a 12GB card vs the 8GB of the 4060.
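Putting rough numbers on that value argument, using the prices quoted above and assuming (purely for the sake of the sketch) that the two cards deliver roughly equal raster performance:

```python
# Perf-per-dollar and VRAM-per-dollar from the prices quoted above.
# Performance parity between the two cards is an assumption, not a benchmark.
cards = {
    "Arc B580": {"price": 250, "rel_perf": 1.0, "vram_gb": 12},
    "RTX 4060": {"price": 400, "rel_perf": 1.0, "vram_gb": 8},  # street price
}

for name, c in cards.items():
    print(f"{name}: {100 * c['rel_perf'] / c['price']:.2f} perf/$100, "
          f"{100 * c['vram_gb'] / c['price']:.1f} GB VRAM/$100")

# Arc B580: 0.40 perf/$100, 4.8 GB VRAM/$100
# RTX 4060: 0.25 perf/$100, 2.0 GB VRAM/$100
# That is roughly 1.6x the perf/$ and 2.4x the VRAM/$ at current prices.
```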
So, sure, this is not competitive in the high-end segment, but it's remarkable what they've accomplished in just a few years, compared to the decades of head start that AMD and NVIDIA have on them. It's definitely not far-fetched to assume the gap will only continue to close.
Besides, Intel is not only competing in GPUs, but in APUs and CPUs. Their APU products are more performant and efficient than AMD's (e.g. the 140V vs the 890M).
Nvidia's margins are over 80% for datacenter products. If Intel can produce chips with enough VRAM and performance on par with Nvidia from 2 years ago at 30% margins, they'd steal a lot of business, if they can figure out the CUDA side of things.
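The cost-plus arithmetic behind that, with a hypothetical unit cost (the 80% and 30% margins are from the comment above; the $5,000 cost is made up):

```python
# Price implied by a gross margin: price = cost / (1 - margin).
def price_at_margin(unit_cost: float, margin: float) -> float:
    return unit_cost / (1 - margin)

COST = 5_000  # hypothetical $ unit cost of a datacenter accelerator

print(f"At 80% margin: ${price_at_margin(COST, 0.80):,.0f}")  # $25,000
print(f"At 30% margin: ${price_at_margin(COST, 0.30):,.0f}")  # ~$7,143
# A 30%-margin challenger can undercut an 80%-margin incumbent by ~3.5x
# at the same unit cost; the hard part, as noted, is the CUDA moat.
```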
I'm sure Larrabee will be superb any year now. The Xeon Phi will rise again. For supporting evidence, the success of Aurora. Weren't the loss-leading Arc GPUs cancelled as well? Maybe that was only one generation of them; it does look like some are on the market now.
I think this partnership will damage nvidia. It might damage intel, but given they're circling the drain already, it's hard to make matters worse.
It's probably bad for consumers in every dimension.
Or to take the opposite view: if Nvidia rolled over Intel, fired essentially everyone in the management chain, and started trying to run the fabs themselves, there's a good chance they'd turn the ship around and become even more powerful than they already are.
> It might damage intel, but given they're circling the drain already, it's hard to make matters worse.
How was Intel "circling the drain"?
They have a very competitive offering of CPUs, APUs, and GPUs, and the upcoming Panther Lake and Nova Lake architectures are very promising. Their products compete with AMD, NVIDIA, and ARM SoCs from the likes of Apple.
Intel may have been in a rut years ago, but they've recovered incredibly well.
This is why I'm puzzled by this decision, and as a consumer, I would rather use a fully Intel system than some bastardized version that also involves NVIDIA. We've seen how well that works with Optimus.
None of their products are competitive; they fired the CEO who was meant to save them, laid off tens of thousands of their engineers, sold off massive chunks of the company, and they're still bleeding money and begging for state support.
Also, their network cards no longer work properly, which is deeply aggravating, as that used to be something I could rely on. I just bought some Realtek ones to work around the Intel ones falling over.
Intel's 140V competes with and often outperforms AMD's 890M, at around half the power consumption.[1]
Intel's B580 competes with AMD's RX 7600 and NVIDIA's RTX 4060, at a fraction of the price of the 4060.[2]
They're not doing so well with desktop and laptop CPUs, although their Lunar Lake and Arrow Lake CPUs are still decent performers within their segments. The upcoming Panther Lake architecture is promising to improve this.
If these are not the signs of competitive products, and of a company far from "circling the drain", then I don't know what is.
FWIW, I'm not familiar with the health of their business, and what it takes to produce these products. But from a consumer's standpoint, Intel hasn't been this strong since... the early 00s?
No way, man. Peak consumer Intel was from Core 2 up to Skylake-ish. That was when they started coasting and handed the market to AMD. Right now they're losing market share to them on mobile, desktop, and server. If we ignore servers, most PCs have an AMD CPU inside.
The GPUs might be competitive on price, but that's about it. It's pretty much a hardware open beta.
When your own most competitive products are being made by your competitor for you, while you still have the cost center of running your own production fabs incapable of producing your most competitive products, and receiving bailouts just to keep the lights on...
Mergers where one company is on the verge of failing can be a net positive for consumers. Most obviously, this happens when banks fail: people's bank cards still work, and at least initially the branches stay open.
Intel isn’t at that point, but the companies trajectory isn’t looking good. I’d happily sacrifice ARC to keep a duopoly in CPU’s.
AMD has always followed NVIDIA closely in crippling their cheap GPUs for non-gaming applications.
After many years of continuously decreasing FP64 performance in "consumer" GPUs, only Intel's Battlemage GPUs have offered FP64 performance comparable with what could easily be obtained 10 years ago, but no longer today.
Therefore, if the Intel GPUs disappear, then the choices in GPUs will certainly become much more restricted than today. AMD has almost never attempted to compete with NVIDIA in features, but whenever NVIDIA dropped some feature, so did AMD.
The only consumer GPUs ten years ago that offered decent FP64 performance were the GTX TITAN series. And they were beasts! It's a shame nothing quite like them exists anymore. But they were the highest of high-end cards, certainly not that common or cheap.
It's absolutely not; the ARC line is not a threat in any way to nVidia. This is about getting its feet into the CPU market without the initial setup costs and research it would take to start from scratch.
They will be dominating AMD now on both fronts if things go smoothly for them.
This really wasn't a surprise; nVidia has seemed to be itching for a meaningful entry into the CPU market, and when intel's CEO started undoing any and all future investment in the company, it was clear everything was being set up for a sell-off.
$5 billion is just a start, but this is a gift for nVidia to eventually acquire intel.
I think if Nvidia wanted to acquire Intel, they would acquire Intel.
Intel has never been so cheap relative to the kinds of IP assets that Nvidia values and probably will not be ever again if this and other investments keep it afloat.
Trump's FTC would not block it.
You write with proper case-sensitivity for their titles, which suggests some historic knowledge of the two. They have been very close partners on CPU+GPU for decades. This investment is not fundamentally changing that.
The current CEO is more like a CFO: cutting costs and eliminating waste. There are two exits from that: a sell-off, as you say, or re-investment in the products most likely to bring future profit. This could be a signal that the latter is the plan and that the competitive aspects of the nVidia-intel partnership will be sidelined for a while.
I'm guessing NVidia didn't do this by choice. Propping up Intel doesn't seem in their best interests, nor does it do their shareholders any favors by diluting their rapid growth.
There's some case for self-interest, propping up another fab etc., but I do wonder how much of it is USG. (The economic case for Intel integrating Nvidia silicon on-chip doesn't make too much sense to me: there's no growth potential in commodity/consumer x86, and maybe they can shove their new integrated Nvidia in front of enterprise buyers, but I'd be dubious re: ROI.)
> I'm guessing NVidia didn't do this by choice. Propping up Intel doesn't seem in their best interests
In a top-down oligarchy, their best interests are served by focusing on the desires of the great leader, in contrast to a competitive bottom-up market economy, where they would focus on the desires of customers and shareholders.
There's a lot of concern in the comments here about what this means for Arc. The size of this investment, while large, isn't enough to warrant jeopardizing Arc, though. Intel has a responsibility to all shareholders, and diminishing Arc would be a bad move for overall shareholder value.
If Nvidia did try to exert any pressure to scrap Arc, that would be both a huge financial and geopolitical scandal. It's in the best interest of the US to support not only Intel's local manufacturing but also its GPU tech.
Called it. I knew Nvidia had nowhere left to go, with that insanely high valuation, other than to start buying competitors and adjacent companies. I don't think this is the end, either.
> It is unclear if Intel will issue new stock for Nvidia to purchase
Erm, that's a rather important point to bury so far down the story. The first question on anyone's lips will be: is this $5bn to build new chip technology, or $5bn for employees to spend on yachts?
It’s the most important part of the story. It’s so gross that companies can just dilute and create stock out of thin air like this. Why hold stock in Intel if the only people that ever buy the real stock and create buy pressure are the plebs? Here is the previous time…
> Intel stock experienced dilution because the U.S. government converted CHIPS Act grants into an equity stake, acquiring a significant ownership percentage at a discounted price, which increased the total number of outstanding shares and reduced existing shareholders' ownership percentage, according to The Motley Fool and Investing.com. This led to roughly 11% dilution for existing shareholders
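For anyone who wants the mechanics, the dilution arithmetic is just a ratio. The share counts below are made up for illustration; only the ~11% figure comes from the quote above:

```python
# Dilution from a new share issuance: your shares are unchanged, but the
# total grows, so your percentage shrinks. Counts are illustrative only.
existing_shares = 4_000   # millions (made up)
new_shares = 494          # millions issued in the deal (made up)

dilution = new_shares / (existing_shares + new_shares)
print(f"Existing holders diluted by {dilution:.1%}")   # ~11.0%

# A holder with 40M shares (1% before the issuance):
print(f"{40 / existing_shares:.2%} -> {40 / (existing_shares + new_shares):.2%}")
# 1.00% -> 0.89%
```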
> It’s so gross that companies can just dilute and create stock out of thin air like this.
To get money from the outside, you either have to take on debt or you have to give someone a share in the business. In this case, the board of directors concluded the latter is better. I don't understand why you think it is gross.
To get a share in the business, you can also just buy stock in the business like everyone else, not increasing the total share count or causing dilution. They chose not to do this because it would have been more expensive due to properly compensating existing shareholders. So it's spiritually just theft.
> To get a share in the business, you can also just buy stock
The business is looking for additional capital. You can only do that by either selling new shares or raising debt.
> in the business like everyone else, not increasing the total share count or causing dilution. They chose not to do this because it would have been more expensive due to properly compensating existing shareholders. So it's spiritually just theft.
Shareholder dilution isn't inherently theft. Specific circumstances, motivations, and terms of issuance have a bearing on whether the dilution is harmful or whether it is necessary for the business.
For instance, it can be harmful if: minority shareholders are oppressed, shares are issued at a deeply discounted price with no legitimate business need or to benefit insiders at the expense of other shareholders, or the raised capital isn't used effectively to grow the company.
Dilution can be beneficial, such as when the raised capital is used for growth, employee compensation via employee stock options, etc.
nVidia has also been licensing their GPU IP to MediaTek recently, who are now working on a 2nd generation of an SoC that combines their ARM cores with nVidia GPUs, catering to e.g. the automotive market.
Looks like using GPU IP to take over other brands' product lines is now officially an nVidia strategy.
I guess the obvious worry here is whether Intel will continue development of their own dGPUs, which have a lovely open driver stack.
I'd agree, but Intel has also halted dGPU development efforts before, cf. the canned Larrabee project, which was more troubled on the technology side, however.
Seems Nvidia needs an alternative to MediaTek or wants to pressure MediaTek given the announcement of x86 Intel/Nvidia SoCs and the delay of DGX Spark, GB10 and N1X.
They wanted to launch DGX Spark in early summer and it's nowhere to be seen, while Strix Halo is shipping in 30+ SKUs from all major manufacturers.
Intel should never have existed in the first place. We should have gone the RISC route in the 80s and 90s and it took 30 years for the world to realize that with ARM. It’s like we’re continuously resuscitating a zombie to terrorize us, instead of just letting it die.
Except that Nvidia, Inc. doesn't own any of that NVDA stock, other people do, and they cannot access that money. Nvidia, Inc. owns its net profit, which is orders of magnitude less than market cap. Last year's net profit was just under $73 billion. ($5 billion is still very affordable, to be sure).
> Except that Nvidia, Inc. doesn't own any of that NVDA stock, other people do, and they cannot access that money. Nvidia, Inc. owns its net profit, which is orders of magnitude less than market cap. Last year's net profit was just under $73 billion. ($5 billion is still very affordable, to be sure).
Not all deals are made in cash; they can borrow money against their market cap.
I think you may have missed AMC and TSLA; for quite a while their best-selling products and biggest revenue drivers were their stock. Reflexivity is a thing; NVDA could issue $5, $10, or $20B and I don't think the price would move very much in this market. (Note that could change tomorrow.)
I'm mixed on this, only because when they've done similar hybrid chips with AMD GPUs in the past the support has been poor and dropped off rather quickly.
A weird kind of full-circle moment: Intel used to laugh off Nvidia, then tried Kaby Lake-G with AMD (RIP), and now they're handing over CPU real estate to the company that wiped the floor with their own GPU efforts
People have been talking about the concentration of risk in popular indices, and now large caps are buying stakes in each other? Intel's stock is up 27% today.
The difference is, AMD wasn't a competitor to ATi. One mostly built CPUs, while the other built GPUs.
These two, on the other hand, are competing in several major product categories. Overall, not a good look.
It feels like the end is in sight for dedicated graphics chips in consumer devices. Phones, consoles, and now Apple silicon are proving that SoC designs with unified memory and focused thermals are a winning strategy for efficiency and speed. Nvidia may be happy enough to move the graphics strategy onto an SoC and keep discrete boards just for AI.
You mean AMD's unified architecture. They were a founder of the HSA Foundation that drove innovation in this space complete with Linux kernel investments and unified compute SDKs, and they had the first shipping hardware support.
AMD's actual commitment to open innovation over the past ~20 years has been game changing in a lot of segments. It is the aspect of AMD that makes it so much more appealing than intel from a hacker/consumer perspective.
No way this doesn't get blocked by antitrust. This will make them way too large, and Intel is already selling itself off in pieces (the US govt bought $10B worth a couple weeks ago).
Antitrust action may cease to exist soon, especially once Vance is POTUS and the courts are stacked with Peter Thiel loyalists who back his vision of anti-competition. Bet on it.
SemiAccurate reported several months ago that NVidia had been dipping its toes into manufacturing its products using Intel's fabs; I'd assume that's related.
I'm very pessimistic about this. Goodbye to those nice, budget-friendly Intel GPUs. nGreedia is going to continue selling 8 gig cards to consumers forever.
Smart move for Nvidia, as AMD is the true competitor; continuing to use AMD's CPUs would just help build up a competitor fast. This also helps Intel figure out its foundry business, which might work someday, and that benefits Nvidia too, since right now its only choice is TSMC.
This has been an interesting 1.5 months for Intel on all fronts. I wonder how long this deal was in the making, since the timing is impeccable, looking at the current administration's involvement with Intel.
> Nvidia announced that it will buy $5 billion in Intel common stock at $23.28 per share, representing a roughly 5% ownership stake in Intel. (Intel stock is now up 33% in premarket trading.)
Why/how is INTC up around 30% in premarket, from $24.90 to $32, when Nvidia is buying the stock at $23.28? Who is selling the stock?
I suppose the Intel board decided this? Why did they sell under the current market price? Didn't the Intel board have a fiduciary duty to get as good a price from Nvidia as possible? If Nvidia buying stock moves it up so much, it seems like a bad deal to sell the stock for so little.
It's typical in these situations that the price per share is negotiated, with the current SP as a starting point. It's fairly unusual, I think, for the company selling stock to get a price significantly higher than market price. It's more typical that there's a slight discount. At least that's been the case for every stock I owned where dilution has occurred. We also don't know yet when exactly this deal was negotiated and approved, so it's hard to actually say. Considering where INTC has been very recently (below $20), $23.28 seems very reasonable to me.
The reason the stock surged up past $30 is the general market's reaction to the news, and subsequent buying pressure, not the stock transaction itself. It seems likely that once the exuberance cools down, the SP will pull back, where to I can't say. Somewhere between $25 and $30 would be my bet, but this is not financial advice, I'm just spitballing here.
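You can also sanity-check the headline numbers against each other (Intel's total share count below is implied from the reported ~5% figure, not taken from their filings):

```python
# Implied share math from the reported deal terms.
investment = 5_000_000_000   # $5B
price = 23.28                # $ per share
stake = 0.05                 # reported ~5% ownership

shares_bought = investment / price
print(f"Shares bought: {shares_bought / 1e6:.0f}M")                  # ~215M
print(f"Implied total shares: {shares_bought / stake / 1e9:.2f}B")  # ~4.30B
# The repricing of the other ~4.1B shares (from ~$24.90 toward $32) is
# worth far more to existing holders than any discount Nvidia got on 215M.
```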
Nowadays I always wonder to what extent such deals are actually driven by market considerations and to what extent it's catering to the Trump administration. Token investments into this state enterprise named Intel seem to be a practical way to curry goodwill with the autocrats.
No idea what to think of this. I don't want Intel to die, but what will this do to the GPU business they're competing with NVIDIA on? At worst, this leads to even more consolidation.
NVIDIA is Jensen Huang's life, and he is probably the best CEO in the USA. But he should be careful. Possible shareholder lawsuits come with discovery. NVIDIA's sales to CoreWeave, for example, a company they hold shares in, are starting to look a lot like self-dealing.
Also, since this Intel deal makes no sense for NVIDIA, a good observer would notice that lately he seems to spend more time on Air Force One than with NVIDIA teams. The leak of any evidence showing this was an investment ordered by the White House would make his company hostage to future demands from the current corrupt administration. The timing is already incredibly suspicious.
We will know for sure he has become a hostage if the next NVIDIA investment is in World Liberty Financial.
INTC is a strategically important company. They won't be allowed to fail. Of course, that doesn't mean the stock is a good investment. During the GFC, all the equity holders were wiped out while all the bond holders got their money back. Figure that one out.
Perhaps, but you do understand that the probability of every tranche, regardless of seniority, getting paid in full while the equity gets nothing is zero. It's mathematically impossible to pin the waterfall like this.
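Sketching the waterfall makes the point obvious (the tranche sizes and asset figure below are invented):

```python
# Liquidation waterfall: value flows down the priority stack, senior first;
# equity takes the residual. If every bond tranche is whole, equity gets
# assets minus total claims, which is zero only at exactly that knife-edge.
def waterfall(assets, tranches):
    payouts, remaining = [], assets
    for name, claim in tranches:           # ordered senior -> junior
        paid = min(claim, remaining)
        payouts.append((name, paid))
        remaining -= paid
    payouts.append(("equity", remaining))  # residual claim
    return payouts

tranches = [("senior bonds", 50), ("junior bonds", 30)]  # $B, invented
for name, paid in waterfall(assets=85, tranches=tranches):
    print(f"{name}: ${paid}B")
# senior bonds: $50B, junior bonds: $30B, equity: $5B.
# "Bonds whole, equity zero" requires assets to land on exactly $80B.
```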
With good enough lawyers, "mathematically impossible" is practically relative. Assume the game is rigged and play accordingly. If it doesn't make sense, it makes sense.
If you wanted to acquire Intel you'd do it now. Maybe Intel's future products are garbage and they do worse - but the upside seems pretty high otherwise. This seems like a bit of a firesale price to acquire an advanced fab and CPU maker. Sure, it's Intel and they haven't been doing great, but companies with solid reliable outlooks don't trade this cheaply.
Of course, I would kind of hope/expect antitrust to object, given that Intel makes both GPUs and CPUs, and Nvidia has dipped its toes into CPU production as well.
Intel still has to go through a lot of reorg (i.e. massive cuts) to get to a happy place, and this is what their succession of CEOs have been procrastinating over.
I recall reading a reddit comment (an authoritative source, I know) that claimed the reason Intel's e-cores are crushing it is that they actually synthesise them, while the P-cores are a bunch of bespoke circuits bodged together.
One wonders just how bad things must have been internally for that to be the state of one of their core IPs in this day and age...
Not even the government at this point. The oligarchs are now in full control of the US and are dividing up their kingdoms. The plans for gulags for detractors are also being drawn up.
> This is needlessly divisive and devoid of any factual basis. No gulags will exist and you know it.
What about "Alligator Alcatraz", that has been called "concentration camp" [1] (so comparable with a gulag), or where the Korean detainees from the raid on the Hyundai/LG plant ended up, alleging utterly horrible conditions [2]? And there's bound to be more places like the latter, that was most likely just the tip of the iceberg and we only know about the conditions there because the South Korean government raised a huge stink and got the workers out of there.
Okay, Alcatraz 2.0 did get suspended in August to my knowledge, but that's only temporary. It's bound to get the legal issues cleaned up and then be re-opened - or the case makes its way through to the Supreme Court with the same result to be expected.
I do not agree with that. In some cases it is acceptable to detain non-citizens for immigration-related offenses, but only if they receive due process to establish that they indeed should be detained.
Any denial of due process to any person is a gross violation of our most important right. Without the guarantee of due process to everyone, no one has any rights because those in power can violate rights at a whim.
There have been reported cases where ICE just ignored people's legal residence status, or where they snatched up citizens who didn't have paperwork on them, just for "walking while black".
ICE doesn't reliably make any distinction, not since they hired thugs off of the streets and issued arrest quotas. Doesn't matter if the arrested have to be released later on.
The U.S. government won’t have a seat on the board and agreed to vote with Intel’s board on matters requiring shareholder approval “with limited exceptions.”
Why? That is an example of a bad engineering company being acquired and then poisoning the quality of the acquirer with its toxic, low-quality, corporate-politics-above-engineering culture.
There have been a lot of mergers where that has not happened.
There are two scenarios here. In one, the AI bubble bursts (so Nvidia is overpriced now) and almost any value stock deal is good for them. In the other, it doesn't, and this gives them a limited hedge against problems with their most critical strategic partner (TSMC).
It looks like a good deal either way and in any amount. But of course I am no expert.
I suppose the problem is Intel doesn't actually have the fab capacity anyway. They were building it, but that's all on ice now, and probably wasn't close to TSMC anyway, I'd guess.
This all ignores the near complete lack of product out of their advanced processes as well.
This is a technology forum first and foremost. I know it might not look that way given the recent flood of political activism articles. But, in the technology field, this is pretty big news. This stake makes Nvidia one of Intel's biggest shareholders.
It's a good deal for Nvidia, because custom x86 server CPUs have optimization potential for AI computing clusters, which matters now that Nvidia has competitors that it didn't have just 2 years ago. I think the next several years at Nvidia will be ones of fending off growing competition.
They basically baked in a massive investment profit into the deal. When you factor in the stock jump since this announcement, Nvidia has already made billions.
Strategically this is good for the US and the West. Intel needs to survive because they have the only advanced fabs that aren't within reach of China.
But as a consumer, I hate this. Intel APUs have become quite good and are great for Linux users. I don't want Nvidia's bullshit infecting them. Jensen wants to be the Apple of chips and we'll all be worse off if Nvidia SoCs become ubiquitous.
I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.
Calling BS on "gaming not part of the equation". Several of my friends and I have exclusively gamed on integrated graphics. Sure, we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.
RDR2 is quite optimized. We spent a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is absolutely highly optimized, as exhibited by the large number of benchmarks on the web.
This is why I love HN. You get devs from any software or hardware project you care to name showing up in the comments.
Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.
(And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)
RDR2, Ghost of Tsushima, Black Myth: Wukong. These games will play at 40 to 50+ fps at 1080p low-to-medium settings on the Intel Arc iGPUs (no AI upscaling).
To anyone actually paying attention, iGPUs have come a long way. They are no longer an "I can play Minecraft" thing.
> Sure we don't play the most abusively unoptimized AAA games like RDR2.
Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). Or with the Borderlands 4 controversy, I thought there'd be way bigger fish to fry.
"Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.
It would be amusing to see nVidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.
> Tossing out their dGPUs, I don’t know if it is such a huge loss
It would be an enormous loss to the consumer/enthusiast GPU buyer, as a third major competitor has been improving a market coming off what feels like years and years of dreadful price/perf ratios.
You don’t say… on the very same day AMD launched a new RDNA3 card (RX 7700).
Literally a previous gen card.
AMD is slow and steady. They were behind many times, and many times they surprised everyone with amazing innovations, overtaking Intel. They will do it again, for both CPUs and GPUs.
I would guess Nvidia doesn't care at all about the iGPUs, so I agree they are probably not at risk. The dGPUs, though, I absolutely agree are in a risky spot. Perhaps Intel was planning to kill its more ambitious GPU goals anyway, but that seems extremely unhealthy for pretty much everyone except Nvidia.
We'd have to see an approximation of their cap table, but I've seen functional control over a company with just a hair over 10% ownership, given the voting patterns of the other stockholders.
5% by about any accounting makes you a very, very influential stockholder in a publicly traded company with a widely distributed set of owners.
5% of Ubisoft was all it took for Tencent to have very far-reaching ramifications.
They were felt at an IC level.
That would be antitrust, right?
Which would take an administration that cared about enforcing antitrust for the stated reasons behind antitrust laws.
This misses the forest for the trees IMO:
- The datacenter GPU market is 10x larger than the consumer GPU market for Nvidia (and it's still growing). Winning an extra few percentage points in consumer is not a priority anymore.
- Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.
- Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshoring, which significantly improves Nvidia's supplier negotiation position and hedges geopolitical risk.
Not necessarily true. This might be a Microsoft funding a bankrupt Apple kind of moment.
American competition isn't zero-sum, and it's in Nvidia's best interest to keep the market healthy.
> American competition isn't zero-sum, and it's in Nvidia's best interest to keep the market healthy.
Looking at Google's recent antitrust settlement, I'm not sure this is true at present.
Nvidia's options are: fund your competition to keep the market dynamic, or let the government do it by breaking you apart.
So yes. That's how American competition works.
It isn't a zero-sum game. We try to create a market environment that is competitive and dynamic.
Monopolies are a threat to both the company and a free, open, dynamic market. If Nvidia feels it could face an antitrust suit, which is reasonable, it is in its best interest to fund the future of Intel.
That's American capitalism.
> or let the government do it by breaking you apart.
Looking at Google's recent antitrust settlement, I'm not sure this is true at present.
Because the recent settlement determined, in my opinion correctly, that the market is still dynamic and competitive.
Google search is genuinely being threatened.
Google is not a monopoly, not entirely.
If AI usage also starts accruing to Google then there should be a new antitrust suit.
There are at least 2 more antitrust suits against Google ongoing. One is about to enter the remedies phase in Virginia.
I can’t imagine Nvidia has any concerns about that with the current administration.
Will Nvidia continue to exist beyond the current administration? If yes, then would it be prudent to consider the future beyond the current administration?
But it did when Biden was in office?
Google literally "won" antitrust case ???
The fact that Google pays Firefox annually means it's in Google's best interest that there be no monopoly, says the judge.
One interesting parallel is Intel and AMD back in 1991 over x86, which is the reason AMD is today allowed to produce x86 at all without massive patent royalties to Intel. [Asianometry](https://youtu.be/5oOk_KXbw6c) has a nice summary of it.
Nvidia is leaning more into data centres, but lacks a CPU architecture and expertise. Intel is struggling financially, but has knowledge in iGPUs and a vast number of patents.
They could have a lot to give one another, and it's a massive win if it keeps Intel afloat.
Microsoft wasn't funding a bankrupt Apple; Microsoft was settling a lawsuit with Jobs just on the cusp of the DOJ monopoly lawsuit. Microsoft had been stealing and shipping Apple QuickTime source code.
https://www.theregister.com/1998/10/29/microsoft_paid_apple_...
> handwritten note by Fred Anderson, Apple's CFO, in which Anderson wrote that "the [QuickTime] patent dispute was resolved with cross-licence and significant payment to Apple." The payment was $150 million
Wow quicktime... That's a name I haven't heard for a long time.
You might want to re-read about that Apple-Microsoft incident.
QuickTime got stolen by an ex-Apple employee, and in return Apple had Microsoft commit money and promise to keep the Office suite available on macOS/OS X.
According to [0] it was a contractor working for both Apple and Microsoft. Not an ex-Apple employee but still an interesting read, if true.
[0] https://thisdayintechhistory.com/12/06/apple-sues-over-quick...
> Nvidia is actively hostile to drivers on Linux
Nvidia is contributing to Nova, the new Nvidia driver for GSP-based hardware.
https://rust-for-linux.com/nova-gpu-driver
Alexandre Courbot, an Nvidia dev, is co-maintainer.
https://www.phoronix.com/news/NOVA-Core-Co-Maintainer
This seems like it could be a long term existential threat for AMD. AMD CPU + GPU combos are finally coming out strong, both with MI300+ series in the supercomputing space, Strix Halo in laptops, etc. They have the advantage of being able to run code already optimized for x86 (important for gamers and HPC code), which NVIDIA doesn't have. Imagine if Grace Blackwell had x86 chips instead of Arm. If NVIDIA can get Intel CPUs with its chip offerings, it could be poised to completely take over so many new portions of the market/consolidate its current position by using its already existing mindshare and market dominance.
The article hints at it, but my guess would be that this investment is aimed at Intel foundry and getting it to a place where NVIDIA can eventually rely on it over TSMC, with the ownership stake largely there to give them upside if/when Intel stock goes up on news of an NVIDIA contract, etc. It isn't that uncommon an arrangement for enterprise deals of such potential magnitude. Long-term, however, even without NVIDIA making the call, that could definitely have the effect of leading Intel to divest from directly competing in as many markets, i.e. Arc.
For context, I highly recommend the old Stratechery articles on the history of Intel foundry.
My first thought was also that this relates to Intel's foundry business. Even if only to be able to use it in price negotiations with TSMC (it's hard to threaten to go elsewhere when there is no elsewhere left).
You absolutely nailed it IMHO. I wish I had more upvotes to give. I guess time will tell, but this seems like a clear conflict of interest.
This seems more like the deal where Microsoft invested in Apple. It’s basically charity and they will flip it in a few years when Intel gets back on their feet.
If only antitrust laws existed
Or if monopoly laws like copyright didn't
and were enforced
Reading this after that memo about China's attitude to Nvidia is actually chilling. They just don't care, do they?
Something about this reminds me of other industry gobbling purchases. None of them ever turned out better for the product, price or general well being of society.
Why would they wanna kill Intel? AMD has better CPUs and also does GPUs better than Intel, no? (Yes, I may be missing things, please do tell!)
Microsoft’s investment in Apple was helpful for the world.
As an Apple user (and even an Apple investor), I'd rather that Apple went out of business back then. If we could re-roll the invention of the (mainstream) smartphone, maybe we'd get something other than two monopolistic companies controlling everything.
For instance, maybe if there were 4 strong vendors making the devices with diverse operating systems, native apps wouldn't have ever become important, and the Web platform would have gotten better sooner to fill that gap.
Or maybe it'd have ended up the same or worse. But I just don't think Apple being this dominant has been good for the world.
Or... we could still be using blackberry-like devices without much in the way of active/touch interface development at all. Or worse, the Windows CE or Palm with the pen things.
Lol exactly. That poster should quickly realise he's got it pretty good given the alternatives.
No no, it's in NVIDIA's interest to ensure it's just good enough for the plebs, so they can continue to gouge the high end of the market.
Nah, Nvidia wouldn't do that.
It would invite a DOJ case.
That's something Jensen Huang can do away with by bringing a golden GPU statue to the White House.
Assuming the DoJ is functional and paying attention.
A usable top gaming Intel GPU at a good price is a myth :,(
I agree. As a Linux user who favors Intel hardware due to their Linux support, I gotta say the future looks bleak.
Well, AMD isn't going away yet, and they do seem to have finally realized the advantage of open-source drivers. But it's still very bad for competition and prices.
This is a death blow to Intel's GPU+AI efforts and should not be allowed by regulators. It is clear that Intel needs the downstream, low-cost GPU market segment to support a portfolio of chiplet-based AI chips, where the partially defective dies end up in consumer-grade GPUs depending on manufacturing yield. NVidia's interest is now for Intel to enter neither the GPU market nor the AI market, which Intel was preparing for with its GPU efforts in recent years.
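The binning argument in rough numbers, using the standard Poisson die-yield model. The defect density, die size, and salvage fraction below are all invented for illustration:

```python
import math

# Poisson yield: P(k defects on a die) with mean m = defect_density * area.
# Zero-defect dies become the flagship SKU; single-defect dies can often be
# sold as cut-down consumer parts by fusing off the damaged block.
defect_density = 0.1    # defects per cm^2 (illustrative)
die_area = 6.0          # cm^2, a large AI-class die (illustrative)
salvage_fraction = 0.7  # share of die area that can be fused off (illustrative)

m = defect_density * die_area
p_flagship = math.exp(-m)                        # P(0 defects)
p_salvage = m * math.exp(-m) * salvage_fraction  # P(1 defect) in a fusable block

print(f"Flagship-grade dies: {p_flagship:.1%}")  # ~54.9%
print(f"Consumer-bin dies:   {p_salvage:.1%}")   # ~23.1%
# Without a consumer GPU line, that second bucket is scrap, not revenue.
```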
The US government is itself a major shareholder in Intel, and has every incentive to push Intel stock over its competitors. It's almost a certainty that Nvidia was forced into this deal by the government as well. We are way beyond regulation here.
Yep, there is absolutely no problem with that at all.
Never imagined politics so obviously manipulating the talking heads with nary a care about perception.
The US government isn’t (or at least shouldn’t be) profit-motivated anyway, so it isn’t obvious what their incentives are WRT Intel’s stock.
They want a source of chips for the wars they want to conduct that is not either controlled by the party they want to go war with, or way way closer to the party they want to go to war with than they are. Buying a chunk of Intel is a way of making sure they do the things the government wants that will lead to that outcome. Or at least so the theory goes; I've got my own cynicism on this matter and wouldn't dream of tamping down on anyone else's.
Right now if the US wants to go to war with China, or anyone China really really likes, they can expect with high probability to very quickly encounter major problems getting the best chips. AIUI the world has other fab capacity that isn't in Taiwan, and some of it is even in the US, but they're all on much older processes. Some things it's not a problem that maybe you end up with an older 500MHz processor, but some things it's just a non-starter, like high-end AI.
Sibling commenters discussing profits are on the wrong track. Intel's 2024 revenue, not profits, was $53.1 billion. The Federal Government in 2024 spent $6,800 billion. No entity doing $1.8 trillion in 2024 in deficit spending gives a rat's ass about "profits". The US Federal government just spends what it wants to spend, it doesn't have any need to generate any sort of "profits" first. Thinking the Federal government cares about profits is being nowhere near cynical enough.
This is generally true even setting side the "war with China" angle. Intel is a large domestic company employing hundreds of thousands in a very critical sector, and the government has every incentive to prevent it from failing. In the last two decades we've bailed out auto companies and banks and US Steel (kinda) for the same reason.
> Right now if the US wants to go to war with China
The US is desperate to not have that war, because they spent so long in denial about how sophisticated China has become that it would be a total humiliation. What you see as the US wanting war is them simply playing catch up.
Concisely put. This is exactly the reasoning. The US is preparing for a potential war with China in 2026 or 2027, and this is how it is beginning preparations.
Sure, but this is interesting independent of the government holding Intel stock.
The US government always ought to have the interest of US companies in mind, their job is to work in the interest of the voters and a lot of us work for US companies.
They can buy enough stock to shift the price, then use that as a lever to control their own investments' prices (and thence profits). Like they've done with tariffs.
That sounds more like an abuse of government powers for individual gain than any legitimate government interest. If that was the plan it would make just as much sense to short a company and then announce a plan to put them under greater regulatory scrutiny.
You think they haven't done that sort of thing yet?
Shouldn't be, yes. Isn't? Have you seen the rhetoric around tariffs? A lot of people thought they wanted the government run like a business, so welcome to the for-profit government society.
What happens now if one of these companies implodes? Does it pull everything with it?
Why would anything that isn't Intel implode? And what's "everything"?
A plateau in AI development leading to another AI winter, causing dotcom bubble 2: electric boogaloo.
too big to fail
"too big to fail" https://en.wikipedia.org/wiki/Evergrande_Group
Well, the AI bubble will eventually pop, since none of the major AI chatbots are remotely profitable, even on OpenAI's eyewatering $200/month plan, which very few have been willing to pay for, and even on that plan OpenAI is still losing money. And when it pops, so will Nvidia's stock; it's only a matter of time.
The AI hype train was built on the premise that AI would progress linearly and eventually end up replacing a lot of well-paid white-collar work, but it has failed to deliver on that promise so far, and progress has flatlined or sometimes even gone backwards (see GPT-5 vs 4o).
FAANG companies can only absorb these losses for so long before shareholders pull out.
The AI bubble pop is probably not something NVIDIA is super looking forward to, but of anybody near the bubble they are the least likely to really get hurt by it.
They don’t make AI chips really, they make the best high-throughput, high-latency chips. When the AI bubble pops, there’ll be a next thing (unless we’re really screwed). They’ve got as good chance of owning that next thing as anybody else does. Even better odds if there are a bunch of unemployed CUDA programmers to work on it.
> AI will replace a lot of well paid white collar work, but it failed to deliver on that promise
This is comically premature.
>This is comically premature.
When you follow the progress in the last 12 months, it really isn't. Big AI companies spent "hella' stacks" of cash, but delivered next to no progress.
Progress has flatlined. The "rocket to the moon" phase has already passed us by now.
The white collar worker doesn't need to be replaced for the bots to be profitable. They just need to become dependent on the bots to increase their productivity to the point where they feel they cannot do their job without the chatbot's help. Then the white collar worker will be happy to fork over cash. We may already be there.
Also never forget that in technology moreso than any other industry showing a loss while actually secretly making a profit is a high art form. There is a lot of land grabbing happening right now, but even so it would be a bit silly to take the profit/loss public figures at face value.
>We may already be there.
Numbers prove we aren't. Sales figures show very few customers are willing to pay $200 per month for the top AI chatbots, and even at $200/month OpenAI is still taking a loss on that plan, so they're still losing money even with top-dollar customers.
I think you're unaware of just how unprofitable the big AI products are. This can only go on for so long. We're not in the ZIRP era anymore, where SV VC-funded unicorns could be unprofitable indefinitely and endlessly burn cash on the idea that once they eventually beat all competitors in the race to the bottom and become monopolies, they can finally turn a profit by squeezing users with higher real-world prices. That ship has sailed.
I don't think you can confidently say how it will pan out. Maybe OpenAI is only unprofitable at the $200/month tier because those users are using 20x more compute than the $20/month users. OpenAI claims that they would be profitable if they weren't spending on R&D [1], so they clearly can't be hemorrhaging money that badly on the service side, if you take that statement as truthful.
[1] https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat...
"OpenAI claims that they would be profitable if they weren't spending on R&D "
Ermmm dude, they are competing with Google. They have to keep reinvesting, otherwise Google captures the users OAI currently has.
Free cash flows matter, not accounting earnings. On an FCFF basis they are largely in the red, which means they have to keep raising money; at some point somebody will turn around and ask the difficult questions. This cannot go on forever.
And before someone mentions Amazon... Amazon raised enough money to sustain its reinvestment before it eventually got to the place where its EBIT(1-t) was greater than reinvestment.
This is not at all what's going on with OAI.
Regulators? In this administration?
There is no such thing.
Oh, there absolutely is where freedom of the press is concerned. Look no further than the new 'bias monitor' at CBS.
Wdym FCC just shut down that antifa Jimmothy Kimmithy.
Not shut down; regulated.
https://www.youtube.com/watch?v=1plPyJdXKIY
The FCC does not have the power to shut down broadcasts based on their content.
It does have the power to intimidate broadcasters and pressure them in a variety of ways.
Ostensibly it does.
Sure seems like they're trying to invent one
He didn’t say anything violent. Have you watched the monologue?
Even if he did (which he didn’t), I don’t see Fox shutting down anything when one of their presenters recently stated, on air, that we should euthanize our homeless population.
To be clear (not that I agree with this situation): Fox News (where that presenter works) is a cable network, beholden to the cable providers but not a broadcaster. The FCC has relatively little leverage to regulate it, because it does not rely on broadcast licenses.
ABC is a broadcast network. It relies on a network of affiliates (largely owned by a few big companies) who selectively broadcast its programming both over the airwaves and to cable providers. Those affiliates have individual licenses for their radio broadcasting bandwidth which the FCC does have leverage over (and whose content the FCC has a long history of regulating, but not usually directly over politics, e.g. public interest requirements, profanity, and obscenity laws).
To be fair I don't see what the FCC has to do with it. This is classic Manufacturing Consent behavior.
Of course I watched it, many times. I didn't say he said anything directly violent, but he spread hateful disinformation about someone's death, entirely against the FBI's findings and common sense, during a time of the highest temperatures in a while, just to try to win the attention of people who would rather not look in the mirror.
This is exactly the kind of disingenuous, dehumanizing behavior that radicalizes people like Tyler. And saying that right now would be like Reagan getting into a spat about something personal during the Cold War.
Did you actually watch the clip? Or are you just repeating what you heard on social media?
Yes I did. The whole clip. A few times. Not easy to watch.
Where exactly did he say anything remotely violent
If you watch the clip and know what's going on in the US right now, spreading such vile disinformation does nothing but raise the temperature.
That's the deciding reason he got shut down, too: an absolute inability to read the "room", even though what he said would have been ugly at any point.
Actually, he didn't. He made fun of Trump ditching the memorial service...
Which Trump did.
You clearly didn't see the clip; what you describe was just a part of it, and there Trump appears to simply not have heard the question.
Intel had an opportunity to differentiate themselves by offering more VRAM than Nvidia is willing to put in their consumer cards. It seemed like that was where Battlemage was going.
But now, are they really going to undermine this partnership for that? Their GPUs probably aren't going to become a cash cow anytime soon, but this thing probably will. The mindset among American business leaders of the past two decades has been to prioritize short-term profits above all else.
It may be that Nvidia doesn't really see Intel as a competitor. Intel serves a part of the GPU market that Nvidia has no interest in. This reminds me a bit of Microsoft's investment in Apple: Microsoft avoided the scorn of regulators by keeping Apple around as a competitor, and if Apple succeeded, great, they'd make money off the deal.
I remember when I was studying for an MBA, a professor was talking about the intangible value of a brand, and finance, and how the two reflect on each other. At some point we were decomposing the parts of a balance sheet, and someone asked if you could sell the goodwill to invest in something else. The answer was, of course, no. Well, America has proven us wrong. The way you sell the goodwill is basically enshittification: you quickly burn all your brand reputation by lowering your costs with shittier products. Your goodwill goes to zero, but your income increases, so the stock goes up. The CEO gets a fat bonus for it, even though the company itself is destroyed, then quickly abandons ship and does the same at their next company. Rinse and repeat: infinite money!
We always called this "monetizing the brand", and it's been annoying me since at least when Sperry went private equity and the shoes stopped being multi-year daily drivers.
I don't follow how it's a death knell for Intel's AI chips. Nvidia bought shares, not a board seat. Maybe that's the plan, but if you take the example of Microsoft buying Apple shares, that only gave Apple a lifeline to build better. I do understand nvidia wants to have the whole GPU market to themselves, but how will they do it?
> Nvidia bought shares, not a board seat.
I think the assumption there is that the strategic partnership that is part of the deal would in effect preclude Intel from aggressively competing with NVIDIA in that market, perhaps with the belief that the US government's financial stake in Intel would also lead to reduced anti-trust scrutiny of such an agreement not to compete.
They literally bought board seats: not today, but shares entitle you to vote on board members at the next shareholder meeting. And $5bn of shares buys you a lot of votes.
$5bn may not buy a huge amount of voting power, but if there are close votes on important things, then it could be enough to affect the company. Keeping one's enemies closer, regardless of voting, can also help overall.
The likelihood that Intel AI was going to catch up with efforts like AWS Trainium, let alone Nvidia, was already vanishingly small. This gives Intel a chance at maintaining leading-edge fab technologies.
I feel bad for gamers - I’ve been considering buying a B580 - but honestly the consumer welfare of that market is a complete sidenote.
I don't agree. OneAPI gets a lot of things right that ROCm doesn't, simply because ROCm is a 1:1 rip of what nvidia provides (warts and historical baggage included), whereas OneAPI was thoughtfully designed and did away with all of that. Intel has a strong history in networking, much stronger than Xilinx/AMD, and really was the best hope we had for an open standard to replace nvidia's hellscape.
> This gives intel a chance at maintaining leading edge fab technologies.
I don't think so:
> The chip giant hasn’t disclosed whether it will use Intel Foundry to produce any of these products yet.
It seems pretty likely this is an x86 licensing strategy for nvidia. I doubt they're going to be manufacturing anything on Intel fabs. I even wonder if this is a play to get an in with Trump by "supporting" his strategy of nationalizing Intel.
nvidia doesn't need x86; they're moving forward on aarch64 and won't look back. For example, one of the headlines from CUDA 13 is that SBSA can be targeted from all toolkits, not as a separate download, which is important for making it easy to target Grace. They have C2C silicon on Grace for native host-side NVLink. They're not looking back.
They're clearly looking back though, investing in Intel and announcing quite substantial partnerships. Maybe they're not looking back for technical reasons, but they are looking back.
> The likelihood intel AI was going to catch up with efforts like AWS Trainium, let alone Nvidia
...and yet Nvidia is not gambling with the odds. Intel could have challenged Nvidia on performance-per-dollar or per watt, even if they failed to match performance in absolute terms (see AMD's Zen 1 vs Intel)
The regulators want this because it's bolstering the last domestically owned fab.
Any down-the-road repercussions be damned, from their perspective.
Intel doesn’t deserve anything. They deserve to disappear based on how they ran the company as a monopoly. No lessons were learned.
That was quite a long time ago! Intel going down the chutes now isn’t an effective punishment for how it behaved under Andy Grove and won’t deter others from Grove-like behaviour. Instead it’ll just mean even less restraint on any of the big players with market power now, like nVidia, AMD and TSMC.
This is likely true in a vacuum, but US national security concerns means the US needs Intel.
Consumer GPUs are totally different products from the high-end GPUs now. Intel has failed in the GPU market and has effectively zero market share, so it is not actually clear there is an antitrust issue in that market. It would be nice if there were more competition, but there are other players like AMD and a long tail of smaller ones.
>Consumer GPUs are totally different products from the high-end GPUs now. Intel has failed in the GPU market and has effectively zero market share, so it is not actually clear there is an antitrust issue in that market. It would be nice if there were more competition, but there are other players like AMD and a long tail of smaller ones
I'm sorry that's just not correct. Intel is literally just getting started in the GPU market, and their last several releases have been nearly exactly what people are asking for. Saying "they've lost" when the newest cards have been on the market for less than a month is ridiculous.
If they are even mediocre at marketing, the Arc Pro B50 has a chance to be an absolute game changer for devs who don't have a large budget:
https://www.servethehome.com/intel-arc-pro-b50-review-a-16gb...
I have absolutely no doubt Nvidia sees that list of "coming features" and will do everything they can to kill that roadmap.
"Intel getting started in GPU market" is like a chain smoker quitting smoking. It's so easy that they have done it 20 times!
The latest Arc GPUs were doing well, and were absolutely an option for entry/mid-level gamers. I think lack of maturity was one of the main things keeping sales down.
I've been seeing a lot of homelab types recommending their video cards for affordable Plex transcoding as well.
Intel has been making GPUs for over 25 years. Claiming they are just getting started is absurd.
To that point, they've been "just getting started" in practically every chip market other than x86/x64 CPUs for over 20 years now, and have failed miserably every time.
If you think Nvidia is doing this because they're afraid of losing market share, you're way off base.
There's a very big difference between the MVP graphics chips they've included in CPUs and the Arc discrete GPU.
Sure, but claiming they have literally just started is completely inaccurate.
They've been making discrete GPUs on and off since the 80s, and this is at least their 3rd major attempt at it as a company, depending on how you define "major".
They haven't even just started on this iteration, as the Arc line has been out since 2022.
The main thing I learned from this submission is how much people hate Nvidia.
I love GPU differentiation, but this is one of those areas where Nvidia is justified in shipping less VRAM. With less VRAM, you can use fewer memory controllers to push higher speeds on the same memory!
For instance, both the B50 and the RTX 2060 use GDDR6 memory. But the 2060 has a 192-bit memory bus, and enjoys ~336 GB/s bandwidth because of it.
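To make the arithmetic concrete, here is a quick back-of-envelope sketch in Python. The 14 Gbps per-pin rate is the commonly cited RTX 2060 spec-sheet figure, assumed here rather than stated above:

    # GDDR6 bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
    bus_width_bits = 192     # RTX 2060 memory bus
    data_rate_gbps = 14      # per-pin GDDR6 rate (assumed spec-sheet figure)
    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
    print(f"{bandwidth_gb_s:.0f} GB/s")  # 336 GB/s, matching the figure above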
Tell me again, how fast can you move data from system ram to vram?
Over PCIe 5.0 x8, ~31.5 GB/s.
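That number falls straight out of the link math. A minimal sketch, assuming PCIe 5.0's standard 32 GT/s per lane and 128b/130b line coding, and ignoring packet/protocol overhead:

    # Peak PCIe 5.0 x8 throughput, before protocol overhead.
    lanes = 8
    raw_gt_s = 32              # PCIe 5.0 transfers per second per lane (GT/s)
    encoding = 128 / 130       # 128b/130b line-code efficiency
    throughput_gb_s = lanes * raw_gt_s * encoding / 8  # bits to bytes
    print(f"{throughput_gb_s:.1f} GB/s")               # ~31.5 GB/s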
I don't know what anybody would do with such a weak card.
My RTX 5090 is about 10x faster (measured by FP32 TFLOPS) and I still don't find it to be fast enough. I can't imagine using something so slow for AI/ML. Only 2.2 tokens/sec on an 8B parameter Llama model? That's slower than someone typing.
I get that it's a budget card, but budget cards are supposed to at least win on a pure price/performance ratio, even with a lower baseline performance. The 5090 is 10x faster but only 6-8x the price, depending on where in the $2,000-3,000 price range you can find one.
I feel as though you are measuring tokens/s wrong, or have a serious bottleneck somewhere. On my i5-10210U (no dedicated graphics, at standard clock speeds), I get ~6 tokens/s on phi4-mini, a 4B model. That means my laptop CPU, with a power draw of 15 watts and released 6 years ago, is performing better than a 5090.
> The 5090 is 10x faster but only 6-8x the price
I don't buy into this argument. A B580 can be bought at MSRP for $250. An RTX 5090 from my local Microcenter is around $3,250. That puts it at around 1/13th the price.
Power costs can also be a significant factor if you choose to self-host, and I wouldn't want to risk system integrity for 3x the power draw, 13x the price, a melting connector, and Nvidia's terrible driver support.
EDIT: You can get an RTX 5090 for around $2,500. I doubt it will ever reach MSRP though.
The B60 is ridiculously good for scientific workloads. It has 50% more FP64 FLOPS than a 5090 and 3/4 the VRAM at 1/4 the price.
You have outlier needs if an RTX, the fastest consumer-grade card, is not good enough for you.
The Intel card is great for 1080p gaming. Especially if you're just playing Counter-Strike, indie games, etc., you don't need a beast.
Very few people are trying to play 4K Tomb Raider on ultra with a high refresh rate.
FWIW, my slowness is because of quantizing.
I've been using Mistral 7B, and I can get 45 tokens/sec, which is PLENTY fast, but to save VRAM so I can game while doing inference (I run an IRC bot that allows people to talk to Mistral), I quantize to 8 bits, which then brings my inference speed down to ~8 tokens/sec.
For gaming, I absolutely love this card. I can play Cyberpunk 2077 with all the graphics settings set to the maximum and get 120+ fps. Though when playing a much more graphically intense game like that, I certainly need to kill the bot to free up the VRAM. But I can play something simpler like League of Legends and have inference happening while I play with zero impact on game performance.
I also have 128 GB of system RAM. I've thought about loading the model in both 8-bit and 16-bit into system RAM and just swap which one is in VRAM based on if I'm playing a game so that if I'm not playing something, the bot runs significantly faster.
Hold on, you're only getting 45 tokens/sec with Mistral 7B on a 5090 of all things? That gets ~240 tokens/sec with Llama 7B quantized to 4 bits on llama.cpp [1] and those models should be pretty similar architecturally.
I don't know exactly how the scaling works here but considering how LLM inference is memory bandwidth limited you should go beyond 100 tokens/sec with the same model and a 8 bit quantization.
1. https://github.com/ggml-org/llama.cpp/discussions/15013
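For intuition on why bandwidth is the ceiling: each generated token streams the full weight set out of VRAM once, so an upper bound is bandwidth divided by the weight footprint. A rough sketch with assumed spec-sheet numbers (none of these figures come from the comments above):

    # Upper bound on decode speed for a memory-bandwidth-bound LLM.
    vram_bandwidth_gb_s = 1790   # RTX 5090 spec-sheet bandwidth (assumed)
    params_billion = 7           # 7B-parameter model
    bytes_per_weight = 1         # 8-bit quantization
    weights_gb = params_billion * bytes_per_weight
    ceiling_tok_s = vram_bandwidth_gb_s / weights_gb
    print(f"~{ceiling_tok_s:.0f} tokens/s ceiling")  # ~256 tokens/s

That lines up with the ~240 tokens/sec figure cited above; landing at 45 suggests the bottleneck is something other than memory bandwidth.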
My understanding is that quantizing lowers memory usage but increases compute usage because it still needs to convert the weights to fp16 on the fly at inference time.
Clearly I'm doing something wrong if it's a net loss in performance for me. I might have to look more into this.
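One way to narrow it down is to time a short generation directly and compare quant levels. A minimal sketch using the llama-cpp-python bindings; the model path is a placeholder, and the package needs to be built with GPU offload support:

    import time
    from llama_cpp import Llama

    # Load a local GGUF quant; n_gpu_layers=-1 offloads every layer to the GPU.
    llm = Llama(model_path="mistral-7b-q8_0.gguf", n_gpu_layers=-1, verbose=False)

    start = time.time()
    out = llm("Write a haiku about GPUs.", max_tokens=128)
    elapsed = time.time() - start

    n_tokens = out["usage"]["completion_tokens"]
    print(f"{n_tokens / elapsed:.1f} tokens/s")  # rerun with other quants to compare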
> My RTX 5090 is about 10x faster (measured by FP32 TFLOPS) and I still don't find it to be fast enough. I can't imagine using something so slow for AI/ML. Only 2.2 tokens/sec on an 8B parameter Llama model? That's slower than someone typing.
It's also orders of magnitude slower than what I normally see cited by people using 5090s; heck, it's even much slower than I see on my own 3080 Ti laptop card for 8B models, though I usually won't use more than an 8bpw quant for that size of model.
Yeah, I must be doing something wrong. Someone else pointed out that I should be getting much better performance. I'll be looking into it.
> it is not actually clear there is an antitrust issue in that market
Preempting a (potential) future competitor from entering a market is also an antitrust issue.
Other than the market segmentation over RAM amounts, I don't see very much difference. There's some but there's been some for a long time. Isn't AMD re-unifying their architectures?
> There's some but there's been some for a long time. Isn't AMD re-unifying their architectures?
Yes.
> Other than the market segmentation over RAM amounts, I don't see very much difference.
The difference between CDNA and RDNA pretty much comes down to FP64 throughput and SR-IOV. Prior to RDNA, AMD GPUs were jacks of all trades with a compute bias, which made them bad for gaming unless the game was specifically written around async compute. For context, Vega 64 has more FP64 compute than the 4080.
I think if AMD had been able to get a solid market share of datacenter GPUs, they wouldn't have unified. This feels like the CDNA team couldn't justify its existence.
The alternative currently looks like cutting Intel up piecemeal to make a quick buck just to stay afloat. The GPU division is not profitable and may be destroyed if the overall financials don't improve.
Does Nvidia now have controlling interest? A bunch of board seats?
Why would it matter if not? This is a nice partnership. Each gets something the other lacks.
And it strengthens domestic manufacturing. Taiwan is going to be subsumed soon, and we need more domestic production now.
NVidia only owns 4% of Intel. They won't be able to dictate its direction.
If anything, it might be more of a strategic retreat or a hedged bet
Right now, for the US national interests, our biggest concern is that Intel continues to exist. Intel has been making crappy GPUs for 25 years. They weren’t going to start making great GPUs now.
Besides, who would actually use them if they don’t support CUDA?
Everyone designs better GPUs than Intel; Apple's ARM GPUs had been outpacing Intel for a decade even before the M series.
> They weren’t going to start making great GPUs now.
But that's exactly what they started doing with Battlemage? It's competitive in its price range and was showing generational strides.
> Besides, who would actually use them if they don’t support CUDA?
ML is starting to trend away from CUDA towards Vulkan, even on Nvidia hardware, for practical reasons (e.g. performance overhead).
Intel has been trying to make decent GPUs for 25+ years. No company is going to invest billions buying Intel GPUs - especially not the hyper scalers.
Why does it matter if Intel exists if they can't compete? AMD exists. The only point of hoping they remain is to create an environment of competition, as that drives development and progress.
Though fair and free markets are not at all what the current regime in the US believes in; instead it will be consolidation, leading to waste and little innovation or progress.
AMD doesn't have a foundry. They are irrelevant.
Well, I guess enjoy using your 3rd-world Intel GPUs. A shitty foundry is irrelevant.
Intel isn’t that far behind. But it is dumb to depend on fabs in a country that is just one Chinese missile away from getting destroyed.
That's most of the world, including the USA. https://en.wikipedia.org/wiki/China_and_weapons_of_mass_dest...
So you don’t see the difference in the threat level of China bombing and invading Taiwan - which they already claim they own - and China attacking the US directly?
I don't, because I'm not in the US. But my comment was in reply to the actual text of the grandparent, not some imagined subtext between the lines.
So it's just an imagined subtext that China, which has been rabble-rousing about taking over Taiwan, is more likely to attack a tiny island nation right next to it than to attack the US?
And why would they when TSMC is in both China and the US in some fashion?
And Taiwan is forbidding TSMC from building their cutting edge fabs in the US…
https://www.asiafinancial.com/taiwan-says-tsmc-not-allowed-t...
That may have changed since then. But do you really want to depend on a foreign government for chip manufacturing?
Wait, what? Intel GPU+AI efforts? People had to come together to fund the abandoned Intel SW development team. Intel GPUs are great at what they do, but they are no nvidia. I don't even think that was on the roadmap. Also, you don't know what nvidia wants. Maybe they want to flood the low end to destroy AMD, benefiting consumers. We just don't know.
The reason why Nvidia is buying now does not have to do anything with Arc or GPU competition. There are mainly two reasons.
1) This year, Intel, TSMC, and Samsung announced the yields of their latest fabs. Intel was the earliest, with 18A, while Samsung was the most recent. TSMC yielded above 60%, Intel below 60%, and Samsung around 50% (but Samsung's tech is basically a generation ahead and technically more precise), and Samsung could improve its yields the most due to the way it set up its processes, where 70% is the target. Until last year, Samsung was in second place; with Intel catching up so fast and taking Samsung's position, at least for this year, Nvidia bought Intel stock, which has been getting cheaper since COVID.
2) It's just generally good to diversify into your competitors. Every company does this, especially when the price is cheap.
I am curious where you get your information about Samsung being more “precise”.
I was recently looking into 2nm myself, and based on the Wikipedia article on 2nm, TSMC 2nm is about 50% more dense than the Samsung and Intel equivalents. They aren't remotely the same thing. Samsung 2nm and Intel 18A are about as dense as TSMC 3nm, which has been in production for years.
This information is a bit dated but ...
Since "nm" is meaningless these days, the transistor count/mm2 is below.
As reference: TSMC 3nm is ~290 million transistors/mm2 (MTr/mm2).
https://news.ycombinator.com/item?id=27063034
https://www.techradar.com/news/ibm-unveils-worlds-first-2nm-...
I think the Intel 7nm figure is unrealistic. If it were true, Intel wouldn't be "behind".
According to Wikipedia, Intel 7nm density is ~62 MTr/mm2. I cannot find the source WikiChip page mentioned in your referenced post.
FWIW, I am not in the semi industry and all my info are from Wikipedia https://en.m.wikipedia.org/wiki/7_nm_process https://en.m.wikipedia.org/wiki/2_nm_process
> I was recently looking into 2nm myself, and based on wikipedia article on 2nm, TSMC 2nm is about 50% more dense than the samsung and intel equivalent.
I did the math on TSMC N2 vs Intel 18A, and the former is 30% denser according to TSMC
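For a feel of what a density gap means in area terms, a 30% density advantage implies the same transistor budget fits in about 77% of the die area, all else equal. A trivial sketch (a logic-only simplification, since SRAM and analog scale differently):

    # Area implication of a density ratio, all else equal.
    density_advantage = 1.30           # "30% denser"
    area_ratio = 1 / density_advantage
    print(f"Same design needs ~{area_ratio:.0%} of the area")  # ~77%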
>2) It's just generally good to diversify into your competitors. Every company does this, especially when the price is cheap.
This definitely isn't a thing that every company does (or even close to every company).
Not every company, but the largest ones do.
Microsoft once owned a decent amount of Apple & Facebook for example.
Didn't Japanese camera makers do this and it pushed Olympus' board members to force the selloff of that division?
How do you know Intel's 18A yield if that is one of the biggest secrets?
> For personal computing, Intel will build and offer to the market x86 system-on-chips (SOCs) that integrate NVIDIA RTX GPU chiplets. These new x86 RTX SOCs will power a wide range of PCs that demand integration of world-class CPUs and GPUs.
https://www.intc.com/news-events/press-releases/detail/1750/...
What’s old is new again: back in 2017, Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped: https://www.tomshardware.com/news/intel-discontinue-kaby-lak...
I don't think this is Intel trying to save itself, it's nVidia. Intel GPUs have been in 3rd place for a long time, but their integrated graphics are widely available and come in 2nd place because nVidia can't compete in the x86 space. Intel graphics have been closing the gap with AMD and are now within what? A factor of 2 or less (1.5?)
IMHO we will soon see more small/quiet PCs without a slot for a graphics card, relying on integrated graphics. nVidia has no place in that future. But now, by dropping $5B on Intel they can get into some of these SoCs and not become irrelevant.
The nice thing for Intel is that they might be able to claim graphics superiority in SoC land since they are currently lagging in CPU.
Way back in the mid-late 2000s, Intel CPUs could be used with third-party chipsets not manufactured by Intel. This had been going on forever, but the space was particularly wild, with Nvidia being the most popular chipset manufacturer for AMD and also making in-roads for Intel CPUs. It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, Nvidia promptly bought the company and spun down operations.
This was all for naught, as AMD purchased ATi, shutting out all other chipsets, and Intel did the same. Things actually looked pretty grim for Nvidia at this point in time. AMD was making moves that suggested APUs were the future, and Intel started releasing platforms with very little PCIe connectivity, prompting Nvidia to build things like the Ion platform that could operate over an anemic PCIe x1 link. These really were the beginnings of strategic moves to lock Nvidia out of their own market.
Fortunately, Nvidia won a lawsuit against Intel that required Intel to have PCIe x16 connectivity on their main platforms for 10 years or so, and AMD put out non-competitive offerings in the CPU space such that the APU takeoff never happened. If Intel had actually developed their integrated GPUs or won that lawsuit, or if AMD had actually executed, Nvidia might well be an also-ran right around now.
To their credit, Nvidia really took advantage of their competitors' inability to press their huge strategic advantage during that time. I think we're in a different landscape at the moment. Neither AMD nor Intel can afford to boot Nvidia, since consumers would likely abandon them for whoever could still slot in an Nvidia card. High-performance graphics is the domain of add-in boards now and will be for a while. Process node shrinks aren't as easy, and cooling solutions are getting crazy.
But Nvidia has been shut out of the new handheld market and hasn't been a good total package for consoles, as SoCs rule the day in both those spaces, so I'm not super surprised at the desire for this pairing. But I did think nvidia had given up these ambitions and was planning to build an adjacent ARM-based platform as a potential escape hatch.
> It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, Nvidia promptly bought the company and spun down operations.
This feels like a 'brand new sentence' to me because I've never met an ALi chipset that I liked. Every one I ever used had some shitty quirk that made VIA or SiS somehow more palatable [0] [1].
> Intel started releasing platforms with very little PCIe connectivity,
This is also a semi-weird statement to me, in that it was nothing new; Intel already had an established history of chipsets like the i810, 845GV, 865GV, etc., which all lacked AGP. [2]
[0] - Aladdin V with its AGP instabilities; MAGiK 1 with its poor handling of more than 2 or 3 'rows' of DDR (i.e. two double-sided sticks of DDR turned it into a shitshow no matter what you did to timings; 3 rows usually were 'ok-ish' and 2 were stable).
[1] - SIS 730 and 735 were great chipsets for the money and TBH the closest to the AMD760 for stability.
[2] - If I had a dollar for every time I got to break the news to someone that there was no real way to put a Geforce or 'Radon' [3] in their eMachine, I could have had a then-decent down payment for a car.
[3] - Although, in an odd sort of foreshadowing, most people who called it a 'Radon', would specifically call it an AMD Radon... and now here we are. Oddly prescient.
ALi was indeed pretty much on the avoid list for me for most of their history. It was only when they came out with the ULi M1695, made famous by the ASRock 939Dual-SATA2, that they were a contender for best out of nowhere. One of the coolest boards I ever owned, and it was rock solid for me even with all of the weird configs I ran on it. I kind of wish I hadn't sold it, even today!
I remember a lot of disappointed people on forums who couldn't upgrade their cheap PCs as well, but there were still motherboards available with AGP to slot Intel's best products into. Intel couldn't just remove it from the landscape altogether (assuming they wanted to) because they weren't the only company making chipsets that supported Intel. IIRC Intel/AMD/Nvidia were not interested in making AGP+PCIe chipsets at all, but VIA/ALi and maybe SiS made them instead, because it was still a free-for-all space. Once that went away, Nvidia couldn't control their own destiny.
nvidia does build SOCs already. The AGXs and other offerings. I'm curious why they want intel despite having that technical capability of building SOCs.
I realize the AGX is more of a low power solution and it's possible that nvidia is still technically limited when building SOCs but this is just speculation.
Does anybody know the actual ground-truth reasoning for why Nvidia is buying into Intel despite the fact that nvidia can make their own SOCs?
Why is Nvidia partnering with Mediatek for CPU cores in the dgx spark? Different question, probably the same answer.
Nvidia just doesn't care about the console and handheld markets. They are unwilling to make customisations, and it's a low-margin business.
? https://blogs.nvidia.com/blog/nintendo-switch-2-leveled-up-w...
The point stands, they're not willing to make something that could go in an ROG Ally, for example.
The ROG Ally probably won't sell a million units. The Switch will sell 100 million. The Switch is the mobile market, like it or not.
Sometimes HN users appear to have absolutely zero sense of scale. Lifetime sales numbers of those are the equivalent of hours to days of Switch 2 sales.
You mean like this? https://www.rockpapershotgun.com/msi-claw-8-ai-plus-review
You can take a Nintendo Switch 1, hack open the bootloader, and install Linux with Vulkan-compatible drivers.
Make no mistake - there is no reason to do this besides shortening the hardware lifespan with Box86. But it is possible, most certainly.
Xe2 is superior to current AMD integrated already
I think the comparison was between Nvidia standalone graphics chips and Intel integrated graphics capabilities.
I'm curious, why do you type it as nVidia instead of NVIDIA or Nvidia ?
Intel hasn't had desktop GPUs out for long. Your timescale is off compared to how long AMD and Nvidia have had to polish their GPUs.
RIP Arc and Gaudi. There is no other way to read this. Fewer competitors => higher prices.
I think it is bad news for the GPU market (AMD has had a beachhead with their integrated solution here as they've lost out elsewhere) but good for x86 which I've worried would be greatly diminished as Intel became less competitive.
I just realized there's a worse possibility: they might offer it as the successor to the xx50/xx60 RTX GPUs, dropping CUDA support on the low end.
Absolutely. This is terrible news for high emission gamers, who have been living under the boot of Nvidia for decades.
That was targeted at supporting more tightly integrated and performant MacBooks. It flopped because Apple came up with the M1, not because it was bad per se.
The ryzen APUs had a rocky start but are properly good now, the concept is sound
Apple never shipped a product with that, but it made for an excellent Hackintosh.
To me, this just validates what AMD has been doing for over a decade. Integrated GPUs for personal computing are the way forward.
Stick some CUDA cores on the CPU and market it for AI?
> Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped
/me picturing Khaby Lame gesturing his hands at an obvious workaround.
Remember when Microsoft invested in Apple when Apple was down in the dumps? This is giving similar vibes. That deal was arguably what saved Apple near its nadir. I’m not a fan of Intel’s past monopolistic practices, but for the sake of sustaining competition in the CPU/GPU market, I hope this deal works out for them even half as well as the MS deal did for Apple.
>Remember when Microsoft invested in Apple when Apple was down in the dumps? This is giving similar vibes.
Doesn't feel the same, because the 1997 investment was arranged by Apple co-founder Steve Jobs. He had a long personal relationship with Bill Gates, so he could just call him to drop the outstanding lawsuits and get a commitment for future Office versions on the Mac. Basically, Steve Jobs, at the relatively young age of 42, was back at Apple in "founder mode" and made bold moves that the prior CEO, Gil Amelio, couldn't.
Intel doesn't have the same type of leadership. Their new CEO is a career finance/investor instead of a "new products new innovation" type of leader. This $5 billion investment feels more like the result of back-channel discussions with the US government where they "politely" ask NVIDIA to help out Intel in exchange for less restrictions selling chips to China.
> This $5 billion investment feels more like the result of back-channel discussions with the US government where they "politely" ask NVIDIA to help out Intel in exchange for less restrictions selling chips to China.
Stinks of Mussolini-style Corporatism to me.
This style of classical fascism, or economic fascism, or whatever the term is that differentiates it from the modern unrelated usage of fascism, being used in the US is a bit unnerving, and it's crazy that it usually comes from the Republican party, which claims to espouse free markets.
It also happened under G. W. Bush with banks and auto manufacturers, but the worst offense was under Nixon with his nationalization of passenger rail.
At least with the bank and car manufacturer bailouts the government eventually sold off their stocks, and with the Intel investment the government has non-voting shares, but the government completely controls the National Railroad Passenger Corporation (the NRPC, aka Amtrak), with the board members being appointed by the president of the United States.
We lost 20 independent railroads overnight, and created a conglomerate that can barely function.
Yeah, the thing about the economy is it's too big for one mind to grasp, you need statistics to make sense of it in aggregate.
If you fiddle and concentrate only on the top performers, the bottom falls out. Most of the US economy is still in small companies.
That's how post-WW2 France was actually rebuilt. You could also see big hints of that in the US WW2 economic effort, which couldn't have been done without the Government taking a direct hold of things and instituting central-ish planning.
You're speaking of what is referred to as neo-corporatism [0] and it's a tripartite, democratic process, not the fascist sort where everything is within and for the benefit of the state [1].
[0] https://en.wikipedia.org/wiki/Corporatism#Neo-corporatism
[1] https://en.wikipedia.org/wiki/Corporatism#Fascist_corporatis...
> democratic process,
There was not that much democracy in the French post-WW2 technocratic establishment, but I agree that they were not technically fascist (nor otherwise).
You're trying to pin this (hypothetical) as fascism.
Let's assume the Trump admin pressured Nvidia to invest in Intel.
The CHIPS Act (voted for by Democrats / Biden) gave Intel up to $7.8 billion of YOUR money (taxes) in the form of direct grants.
Was it more "Mussolini-style corporatism" to you or not?
There's a big difference between the government allocating taxpayer dollars by passing a bill and a president using their influence to force dealings between corporate entities that benefit the ruling party.
The parent comment is speculation. But yes, speculatively, a legislative act of investment would be less authoritarian than the whims of an executive that puts tariffs on your product constantly unless you do what he says.
Is the method by which it’s communicated what gives you negative feelings? Because this is an approach to handling the labor dumping that’s been allowed in nearly every industry since the 1980s, and it’s been used numerous times in the US and abroad. They typically only offer temporary relief, while domestic industries should be adjusting and better trade deals get negotiated. The last I checked, that’s been happening to some degree… but it also probably needs to be supported by the ability for companies to borrow money, which the Fed (until recently) seemed hell bent on preventing, while we continued to watch the job market burn to the ground. So cash flush businesses investing in each other to keep competition alive seems like a positive here. Maybe that’s just me?
My comment was only referring to the manner of implementation, not the positive or negative view of the investment.
It isn't the "method of communication". It's legislation vs. coercion (in the speculative scenario from the parent comment).
Most regulation is effectively coercion. The difference is regulation isn’t easily rolled back, whereas the current approach to modifying behavior is (as we’ve seen, numerous times in the last few months even). One is more tolerant of failure than the other.
There is one extreme where policy cannot be modified, and another that is rule by the whims of one person. The precedent of having the US government defined by the whims and whiplashes of one person is immensely harmful to our national credibility. It fucks with investment, immigration and education.
Except, of course, China has banned Nvidia.
https://www.ft.com/content/12adf92d-3e34-428a-8d61-c91695119...
I don't think that's an apt comparison, given that Microsoft and Apple were more direct competitors than Intel and Nvidia; the latter have a more symbiotic relationship. I think the rationale is closer to the competitor of my competitor is my friend -- they face two threats by AMD growing larger in the CPU market:
- a bigger R&D budget for their main competitor in the GPU market
- since Nvidia doesn't have their own CPUs, they risk becoming more dependent on their main competitor for total system performance.
> since Nvidia doesn't have their own CPUs, they risk becoming more dependent on their main competitor for total system performance.
This is why they built the Grace CPU - noting that they're using Arm's Neoverse V2 cores rather than their own design.
Here's Nvidia's CPUs, which are increasingly a required part of their data center offerings:
https://www.nvidia.com/en-us/data-center/grace-cpu/
Required in that Nvidia would like to sell them to you. But customers seem to be hesitant and prefer x86-based DGX and similar systems. At least from what I've heard and seen.
Nah, I have a feeling this is partly the result of the arm-twisting to be allowed to sell to China.
This is a big ask for a shrinking market- with the pressure that the Chinese government is putting on their domestic companies to not buy H20's, I'm not sure how big this is going to be going forward. 5 billion (plus whatever it costs to build these products) is a lot for a market that is probably going to be closed soon.
It's even more ironic when you remember in 2005 the tables were turned, and Intel was trying to buy Nvidia.
Does intel have someone who will return and change the course of the company or return to its original mission or something of that sort?
Oh, I bet Elon has been handed some ideas.
That Microsoft-Apple deal was part lifeline, part strategic insurance. Intel clearly needs a win, and Nvidia needs more control over its ecosystem without being chained to TSMC forever
All they need now is a CEO like Steve Jobs…
Jensen moves to intel …
Competition?
> Remember when Microsoft invested in Apple when Apple was down in the dumps?
Had Apple failed, Microsoft would probably have been found to have a clear monopolistic position. And Microsoft was already in hot water due to Internet Explorer, IIRC.
Yep. MSFT needed Apple because of antitrust issues.
Apple's demise would've nailed the case.
Possibly more curious than the investment:
> Nvidia will also have Intel build custom x86 data center CPUs for its AI products for hyperscale and enterprise customers.
Hell has frozen over at Intel. Actually listening to people that want to buy your stuff, whatever next? Presumably someone over there doesn't want the AI wave to turn into a repeat of their famous success with mobile.
In the event Intel ever does get US-based fabrication semi-competitive again (and the national security motivation for doing so is intense), nVidia will likely have to be a major customer, so this does make sense. I remain doubtful that Intel can pull it off, in which case it will have to come from someone else.
If you were a big enough customer, you could get a SKU for yourself, too. E.g. hyperscalers have Xeons which are not available to any other customers at any price.
But what they've completely resisted so far is any non-trivial modification.
They turned down Acorn about the 286, which led to Acorn creating the Arm, they have turned down various console makers, they turned down Apple on the iPhone, and so on. In all cases they thought the opportunities were beneath them.
Intel has always been too much about what they want to sell you, not what you need. That worked for them when the two aligned over backwards compat.
Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
Intel’s test for new business ideas has always been: will it make $1B in the first year?
It leads to mistakes like you mention, where a new market segment or new entrant is not a sure thing. And then it leads to mistakes like Larrabee and Optane where they talk themselves into overconfidence (“obviously this is a great product, we wouldn’t be doing it if it wasn’t guaranteed to make $1B in the first year”).
It is very hard to grow a business with zero risk appetite. You can’t take risky high return bets, and you can’t acknowledge the real risk in “safe” bets.
Larrabee could have grown into something very cool if they had not dropped it, and had instead made it available on the open market, donated it to universities, and so on. Transputer vibes.
I think for Larrabee it was Intel experimenting to find other markets for their Atom cores, and if there was a market for it they needed to have the tenacity to cultivate it. Similar to how nvidia took huge amounts of time establishing GPGPU, CUDA, then machine learning, through to reaping the rewards over the past few years.
2010-2011 was also the time that AMD were starting to moan a bit about DX11 and the higher level APIs not being sufficient to get the most out of GPUs, which led to Mantle/Vulkan/DX12 a few years down the road. Intel did a bit regarding massively parallel software rendering, with the flexibility to run on anything x86 and implement features as you liked, or AMD's efforts for 'fusion' (APU+GPU, after recently acquiring ATi) or HSA which I seem to recall was about dispatching different types of computing to the best suited processor(s) in the system for it. However I got the impression a lot of development effort is more interested in progressing on what they already have instead of starting in a new direction, and game studios want to ship finished and stable/predictable product, which is where support from intel would have helped.
It’s entirely possible that Larrabee could have been the platform for Transformers. Maybe, maybe not.
But certainly Intel wasn’t willing to wait for the market. Didn’t make $1 billion instantly; killed.
If Intel had a server SKU with fully integrated, competitive performance GPU cores that work with CUDA + unified memory, they’d sell billions worth in a day to the CSPs alone.
Sounds like they will someday soon.
There will always be giant, faraway GPU supercomputer clusters to train models. But the future of inference (where the model fits) is local to the CPU.
> will it make $1B in the first year?
It's typical corporate venturing and reporting to a CFO. Google is not much better with them cutting their small(er) projects.
Console makers only get trivial modifications. ASRock sold a cryptocurrency miner, the BC-250, with the PS5 APU, and it works just like any of their other APUs, albeit with limited driver support.
The BC250 does not use a PS5 APU; it uses another APU which has the same CPU core. By that measure, the Cell in the PS3 and the Xenon of the Xbox 360 were the same, or any AMD Jaguar device is a PS4.
This relates to the Intel problem because they see the world the way you just described, and completely failed to grasp the importance of SoC development where you are suddenly free to consider the world without the preexisting buses and peripherals of the PC universe and to imagine something better. CPU cores are a means to an end, and represent an ever shrinking part of modern systems.
There's almost no chance it isn't using rejected PS5 APU dies. It has fused off two of the eight CPU cores, as well as 12 of the 36 GPU compute units, but otherwise has the exact same specifications. The one customization Sony did get, the use of GDDR6 RAM, is still present. It also exhibits the same very short-lived mix of Zen 2 with RDNA 2 and has the same die size and aspect ratio.
> they have turned down various console makers
The problem is, console manufacturers know precisely how much of their product they anticipate to sell, and it's usually a lot. The PlayStation 5 is 80 million units so far.
And at that scale, the console manufacturers want to squeeze every vendor as hard as they can... and Intel didn't see the need to engage in a bidding war with AMD that would have given them a sizable revenue but very little profit margin compared to selling Xeon CPUs to hyperscalers where Intel has much more leverage to command higher prices and thus higher margins.
> they turned down Apple on the iPhone
Intel just was (and frankly, still is) unable to compete with ARM on the power envelope; that's why you never saw x86 take off on Android either, despite quite a few attempts at it.
Apple only chose to go with Intel for its MacBook line because PowerPC was practically dead and offered no way to extract more performance, and they dropped Intel as soon as their own CPUs were competitive. To get Intel CPUs to the same level of power efficiency that M-series CPUs have would require a full rework of the entire CPU infrastructure and external stack, and that would require money that even Intel at its best frankly did not have. And getting x86 power-effective enough for a phone? Just forget it.
> Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
Actually, that is surprising to me as well. NVIDIA's Tegra should easily be powerful enough to run the OS for training or inference workloads. If I were to guess, NVIDIA wants to avoid getting caught too hard on the "selling AI shovels" train.
> And at that scale, the console manufacturers want to squeeze every vendor as hard as they can... and Intel didn't see the need to engage in a bidding war with AMD that would have given them a sizable revenue but very little profit margin compared to selling Xeon CPUs to hyperscalers where Intel has much more leverage to command higher prices and thus higher margins.
And so that gave AMD an opening, and with that opening they got to experiment with designs, tailor a product, and gain experience and industrial market share, and they were able to continue to offer more and better products. Intel didn't just miss a mediocre business opportunity; they missed out on becoming a trusted partner for multiple generations, and they handed AMD market share that AMD used to become a better competitor.
> they handed AMD market share that AMD used to become a better competitor.
AMD isn't precisely a market competitor. The server and business compute market is still firmly Intel and there isn't much evidence of that changing unless Apple drops M series SoCs to the wide open market which Apple won't do. Intel could probably release a raging dumpster fire and still go strong, oh wait, that's what they've been doing the last few years.
AMD is only a competitor in the lower end of the market, a market Intel has zero issue handing to AMD outright - partially because a viable AMD keeps the antitrust enforcers from breathing down their neck, but more because it drags down per-unit profit margins to engage in consoles and the lower rungs and niches.
> The server and business compute market is still firmly Intel and there isn't much evidence of that changing
This is not true anymore, as it IS changing, and very rapidly. AMD has shot up to 27.3% of the server market share, which they haven't had since the Opteron days 20 years ago. Five years ago their server market share was very small single digits. They're half of desktops, too. https://www.pcguide.com/news/no-amd-and-intel-arent-50-50-in...
Apple did not want their x86 chips; they wanted their XScale stuff. Apple went to Intel to get chips, and the power envelope was appealing to Apple. Intel was the one to say no.
Right. But of course, Intel was busy spinning off their XScale business to Marvell. If they had seriously invested in it, they could have owned the coming mobile revolution.
They did push their UMPC x86 SoCs (Poulsbo and derivatives) hard to Sony, Nokia, etc. These were never competitive on heat or battery life.
> The PlayStation 5 is 80 million units so far.
80 million in 5 years is a nothing burger as far as volume.
Estimates are at 1M Xeons a month [1], so more PS5 units (and thus CPUs) have been sold to a single customer in the same timeframe than Xeon CPUs across all customers.
NVDA sold 153 million Tegra units to Nintendo in 8 years, so 1.5M units a month. That's just as comparable.
[1] https://www.servethehome.com/on-ice-lake-intel-xeon-volumes-...
Spot on about the AI/mobile parallel. Intel sat out the smartphone wave while pretending it didn’t matter, and now they’re scrambling not to miss the AI train
> Actually listening to people that want to buy your stuff, whatever next?
This is very likely the new culture that LBT is bringing in. This can only be good.
It was Intel culture at one time. When I started, everyone got a card to wear with your badge listing Intel's values; there were only six, and 'customer orientation' was one. It definitely influenced my personal development, but it was clearly not adopted equally across the company.
Intel is a strategically important company for the United States. This smells like a token investment to appease the US government. Not saying it’s bad, but very much looks like that.
Only the fab part is. Intel needs to separate the two. Maybe Nvidia, AMD, or Qualcomm can buy the fab part.
AMD sold off its foundries, why would they buy some again?
Why would any of these three be interested in buying a fab? The only other large player with its own fabs is Samsung, and Samsung has the same problem that Intel has, namely a fab that is nowhere near close to TSMC.
I agree that Intel would be better served to spin off its fab division, a potential buyer could be the US government for military and national security relevant projects.
Someone could be interested. It could also be Global Foundries. High risk big reward bet which the government is willing to help mitigate some of the risk with funding.
They just need to separate business units.
Not an expert in the area, but I think the highest of the high-end chips is a big market, but not the biggest market as revenue for fabs. It is just the most profitable part of the market.
Maybe this changed with the AI race but there are plenty of people buying older chips by the millions for all sorts of products.
The key to getting (financial) value out of fabs is their time after they are overtaken by the next node. The ability to keep the order book full after you have a better node is what pays off the fab. So it's all the other chips (the chips for cars, for low-power internet-connected devices, etc.) that make the fab profitable. That is where TSMC's ability to work with different customers lets them extract value from a fab in a way that pure-play CPU makers struggle with.
Being fabless is a huge strategic advantage for chip designers. Intel's biggest problem has been that they're stuck on shitty fabs. Nvidia, AMD, and Qualcomm do not want to be in that position.
Yeah, it definitely has that “optics-first” vibe to it.
I wonder what this means for Intel's Arc lineup. Would be a bit crazy to have privileged access to a competitor's roadmap through just owning a chunk of them. I also have to admit I really hope they dont cancel them. A triopoly is at least better than a duopoly (or realistically, a monopoly as AMD's competitiveness in gpus is pretty questionable)
It probably kills any prospect of Intel releasing the market-disrupting card that many were calling for: a 64GB or 92GB card with even middling performance for under $1k.
It's pretty clear AMD and Nvidia are gatekeeping memory so they can iterate over time and protect their datacenter cards.
Intel had a prime opportunity to blow this up.
That's what I think of, along with favour from their new investment sibling, the US government. AMD doesn't want to be super competitive, they like their margins and being second choice in a hypetastic market. Even though Arc has very low adoption, it was making signs of doing scrappy things, like enabling two 24GB GPUs on one card from third party vendors, which got the hobby/upstart community pretty excited. Ultimately it's not a real market giving the people what they want via competition, it's all contrived by politics and the biggest players.
imho the entire point of this for nvidia is to kill arc
Which is arguably kind of weird because where is it actually competing with NVIDIA? A hypothetical future, I guess?
But also, does this amount of ownership even give them the ability to kill anything on Intel's roadmap without broad shareholder consensus (not that that's even how roadmaps are handled anyway)?
Also, the US Govt bought $8.9B in stock last month I guess
https://www.intc.com/news-events/press-releases/detail/1748/...
Correction: they renegotiated a prior loan as an $8.9B stock purchase.
Does that mean Intel doesn't ever have to pay it back?
It means Intel already paid it back. The dilution hurt all other owners.
Nvidia sees the forest for the trees. The consequence of the US government buying a stake in Intel is that there will be Federal requirements for US companies to use Intel. This is entirely about the foundry business. Nvidia is at risk when 100% of the production of its intellectual property occurs in Taiwan. They're more interested than anyone else in diversifying their foundry options. Intel has just been a terrible partner and totally disregards its customers. It's only because of the new strategic need for the US to have a foundry business that the government is saving Intel. NVIDIA is understandably supportive of this.
Could AMD argue that Intel's licence agreement for x86-64 is at risk since it requires that Intel (and AMD) may not change hands?
I can think of _nothing_ with a better shot at unseating nvidia than a merger with intel. Fingers crossed for ever closer union between the two.
They aren’t merging - this is Nvidia ensuring their tech is in Intel chips.
Might rather see it the other way around - Nvidia getting license to create products with x86(_64) CPUs integrated in the silicon. Nvidia are the big boy in this transaction and they'll get what they want out of it. But I can see the attraction for Intel.
I don't think they can, as AFAIK the agreement for x86_64 is that Intel and AMD cannot change hands. AMD will surely fight this tooth and nail in the courts
But with the state of the courts today... who knows..
Yes indeed. It's still a step in that direction that opens up a bunch of communication channels between the execs of the two companies. Things move slowly.
You can't be serious.
Intel was well on its way to being a considerable threat to NVIDIA with their Arc line of GPUs, which are getting better and cheaper with each generation. Perhaps not in the enterprise and AI markets yet, but certainly on the consumer side.
This news muddies this approach, and I see it as a misstep for both Intel and for consumers. Intel is only helping NVIDIA, which puts them further away from unseating them than they were before.
Competition is always a net positive for consumers, while mergers are always a net negative. This news will only benefit shareholders of both companies, and Intel shareholders only in the short-term. In the long-term, it's making NVIDIA more powerful.
I'm not convinced. The latest Battlemage benchmarks I've seen put the B580 at the same performance as the RTX 4060 (a two-year-old entry-level card) but with 50% more power consumption (80W vs 125W average). It's good to have more than one graphics vendor that supports open source, but I don't think Nvidia is losing any sleep over Intel's GPU offerings.
Battlemage had the best perf/%, and most of the driver issues from Alchemist had been ironed out. Another generation or two of steady progress and Intel would have a big winner on their hands.
Intel's foundry costs are probably competitive with nvidia too - nvidia has too much opportunity cost if nothing else.
What is performance per percent?
It was a typo for $. The Arc B580 competed with the 4060 at $50 less MSRP and with 4GB more VRAM.
This is very short-sighted. The cards are improving, which can't really be said about AMD, the only other potential threat to Nvidia. It's also well known that Nvidia purposefully handicaps their consumer cards to avoid cannibalizing their enterprise cards. That means that the consumer market at least is not as efficient/optimal as it could be, so a competitor actually trying to compete (unlike AMD, apparently) should be able to do that without even having to out-innovate Nvidia or anything like that. Just get close on compute performance, but offer more VRAM or cheaper multi-gpu setups.
>cheaper multi-gpu setups
Nah, nobody cares about that. Even in their heyday, SLI and CrossFire barely made sense technologically. That market is basically non-existent. There are more people now wanting to run multiple GPUs for inference than there ever were who were interested in SLI, and those people can mix and match GPUs as they like.
The B580 was released in December 2024, and the 4060 in May 2023. So not quite a two year difference.
While it doesn't quite compete at performance and power consumption, it does at price/performance and overall value. It is a $250 card, compared to the $300 of the 4060 at launch. You can still get it at that price, if there's stock, while the 4060 hovers around $400 now. It's also a 12GB card vs the 8GB of the 4060.
So, sure, this is not competitive in the high-end segment, but it's remarkable what they've accomplished in just a few years, compared to the decades of head start that AMD and NVIDIA have on them. It's definitely not far-fetched to assume that the gap will only continue to close.
Besides, Intel is not only competing at GPUs, but APUs, and CPUs. Their APU products are more performant and efficient than AMD's (e.g. 140V vs 890M).
nvidia's margins are over 80% for datacenter products. If Intel can produce chips with enough VRAM and performance on par with nvidia from 2 years ago at 30% margins, they'd steal a lot of business, if they can figure out the CUDA side of things.
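To make that margin arithmetic concrete, here's a minimal Python sketch. The 80% and 30% margins come from the comment above; the $5,000 unit cost is a made-up placeholder, not a real manufacturing figure:

    # Gross-margin arithmetic: price = cost / (1 - margin).
    # Margins are from the parent comment; the unit cost is hypothetical.
    def price_at_margin(cost: float, margin: float) -> float:
        """Selling price that yields the given gross margin."""
        return cost / (1 - margin)

    unit_cost = 5_000  # hypothetical cost per accelerator, USD
    nvidia_price = price_at_margin(unit_cost, 0.80)  # 5.00x cost
    intel_price = price_at_margin(unit_cost, 0.30)   # ~1.43x cost
    print(f"80% margin price: ${nvidia_price:,.0f}")  # $25,000
    print(f"30% margin price: ${intel_price:,.0f}")   # ~$7,143
    print(f"Undercut factor:  {nvidia_price / intel_price:.1f}x")  # 3.5x

At equal unit cost, the same class of chip sold at a 30% margin would list at well under a third of the 80%-margin price, which is the whole pitch.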
I'm sure Larrabee will be superb any year now. The Xeon Phi will rise again. For supporting evidence, see the success of Aurora. Weren't the loss-leading Arc GPUs cancelled as well? Maybe that was only one generation of them; it does look like some are on the market now.
I think this partnership will damage nvidia. It might damage intel, but given they're circling the drain already, it's hard to make matters worse.
It's probably bad for consumers in every dimension.
Or to take the opposite, if nvidia rolled over intel and fired essentially everyone in the management chain and started trying to run the fabs themselves, good chance they'd turn the ship around and become even more powerful than they already are.
Has Nvidia ever run any fab successfully?
Nope. It will/would be a learning curve. They'd probably seed it with strategic hires from TSMC.
> It might damage intel, but given they're circling the drain already, it's hard to make matters worse.
How was Intel "circling the drain"?
They have a very competitive offering of CPUs, APUs, and GPUs, and the upcoming Panther Lake and Nova Lake architectures are very promising. Their products compete with AMD, NVIDIA, and ARM SoCs from the likes of Apple.
Intel may have been in a rut years ago, but they've recovered incredibly well.
This is why I'm puzzled by this decision, and as a consumer, I would rather use a fully Intel system than some bastardized version that also involves NVIDIA. We've seen how well that works with Optimus.
None of their products are competitive, they fired the CEO who was meant to save them, fired tens of thousands of their engineers, sold off massive chunks of the company, and they're still bleeding money and begging for state support?
Also, their network cards no longer work properly, which is deeply aggravating, as that used to be something I could rely on. I just bought some Realtek ones to work around the Intel ones falling over.
I have bad news about realtek networking...
Realtek is actually alright nowadays, even on Linux.
> None of their products are competitive
We must live in different universes, then.
Intel's 140V competes with and often outperforms AMD's 890M, at around half the power consumption.[1]
Intel's B580 competes with AMD's RX 7600 and NVIDIA's RTX 4060, at a fraction of the price of the 4060.[2]
They're not doing so well with desktop and laptop CPUs, although their Lunar Lake and Arrow Lake CPUs are still decent performers within their segments. The upcoming Panther Lake architecture is promising to improve this.
If these are not the signs of competitive products from a company far from "circling the drain", then I don't know what is.
FWIW, I'm not familiar with the health of their business, and what it takes to produce these products. But from a consumer's standpoint, Intel hasn't been this strong since... the early 00s?
[1]: https://www.notebookcheck.net/Radeon-890M-vs-Arc-140V_12524_...
[2]: https://www.notebookcheck.net/Intel-Arc-B580-Benchmarks-and-...
No way, man. Peak consumer Intel was from Core 2 up to Skylake-ish. That was when they started coasting and handed the market to AMD. Right now they're losing market share to them on mobile, desktop, and server. If we ignore servers, most PCs have an AMD CPU inside.
The GPUs might be competitive on price, but that's about it. It's pretty much a hardware open beta.
When your own most competitive products are being made by your competitor for you, while you still have the cost center of running your own production fabs incapable of producing your most competitive products, and receiving bailouts just to keep the lights on...
Some would say that's circling the drain.
Mergers where one company is on the verge of failing can be a net positive for consumers. Most obviously this happens when banks fail and people’s bank cards still work etc and at least initially the branches stay open.
Intel isn’t at that point, but the company’s trajectory isn’t looking good. I’d happily sacrifice ARC to keep a duopoly in CPUs.
The consumer gpu market is a rounding error compared to enterprise AI. And Intel is zero threat to Nvidia there.
> This news muddies this approach, and I see it as a misstep for both Intel and for consumers.
Consumers still have AMD as an alternative for very decent and attractively priced GPUs (and CPUs).
Not everybody wants GPUs for games or for AI.
AMD has always closely followed NVIDIA in crippling their cheap GPUs for any other applications.
After many years of continuously decreasing performance in "consumer" GPUs, only Intel, with its Battlemage GPUs, has offered FP64 performance comparable with what could easily be obtained 10 years ago but no longer today.
Therefore, if the Intel GPUs disappear, then the choices in GPUs will certainly become much more restricted than today. AMD has almost never attempted to compete with NVIDIA in features, but whenever NVIDIA dropped some feature, so did AMD.
The only consumer GPUs ten years ago that offered decent FP64 performance were the GTX TITAN series. And they were beasts! It's a shame nothing quite like them exists anymore. But they were the highest of high-end cards, certainly not that common or cheap.
I hope this isn't "shut up" money to end ARC GPU development. I have an A770, and I am very happy with it.
It's absolutely not; the ARC line is not a threat to nVidia in any way. It's to get its feet into the CPU market without the initial setup costs and research it would take to start from scratch.
They will be dominating AMD now on both fronts if things go smoothly for them.
This really wasn't a surprise; nVidia has seemed to be itching for a meaningful entry into the CPU market, and when intel's CEO started undoing any and all future investment in the company, it was clear everything was being set up for a sell-off.
5 billion is just a start, but this is a gift for nVidia to eventually acquire intel.
I think if Nvidia wanted to acquire Intel, they would acquire Intel.
Intel has never been so cheap relative to the kinds of IP assets that Nvidia values and probably will not be ever again if this and other investments keep it afloat.
Trump's FTC would not block it.
You write with proper case-sensitivity for their titles, which suggests some historic knowledge of the two. They have been very close partners on CPU+GPU for decades. This investment is not fundamentally changing that.
The current CEO is more like a CFO--cutting costs and eliminating waste. There are two exits from that: sell off, as you say, and re-investment in the products of most likely future profit. This could be a signal that the latter is the plan and that the competitive aspects of the nVidia-intel partnership will be sidelined for a while.
I'm guessing NVidia didn't do this by choice. Propping up Intel doesn't seem in their best interests, nor does it do their shareholders any favors by diluting their rapid growth.
There's some case for self-interest – propping up another fab etc. – but I do wonder how much of it is USG. (The economic case for Intel integrating Nvidia silicon on-chip doesn't make too much sense to me: there's no growth potential in commodity/consumer x86, and maybe they can shove their new integrated Nvidia in front of enterprise buyers, but I'd be dubious re: ROI.)
> I'm guessing NVidia didn't do this by choice. Propping up Intel doesn't seem in their best interests
In a top-down oligarchy, their best interests are served by focusing on the desires of the great leader, in contrast to a competitive bottom-up market economy, where they would focus on the desires of customers and shareholders.
There's a lot of concern in the comments here about what this means for ARC. The size of this investment, while large, isn't enough to warrant jeopardizing ARC, though. Intel has a responsibility to all shareholders, and diminishing ARC would be a bad move for overall shareholder value.
If Nvidia did try to exert any pressure to scrap ARC, that would be both a huge financial and geopolitical scandal. It's in the best interest of the US to support not only Intel's local manufacturing, but also its GPU tech.
Called it. I knew Nvidia had nowhere left to go, with that insanely high valuation, other than to start buying competitors and adjacent companies. I don't think this is the end, either.
> It is unclear if Intel will issue new stock for Nvidia to purchase
Erm, a rather important point to bury down the story. The first question on anyone’s lips will be: is this $5bn to build new chip technology, or $5bn for employees to spend on yachts?
It’s the most important part of the story. It’s so gross that companies can just dilute and create stock out of thin air like this. Why hold stock in Intel if the only people that ever buy the real stock and create buy pressure are the plebs? Here is the previous time…
> Intel stock experienced dilution because the U.S. government converted CHIPS Act grants into an equity stake, acquiring a significant ownership percentage at a discounted price, which increased the total number of outstanding shares and reduced existing shareholders' ownership percentage, according to The Motley Fool and Investing.com. This led to roughly 11% dilution for existing shareholders
> It’s so gross that companies can just dilute and create stock out of thin air like this.
To get money from the outside, you either have to take on debt or you have to give someone a share in the business. In this case, the board of directors concluded the latter is better. I don't understand why you think it is gross.
To get a share in the business, you can also just buy stock in the business like everyone else, not increasing the total share count or causing dilution. They chose not to do this because it would have been more expensive due to properly compensating existing shareholders. So it's spiritually just theft.
> To get a share in the business, you can also just buy stock
The business is looking for additional capital. You can only do that by either selling new shares or raising debt.
> in the business like everyone else, not increasing the total share count or causing dilution. They chose not to do this because it would have been more expensive due to properly compensating existing shareholders. So it's spiritually just theft.
Shareholder dilution isn't inherently theft. Specific circumstances, motivations, and terms of issuance have a bearing on whether the dilution is harmful or whether it is necessary for the business.
For instance, it can be harmful if: minority shareholders are oppressed, shares are issued at a deeply discounted price with no legitimate business need or to benefit insiders at the expense of other shareholders, or if the raised capital isn't used effectively to grow the company.
Dilution can be beneficial, such as when the raised capital is used for growth, employee compensation via employee stock options, etc.
> It’s so gross that companies can just dilute and create stock out of thin air like this.
Intel is up 30% pre market on this news so I think the existing shareholders will be fine.
i stole $100 from you, but you later won a scratch off for $300, so my stealing was ok
Exactly what Buffett put into Bank of America in 2011. Not sure what that means, but... data points:
https://markets.businessinsider.com/news/stocks/warren-buffe...
nVidia has also been licensing their GPU IP to MediaTek recently; MediaTek is working on a 2nd generation of an SoC that combines its ARM cores with nVidia GPUs, catering to e.g. the automotive market.
Looks like using GPU IP to take over other brands' product lines is now officially an nVidia strategy.
I guess the obvious worry here is whether Intel will continue development of their own dGPUs, which have a lovely open driver stack.
Unless Nvidia outright absorbs Intel, I think Intel would have to be kind of crazy to stop developing GPUs.
So long as the AI craze is hanging in there it feels like having that expertise and IP is going to have high potential upside.
I'd agree, but Intel has also halted dGPU development efforts before, cf. the canned Larrabee project, which was more troubled on the technology side, however.
Yeah, Larrabee was nowhere near what they have now with Intel Arc.
Would be foolish to throw that away now that they're finally getting closer to "a product someone may want to buy" with things like B50 and B60.
Seems Nvidia needs an alternative to MediaTek or wants to pressure MediaTek given the announcement of x86 Intel/Nvidia SoCs and the delay of DGX Spark, GB10 and N1X.
They wanted to launch DGX Spark in early summer and it's nowhere to be seen, while Strix Halo is shipping in 30+ SKUs from all major manufacturers.
Intel should never have existed in the first place. We should have gone the RISC route in the 80s and 90s and it took 30 years for the world to realize that with ARM. It’s like we’re continuously resuscitating a zombie to terrorize us, instead of just letting it die.
About 16 years ago, Intel was considered an ugly monopoly that Nvidia didn't like [0]. It seems as if they have switched sides now.
[0]: <https://www.fudzilla.com/6882-nvidia-continues-comic-campaig...>
they need domestic chip capabilities
After the ARM buyout fell through, I guess this is the next best thing. Plus, it's a good deal for Nvidia since Intel is pretty desperate at this point.
NVDA stock is up 3.5% so far today, which is ~$150B, or slightly more than one Intel. $5B is pocket change in comparison.
Except that Nvidia, Inc. doesn't own any of that NVDA stock, other people do, and it cannot access that money. What Nvidia, Inc. has is net profit, which is orders of magnitude less than market cap. Last year's net profit was just under $73 billion. ($5 billion is still very affordable, to be sure.)
> Except that Nvidia, Inc. doesn't own any of that NVDA stock, other people do, and it cannot access that money. What Nvidia, Inc. has is net profit, which is orders of magnitude less than market cap. Last year's net profit was just under $73 billion. ($5 billion is still very affordable, to be sure.)
Not all deals are made in cash; they can borrow money against their market cap.
I think you may have missed AMC and TSLA; for quite a while, their best-selling products and biggest revenue drivers were their stock. Reflexivity is a thing; NVDA could issue $5, $10, or $20B and I don't think the price would move very much in this market. (Note that could change tomorrow.)
The board could issue more stock, which would dilute existing shareholders by about 0.1% and (in theory) cause the stock price to drop by 0.1%.
But yeah, it's probably easier to just use cash on hand.
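For what it's worth, a back-of-envelope sketch of that ~0.1% figure. NVDA's share count and share price here are rough approximations, not exact figures; only the $5B comes from the deal:

    # Dilution from issuing new shares to fund a $5B purchase.
    shares_outstanding = 24.4e9  # approx. NVDA shares outstanding
    share_price = 175.0          # approx. NVDA share price, USD
    raise_amount = 5e9           # the $5B stake from the article

    new_shares = raise_amount / share_price
    dilution = new_shares / (shares_outstanding + new_shares)
    print(f"New shares issued: {new_shares / 1e6:.1f}M")    # ~28.6M
    print(f"Dilution of existing holders: {dilution:.2%}")  # ~0.12%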
I'm mixed on this, only because when they've done similar hybrid chips with AMD GPUs in the past the support has been poor and dropped off rather quickly.
A weird kind of full-circle moment: Intel used to laugh off Nvidia, then tried Kaby Lake-G with AMD (RIP), and now they're handing over CPU real estate to the company that wiped the floor with their own GPU efforts
People have been talking about the concentration of risk in popular indices, and now large caps are buying stakes in each other? Intel's stock is up 27% today...
So AMD got ATI, and now NVidia gets Intel.
Difference is, AMD wasn't a competitor for ATi. One mostly built CPUs, while the other built GPUs. These two, on the other hand, are competing in several major product categories. Overall, not a good look.
I doubt we'll be seeing Dell sell NVidia ARM CPUs anytime soon.
However, I do imagine Intel GPUs, which were never great to start with, might be doomed long term.
Another possibility is that this is the end of oneAPI, which I doubt many people would care about, given how many rebrands SYCL has already gone through.
>One mostly built CPUs, while the other built GPUs.
I mean that also applies to Intel and Nvidia. Intel does make GPUs but their market impact is basically zero.
Fitting. Back then, you used to bolt a GPU onto a computer. These days you bolt a computer onto a GPU.
It feels like the end is in sight for dedicated graphics chips in consumer devices. Phones, consoles, and now Apple silicon are proving that SoC designs with unified memory and focused thermals are a winning strategy for efficiency and speed. Nvidia may be happy enough to move the graphics strategy onto an SoC and keep discrete boards just for AI.
Yes for efficiency, not for speed.
Yet here I am just frothing for GPUs
Great news for all involved. It also would seem to validate Apple’s unified architecture for inference, and imply AMD is getting close…
You mean AMD's unified architecture. They were a founder of the HSA Foundation that drove innovation in this space complete with Linux kernel investments and unified compute SDKs, and they had the first shipping hardware support.
This is a strong take.
AMD's actual commitment to open innovation over the past ~20 years has been game changing in a lot of segments. It is the aspect of AMD that makes it so much more appealing than intel from a hacker/consumer perspective.
No way this doesn't get blocked by antitrust. This will make them way too large, and Intel is already selling itself off (the US govt bought $10B worth a couple weeks ago).
The government is the one pushing this deal. They won't block it.
> No way this doesn't get blocked by antitrust
That action may cease to exist soon, especially after Vance is POTUS and the courts stacked with Peter Thiel loyalists that back his vision of anti-competition. Bet on it.
USG's far more interested in (scare quotes intentional) "AI dominance", and pushing American tech to fuse to do so – damn the consumer.
who's running antitrust these days
Almost 15% of Intel is now owned by the US govt. and NVidia combined.
AMD is much stronger in unified memory architectures than Nvidia at this point. It kinda makes sense, with the AI push.
I wonder what this means for the ARC line of GPUs?
Explains why lspci will be showing integrated NVIDIA graphics.
Bit of a bearish market, given $1bn is equivalent to a 1% stake in one of the largest chip manufacturers on the face of the earth.
SemiAccurate reported several months ago that NVidia had been dipping its toes into manufacturing its products using Intel's fabs; I'd assume that's related.
I would say it's basically a bailout.
Probably government-mandated.
Microsoft bailing out Apple vibes.
This time around, Nvidia should HODL the stock.
Apple is so glad to be out of the Intel hellhole.
I'm very pessimistic about this. Goodbye to those nice, budget-friendly intel GPUs. nGreedia is going to continue selling 8 gig cards to consumers forever.
Smart move for Nvidia, as AMD is the true competitor; continuing to use AMD's CPUs would just help build up a competitor fast. This also helps Intel figure out its foundry business, which might work someday - and that benefits Nvidia too, since right now its only choice is TSMC.
What's the significance of $5B of stock? Does that mean controlling share in Intel?
It’s a corporate engagement ring.
Article mentions it amounts to ~5% ownership
I would assume not given their market cap.
It's written in the article that the $5B represents about 5% of Intel stock outstanding.
No, but it's still a big stake from the largest player in semis. You wouldn't expect a move like that if they didn't see an opportunity there.
Seems like when Microsoft invested in Apple to keep Apple from going out of business, which would have turned Microsoft into a potential monopoly.
This has been an interesting 1.5 months for Intel on all fronts. I wonder how long this deal was in the making, since the timing is impeccable, looking at the current administration's involvement with Intel.
> Nvidia announced that it will buy $5 billion in Intel common stock at $23.28 per share, representing a roughly 5% ownership stake in Intel. (Intel stock is now up 33% in premarket trading.)
Why/how is INTC up around 30% in premarket, from $24.90 to $32, when Nvidia is buying the stock at $23.28? Who is selling the stock?
I suppose the Intel board decided this? Why did they sell under the current market price? Didn't the Intel board have a fiduciary duty to get as good a price from Nvidia as possible? If Nvidia buying stock moves it up so much, it seems like a bad deal to sell the stock for so little.
It's typical in these situations that the price per share is negotiated, with the current SP as a starting point. It's fairly unusual, I think, for the company selling stock to get a price significantly higher than the market price. It's more typical that there's a slight discount. At least that's been the case for every stock I owned where dilution has occurred. We also don't know yet when exactly this deal was negotiated and approved, so it's hard to actually say. Considering where INTC has been very recently (below $20), $23.28 seems very reasonable to me.
The reason the stock surged up past $30 is the general market's reaction to the news, and subsequent buying pressure, not the stock transaction itself. It seems likely that once the exuberance cools down, the SP will pull back, where to I can't say. Somewhere between $25 and $30 would be my bet, but this is not financial advice, I'm just spitballing here.
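To put numbers on the "roughly 5%" stake: the $5B and $23.28 are from the announcement, while Intel's ~4.4B pre-deal share count is an approximation, so treat this as a sketch:

    # Implied ownership from the announced terms.
    investment = 5e9          # USD, from the announcement
    price_per_share = 23.28   # negotiated issue price, USD
    shares_before = 4.4e9     # approx. Intel shares pre-issuance

    new_shares = investment / price_per_share
    stake = new_shares / (shares_before + new_shares)
    print(f"Shares issued to Nvidia: {new_shares / 1e6:.0f}M")  # ~215M
    print(f"Implied stake: {stake:.1%}")  # ~4.7%, i.e. "roughly 5%"

So the reported figure is consistent with new shares being issued to Nvidia at the negotiated price rather than bought on the open market.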
Are we getting a new iteration of sub-$200 mini PCs with an RTX chiplet?! That would be an amazing replacement for my N100!
Looks like I sold my Intel too soon!
Grandma smiling from heaven
This is the start of a state orchestrated AI war economy.
Man, talk about tables turning.
Nowadays I always wonder to what extent such deals are actually driven by market considerations, and to what extent it's catering to the Trump administration. Token investments in this state enterprise named Intel seem to be a practical way to curry goodwill with the autocrats.
best news i've heard in days
The enemy of my enemy is my friend
I know AMD used to be lacking but these days I guess they're probably the go-to on Linux because they share changes with the community
I don't like the idea of using Intel given their lack of disclosure for Spectre/Meltdown and some of their practices (towards AMD)
How to buy out competition, kill their product offerings, and further your own market dominance without improving your own product.
Why is there no antitrust involvement? I think this can affect AMD.
No idea what to think of this. I don't want Intel to die, but what will this do to the GPU business they're competing with NVIDIA on? And at worst, this leads to even more consolidation.
I don't trust Nvidia not to be doing this for nefarious purposes.
Wasn't Nvidia working on their own CPU design? Will they drop that?
They're shipping ARM-derived CPUs and have been for years.
They also use RISC-V cores throughout their products
Intel will soon say, "Et tu, Brute?"
Nana can stop spinning in her grave now; I think he just broke even...
Give up 0.1% of shares to get 5% of Intel.
Seems to be an easy bet, if for no other reason than to make the US Government (Trump) happy. Trump gets to tout his +30% return on investment.
NVIDIA is Jensen Huang's life, and he is probably the best CEO in the USA. But he should be careful. Possible shareholder lawsuits come with discovery. NVIDIA's sales to CoreWeave, for example, a company they hold shares in, are starting to look a lot like self-dealing.
Also, since this Intel deal makes no sense for NVIDIA, a good observer would notice that lately he seems to spend more time on Air Force One than with NVIDIA teams. The leak of any evidence showing this was an investment ordered by the White House would make his company a hostage to future demands from the current corrupt administration. The timing is already incredibly suspicious.
We will know for sure he has become a hostage if the next NVIDIA investment is in World Liberty Financial.
"Anatomy of Two Giant Deals: The U.A.E. Got Chips. The Trump Team Got Crypto Riches." - https://www.nytimes.com/2025/09/15/us/politics/trump-uae-chi...
The entire AI ecosystem that is being built out looks very suspect, frankly.
I'm taking this investment as a validation of the competitiveness of AMD's APUs & Apple's Silicon.
This is the first step that Nvidia takes to devour Intel.
INTC is a strategically important company. They won't be allowed to fail. Of course, that doesn't mean the stock is a good investment. During the GFC, all the equity holders were wiped out while all the bond holders got all their money back. Figure that one out.
That's quite literally why bonds are bonds and equity is equity...
Perhaps, but you do understand that the probability of every tranche, regardless of seniority, getting paid in full and the equity getting nothing is zero. It's mathematically impossible to pin the waterfall like this.
With good enough lawyers, "mathematically impossible" is practically relative. Assume the game is rigged and play accordingly. If it doesn't make sense, it makes sense.
So that's probably it for the dedicated Intel GPUs. :/
Precursor to a full acquisition, perhaps... also maybe Jensen playing to Trump a bit in this.
If there’s ever a time to do it, now would be the time, given how the current administration looks at regulatory blowback.
If you wanted to acquire Intel you'd do it now. Maybe Intel's future products are garbage and they do worse - but the upside seems pretty high otherwise. This seems like a bit of a firesale price to acquire an advanced fab and CPU maker. Sure, it's Intel and they haven't been doing great, but companies with solid reliable outlooks don't trade this cheaply.
Ofc I would kind of hope/expect antitrust to object, given that Intel makes both GPUs and CPUs, and Nvidia has dipped its toes into CPU production as well.
> If you wanted to acquire Intel you'd do it now.
Intel still has to go through a lot of reorg (i.e. massive cuts) to get to a happy place, and this is what their succession of CEOs has been procrastinating over.
I recall reading a reddit comment (a resounding source, I know) that claimed the reason Intel's e-cores are crushing it is that they actually synthesise them, while the P-cores are a bunch of bespoke circuits bodged together.
One wonders just how bad things must have been internally for that to be the state of one of their core IPs in this day and age...
Judging by my LinkedIn feed, the "lot of reorg" is underway.
Bluntly, Intel has corporate cancer, and it requires removing the actual cancers, not a sort of 20% haircut.
It would be 100% Trump to have Nvidia buy Intel and then announce how good of an investment decision he made by buying a slice of intel.
USA, where the federal government is picking winners and losers by making risky stock bets with public money.
Not even the government at this point. The oligarchs are now in full control of the US and are dividing up their kingdoms. The plans for gulags for detractors are also being drawn up.
https://news.ycombinator.com/item?id=45289785 - one of the comments in the linked NYT article suggests we all read Eugenia Ginsburg’s GULAG story as preparation.
> The plans for gulags for detractors are also being drawn up.
This is needlessly divisive and devoid of any factual basis. No gulags will exist and you know it.
> This is needlessly divisive and devoid of any factual basis. No gulags will exist and you know it.
What about "Alligator Alcatraz", that has been called "concentration camp" [1] (so comparable with a gulag), or where the Korean detainees from the raid on the Hyundai/LG plant ended up, alleging utterly horrible conditions [2]? And there's bound to be more places like the latter, that was most likely just the tip of the iceberg and we only know about the conditions there because the South Korean government raised a huge stink and got the workers out of there.
Okay, Alcatraz 2.0 did get suspended in August to my knowledge, but that's only temporary. It's bound to get the legal issues cleaned up and then be re-opened - or the case makes its way through to the Supreme Court with the same result to be expected.
[1] https://newrepublic.com/article/197508/alligator-alcatraz-tr...
[2] https://www.bbc.com/news/articles/c07v1j98ydvo
Those people aren't American citizens, the comparison doesn't fit.
They didn't get due process, so there's no way to be sure that American citizens aren't getting sent there.
Of course they're making that distinction, but I'll accept your tacit agreement that it's OK as long as they're non-citizens.
I do not agree with that. In some cases it is acceptable to detain non-citizens for immigration-related offenses, but only if they receive due process to establish that they indeed should be detained.
Any denial of due process to any person is a gross violation of our most important right. Without the guarantee of due process to everyone, no one has any rights because those in power can violate rights at a whim.
There have been reported cases where ICE just ignored people's legal residence status, or snatched up citizens who didn't have paperwork on them, just for "walking while black".
ICE doesn't reliably make any distinction, not since they hired thugs off of the streets and issued arrest quotas. Doesn't matter if the arrested have to be released later on.
Capitalism, all at the hands of just a bunch of people.
$5B is a fairly tiny stake (Intel's market cap is around $120B). Other than the "we're now working together" signal, why is this news?
In terms of voting stock, they become the biggest owner after US Commerce Department.
As customer they get better access to Intel Foundry and can offload some capacity from TSMC.
> In terms of voting stock, they become the biggest owner after US Commerce Department.
As I understand it the government's shares are non-voting.
The U.S. government won’t have a seat on the board and agreed to vote with Intel’s board on matters requiring shareholder approval “with limited exceptions.”
Tbf, if I were Nvidia and antitrust weren't an issue, I'd be tempted to buy the whole thing.
Intel has a market cap just 2.5% of NVDA's, so you could give away just 2.5% of your stock to buy the entirety of Intel. It's bonkers.
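A rough sanity check of that ratio, with ballpark market caps (both figures are approximations and move daily):

    # All-stock buyout arithmetic: fraction of NVDA needed to cover INTC.
    nvda_mcap = 4.3e12  # approx. Nvidia market cap, USD (ballpark)
    intc_mcap = 1.1e11  # approx. Intel market cap, USD (ballpark)
    print(f"NVDA stock needed: {intc_mcap / nvda_mcap:.1%}")  # ~2.6%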
If that happened I would expect the same success story as with Boeing-McDonnell Douglas merger.
Why? That is an example of a bad engineering company being acquired and then poisoning the quality of the acquirer with its toxic, low-quality, corporate-politics-above-engineering culture.
There have been a lot of mergers where that has not happened.
Doubtful. The GPUs are usually securely mounted, and there is no chance of them ramming themselves into the ground at Mach speed.
There are two scenarios here. In one, the AI bubble bursts (so Nvidia is overpriced now) and almost any value stock deal is good for them. In the other, it doesn't, and this gives them a limited hedge against problems with their most critical strategic partner (TSMC).
It looks like a good deal either way and in any amount. But of course I am no expert.
I suppose the problem is Intel doesn't actually have the fab capacity anyway. They were building it, but that's all on ice now, and probably wasn't close to TSMC anyway, I'd guess.
This all ignores the near complete lack of product out of their advanced processes as well.
Isn't 5% a somewhat significant chunk? I really wouldn't call it a tiny one. Maybe not even a small one anymore.
This is a technology forum first and foremost. I know it might not look that way given the recent flood of political activism articles. But, in the technology field, this is pretty big news. This stake makes Nvidia one of Intel's biggest shareholders.
There is a good chance this was required by politicians, and is therefore political activism.
:-)
It's a good deal for Nvidia, because custom x86 server CPUs have optimization potential for AI computing clusters, which matters now that Nvidia has competitors that it didn't have just 2 years ago. I think that the next several years for Nvidia will be ones of fending off growing competition.
They basically baked in a massive investment profit into the deal. When you factor in the stock jump since this announcement, Nvidia has already made billions.
Who are NVidia's competitors? I thought they were the only game in town when it came to CUDA/AI chips.
AMD, Broadcom, Huawei, etc
Market cap was closer to 90B before this deal was announced
Strategically this is good for the US and the West. Intel needs to survive because they have the only advanced fabs that aren't within reach of China.
But as a consumer, I hate this. Intel APUs have become quite good and are great for Linux users. I don't want Nvidia's bullshit infecting them. Jensen wants to be the Apple of chips, and we'll all be worse off if Nvidia SoCs become ubiquitous.
So the USA now owns 10% of Intel.
Did we make $500 Million off this?
Do we own 10% of Nvidia too? Or is that coming soon?
Hard to keep up with the now acceptable socialism
Any chance this move gives $INTC some legs long term?