>Overall the Intel Arc Pro B50 was at 1.47x the performance of the NVIDIA RTX A1000 across that mix of OpenGL, Vulkan, and OpenCL/Vulkan compute workloads in both synthetic and real-world tests. That is just under Intel's own reported Windows figures of the Arc Pro B50 delivering 1.6x the performance of the RTX A1000 for graphics and 1.7x the performance of the A1000 for AI inference. This is all the more impressive when considering the Arc Pro B50's price of $349+ compared to the NVIDIA RTX A1000 at $420+.
I guess it's a boon for Intel that NVidia repeatedly shoots their own workstation GPUs in the foot...
Such an appliance could plug into literally any modern computer — even a laptop or NUC. (And for inference, "running on an eGPU connected via Thunderbolt to a laptop" would actually work quite well; inference doesn't require much CPU, nor have tight latency constraints on the CPU<->GPU path; you mostly just need enough arbitrary-latency RAM<->VRAM DMA bandwidth to stream the model weights.)
(And yeah, maybe your workstation doesn't have Thunderbolt, because motherboard vendors are lame — but then you just need a Thunderbolt PCIe card, which is guaranteed to fit more easily into your workstation chassis than a GPU would!)
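To put rough numbers on that (every figure below is an illustrative assumption, not a benchmark of any particular card): the Thunderbolt link mostly costs you a one-time weight upload, while steady-state decode reads weights from VRAM at full speed.

```python
# Back-of-envelope for eGPU inference; all numbers are illustrative
# assumptions, not measurements.
MODEL_GB = 8.0      # e.g. an 8B-parameter model at 8-bit quantization
TB_GBPS = 3.0       # realistic PCIe throughput over Thunderbolt 3/4
VRAM_GBPS = 224.0   # ballpark VRAM bandwidth for a B50-class card

# One-time cost: DMA the weights across the cable into VRAM.
load_seconds = MODEL_GB / TB_GBPS

# Steady-state decode: each token reads (roughly) every weight from VRAM,
# so the Thunderbolt link is no longer the bottleneck.
decode_tok_s = VRAM_GBPS / MODEL_GB

print(f"weight upload: {load_seconds:.1f}s, decode: {decode_tok_s:.0f} tok/s")
```

So under these assumptions you pay a few seconds at model load and essentially nothing per token afterwards.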
With 16GB everybody will just call it another in the long list of Intel failures.
My first software job was at a place doing municipal architecture. The modelers had and needed high-end GPUs in addition to the render farm, but plenty of roles at the company simply needed anything better than what the Intel integrated graphics of the time could manage in order to open the large, detailed models.
In these roles the types of work would include things like seeing where every pipe, wire, and plenum for a specific utility or service was in order to plan work between a central plant and a specific room. Stuff like that doesn’t need high amounts of VRAM since streaming textures in worked fine. A little lag never hurt anyone here as the software would simply drop detail until it caught up. Everything was pre-rendered so it didn’t need large amounts of power to display things. What did matter was having the grunt to handle a lot of content and do it across three to six displays.
Today I’m guessing the integrated chips could handle it fine but even my 13900K’s GPU only does DisplayPort 1.4 and up to only three displays on my motherboard. It should do four but it’s up to the ODMs at that point.
For a while Matrox owned a great big slice of this space but eventually everyone fell to the wayside except NVidia and AMD.
I don't get why there are people trying to twist this story or come up with strawmen like the A2000 or even the RTX 5000 series. Intel is coming into this market competitively, which as far as I know is a first, and it's also impressive.
Coming into the gaming GPU market had always been too ambitious a goal for Intel, they should have started with competing in the professional GPU market. It's well known that Nvidia and AMD have always been price gouging this market so it's fairly easy to enter it competitively.
If they can enter this market successfully and then work their way up the food chain, that seems like a good way to recover from their initial fiasco.
Toss a 5060 Ti into the comparison table, and we're on an entirely different playing field.
There are reasons to buy the workstation NVidia cards over the consumer ones, but those mostly go away when looking at something like the new Intel. Unless one is in an exceptionally power-constrained environment, yet has room for a full-sized card (not SFF or laptop), I can't see a time the B50 would even be in the running against a 5060 Ti, 4060 Ti, or even 3060 Ti.
We could just as well compare it to the slightly more capable RTX A2000, which was released more than 4 years ago. Either way, Intel is competing with the EoL Ampere architecture.
There are huge markets that don't care about SOTA performance metrics but need to get a job done.
That's a bold claim when their acceleration software (IPEX) is barely maintained and incompatible with most inference stacks, and their Vulkan driver is far behind it in performance.
At 16GB I'd still prefer to pay a premium for NVidia GPUs given its superior ecosystem, I really want to get off NVidia but Intel/AMD isn't giving me any reason to.
PS5 has something like 16GB unified RAM, and no game is going to really push much beyond that in VRAM use, we don’t really get Crysis style system crushers anymore.
This isn't really true on the recreational card side; nVidia themselves are reducing the number of 8GB models, a sign of market demand [1]. Games these days regularly max out 6 and 8 GB when running anything above 1080p at 60fps.
The recent prevalence of Unreal Engine 5 titles with poor optimization for weaker hardware is also causing games to be released basically unplayable for most.
For recreational use the sentiment is that 8GB is scraping the bottom of the requirements. Again, this is partly due to bad optimization, but games are also being played at higher resolutions, which requires more memory for larger textures.
[1] https://videocardz.com/newz/nvidia-reportedly-reduces-supply...
A $500 32GB consumer GPU is an obvious best seller.
So let's call it what it is: they don't want to cannibalize their higher-end GPUs.
We're already seeing competitors to AWS targeting only models like Qwen, DeepSeek, etc.
There are enterprise customers who have compliance requirements and literally want AI, but cannot use any of the top models because everything has to run on their own infrastructure.
That's pretty funny considering that PC games are moving more towards 32GB RAM and 8GB+ VRAM. The next generation of consoles will of course increase to make room for higher quality assets.
You're wrong. It's probably more like 9 HN posters.
They also announced a 24 GB B60 and a double-GPU version of the same (saves you physical slots), but it seems like they don't have a release date yet (?).
https://www.asrock.com/Graphics-Card/Intel/Intel%20Arc%20Pro...
This to me is the gamer perspective. This segment really does not need even 32GB, let alone 64GB or more.
How so? The prosumer local AI market is quite large and growing every day, and is much more lucrative per capita than the gamer market.
Gamers are an afterthought for GPU manufacturers. NVIDIA has been neglecting the segment for years, and is now much more focused on enterprise and AI workloads. Gamers get marginal performance bumps each generation, and side effect benefits from their AI R&D (DLSS, etc.). The exorbitant prices and performance per dollar are clear indications of this. It's plain extortion, and the worst part is that gamers accepted that paying $1000+ for a GPU is perfectly reasonable.
> This segment really does not need even 32GB, let alone 64GB or more.
4K is becoming a standard resolution, and 16GB is not enough for it. 24GB should be the minimum, with 32GB for some headroom. While it's true that 64GB is overkill for gaming, it would be nice if that were accessible at reasonable prices. After all, GPUs are not exclusively for gaming, and we might want to run other workloads on them from time to time.
While I can imagine that VRAM manufacturing costs are much higher than DRAM costs, it's not unreasonable to conclude that NVIDIA, possibly in cahoots with AMD, has been artificially controlling the prices. While hardware has always become cheaper and more powerful over time, for some reason, GPUs buck that trend, and old GPUs somehow appreciate over time. Weird, huh. This can't be explained away as post-pandemic tax and chip shortages anymore.
Frankly, I would like some government body to investigate this industry, assuming they haven't been bought out yet. Label me a conspiracy theorist if you wish, but there is precedent for this behavior in many industries.
The number of chips per 32-bit memory channel is usually low (one, or two in clamshell mode), so GPUs have to scale out their memory bus width to reach higher capacities. That's expensive and takes up die space, and for the conventional case (games) isn't generally needed on low-end cards.
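As a sketch of that relationship (the chip density is a simplifying assumption; 16Gbit, i.e. 2GB, GDDR6 dies are the common case):

```python
def max_vram_gb(bus_bits: int, chip_gb: int = 2, clamshell: bool = False) -> int:
    """Capacity implied by bus width: GDDR6 hangs one chip off each
    32-bit channel (two in 'clamshell' mode), so more capacity means
    either denser chips or a wider -- and pricier -- die."""
    channels = bus_bits // 32
    return channels * chip_gb * (2 if clamshell else 1)

print(max_vram_gb(128))                  # 8 GB: typical low-end bus
print(max_vram_gb(128, clamshell=True))  # 16 GB: same bus, doubled chips
print(max_vram_gb(384))                  # 24 GB: needs a wide high-end bus
```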
What really needs to happen is someone needs to make some "system seller" game that is incredibly popular and requires like 48GB of memory on the GPU to build demand. But then you have a chicken/egg problem.
Example: https://wccftech.com/nvidia-geforce-rtx-5090-128-gb-memory-g...
Why not just buy 3 cards then? These cards don't require active cooling anyway, and you can fit 3 in a decent-sized case. You'd get 3x the VRAM bandwidth and 3x the compute. And if your use case is LLM inference, it will be a lot faster than one card with 3x the VRAM.
> 3x VRAM speed and 3x compute
LLM scaling doesn’t work this way. If you have 4 cards, you may get 2x performance increase if you use vLLM. But you’ll also need enough VRAM to run FP8. 3 cards would only run at 1x performance.
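The reason odd card counts fall off a cliff is tensor parallelism: the attention heads have to split evenly across the GPUs, which is also why vLLM rejects a tensor-parallel size that doesn't divide the head count. A toy illustration (the head count is a hypothetical example):

```python
def usable_tp_sizes(num_attention_heads: int, max_gpus: int) -> list[int]:
    """GPU counts that can evenly shard every attention layer; any other
    count forces you down to a smaller parallel group (or a single GPU)."""
    return [n for n in range(1, max_gpus + 1) if num_attention_heads % n == 0]

# 32 heads, as on many ~7-8B models (illustrative assumption):
print(usable_tp_sizes(32, 4))  # [1, 2, 4] -- three cards can't split 32 heads
```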
AMD has lagged so long because of the software ecosystem but the climate now is that they'd only need to support a couple popular model architectures to immediately grab a lot of business. The failure to do so is inexplicable.
I expect we will eventually learn that this was about yet another instance of anti-competitive collusion.
Why would you bother with any Intel product with an attitude like that? It gives zero confidence in the company. What business is Intel in, if not competing with Nvidia and AMD? Is it giving up competing with AMD too?
The new CEO of Intel has said that Intel is giving up competing with Nvidia.
No, he said they're giving up competing against Nvidia in training. Instead, he said Intel will focus on inference. That's the correct call in my opinion. Training is far more complex and will soon span multiple data centers. Intel is too far behind. Inference is much simpler and likely a bigger market going forward.
That's how you get things like good software support in AI frameworks.
Foundry business. The latest report on discrete graphics market share has Nvidia at 94%, AMD at 6%, and Intel at 0%.
I may still have another 12 months to go. But in 2016 I made a bet against Intel engineers on Twitter and offline, suggesting GPUs are not a business they want to be in, or at least that they were too late. They said at the time they would get 20% market share minimum by 2021. I said I would be happy if they got even 20% by 2026.
Intel is also losing money, and they need cashflow to compete in the foundry business. I have long argued they should have cut off the GPU segment when Pat Gelsinger arrived; it turns out Intel bound itself to GPUs through all the government contracts and supercomputers they promised to deliver. Now that they have delivered all or most of that, they will need to think about whether to continue or not.
Unfortunately, unless the US points guns at TSMC, I just don't see how Intel will be able to compete, as Intel needs a leading-edge position in order to command the margins required for Intel to function. Right now, in terms of density, Intel 18A is closer to TSMC N3 than N2.
If NVidia gets as complacent as Intel became when it had the market share in the CPU space, there is opportunity for Intel, AMD, and others in NVidia's margins.
They may not have to, frankly, depending on when China decides to move on Taiwan. It's useless to speculate—but it was certainly a hell of a gamble to open a SOTA (or close to it—4 nm is nothing to sneeze at) fab outside of the island.
I want hardware that I can afford and own, not AI/datacenter crap that is useless to me.
1. https://youtu.be/iM58i3prTIU?si=JnErLQSHpxU-DlPP&t=225
2. https://www.intel.com/content/www/us/en/developer/articles/t...
I like to Buy American when I can but it's hard to find out which fabs various CPUs and GPUs are made in. I read Kingston does some RAM here and Crucial some SSDs. Maybe the silicon is fabbed here but everything I found is "assembled in Taiwan", which made me feel like I should get my dream machine sooner rather than later
There really is no such thing as "buying American" in the computer hardware industry unless you are talking about the designs rather than the assembly. There are also critical parts of the lithography process that depend on US technology, which is why the US is able to enforce certain sanctions (and due to some alliances with other countries that own the other parts of the process).
Personally I think people get way too worked up about being protectionist when it comes to global trade. We all want to buy our own country's products over others but we definitely wouldn't like it if other countries stopped buying our exported products.
When Apple sells an iPhone in China (and they sure buy a lot of them), Apple is making most of the money in that transaction by a large margin, and in turn so are you since your 401k is probably full of Apple stock, and so are the 60+% of Americans who invest in the stock market. A typical iPhone user will give Apple more money in profit from services than the profit from the sale of the actual device. The value is really not in the hardware assembly.
In the case of electronics products like this, almost the entire value add is in the design of the chip and the software running on it, which represents all the high-wage work, and a whole lot of that labor is in the US.
US citizens really shouldn't envy a job where people sit at an electronics bench doing repetitive assembly work for 12 hours a day in a factory, wishing we had more of those jobs in our country. They should instead focus on making higher education more available and affordable, so that the country stays on top of the economic food chain with most of its citizens doing high-value work, rather than keeping education expensive and begging foreign manufacturers to open satellite factories to employ our uneducated masses.
I think the current wave of populist protectionist ideology is essentially blaming the wrong causes of declining affordability and increasing inequality for the working class. Essentially, people think that bringing the manufacturing jobs back and reversing globalism will right the ship on income inequality, but the reality is that the reason equality was so good for Americans in the mid-century was that the wealthy were taxed heavily, European manufacturing was decimated in WW2, and labor was in high demand.
The above of course is all my opinion on the situation, and a rather long tangent.
EDIT: I did think about what the closest thing to artisan silicon would be, and thought of the POWER9 CPUs; it turns out those are made in the USA. The Talos II is also manufactured in the US, with the IBM POWER9 processors fabbed in New York, while the Raptor motherboards are manufactured in Texas, where their systems are assembled as well.
I have a service that runs continuously and reencodes any videos I have into h265 and the iGPU barely even notices it.
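A minimal sketch of such a re-encode step, assuming ffmpeg built with Intel Quick Sync support (the `hevc_qsv` encoder); the library directory, quality value, and file naming here are placeholders:

```python
import subprocess
from pathlib import Path

def hevc_qsv_cmd(src: Path, dst: Path, quality: int = 25) -> list[str]:
    """Build an ffmpeg command that decodes and encodes on the iGPU.
    -global_quality drives quality-based rate control for the *_qsv encoders."""
    return [
        "ffmpeg", "-y",
        "-hwaccel", "qsv",                  # hardware decode where possible
        "-i", str(src),
        "-c:v", "hevc_qsv",                 # Quick Sync HEVC/h265 encode
        "-global_quality", str(quality),
        "-c:a", "copy",                     # leave audio untouched
        str(dst),
    ]

if __name__ == "__main__":
    videos = Path("videos")                 # placeholder library path
    if videos.is_dir():
        for src in videos.glob("*.mp4"):
            subprocess.run(hevc_qsv_cmd(src, src.with_suffix(".hevc.mp4")),
                           check=True)
```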
I'll have to consider pros and cons with Ultra chips, thanks for the tip.
Apologies for the video link. But a recent pretty in depth comparison: https://youtu.be/kkf7q4L5xl8
Also, do these support SR-IOV, as in handing slices of the GPU to virtual machines?
Both their integrated and dedicated GPUs have been steadily improving each generation. The Arc line is both cheaper than and comparable in performance to more premium NVIDIA cards. The 140T/140V iGPUs do the same to AMD APUs. Their upcoming Panther Lake and Nova Lake architectures seem promising, and will likely push this further. Meanwhile, they're also more power efficient and run cooler, to the point where Apple's lead with their ARM SoCs is not far off. Sure, the software ecosystem is not up to par with the competition yet, but that's a much easier problem to solve, and they've been working on that front as well.
I'm holding off on buying a new laptop for a while just to see how this plays out. But I really like how Intel is shaking things up, and not allowing the established players to rest on their laurels.
Is HDMI seen as a “gaming” feature, or is DP seen as a “workstation” interface? Ultimately HDMI is a brand that commands higher royalties than DP, so I suspect this decision was largely made to minimize costs. I wonder what percentage of the target audience has HDMI-only displays.
Converting from DisplayPort to HDMI is trivial with a cheap adapter if necessary.
HDMI is mostly used on TVs and older monitors now.
Only now are DisplayPort 2 monitors coming out
[0] https://www.amazon.co.uk/ASUS-GT730-4H-SL-2GD5-GeForce-multi...
Otherwise HDMI would have been dead a long time ago.
https://www.theregister.com/2024/03/02/hdmi_blocks_amd_foss/
(Note that some self-described “open” standards are not royalty-free, only RAND-licensed by somebody's definition of “R” and “ND”. And some don't have their text available free of charge, either, let alone a development process open to all comers. I believe the only thing the phrase “open standard” reliably implies at this point is that access to the text does not require signing an NDA.
DisplayPort in particular is royalty-free—although of course with patents you can never really know—while legal access to the text is gated[2] behind a VESA membership with dues based on the company revenue—I can’t find the official formula, but Wikipedia claims $5k/yr minimum.)
[1] https://hackaday.com/2023/07/11/displayport-a-better-video-i...
I assume you have to pay HDMI royalties for DP ports which support the full HDMI spec, but older HDMI versions were supersets of DVI, so you can encode a basic HDMI compatible signal without stepping on their IP.
> Is HDMI seen as a “gaming” feature
It's a tv content protection feature. Sometimes it degrades the signal so you feel like you're watching tv. I've had this monitor/machine combination that identified my monitor as a tv over hdmi and switched to ycbcr just because it wanted to, with assorted color bleed on red text.
It's not competing with AMD/Nvidia at twice the price in terms of performance, but it's also too expensive for a cheap gaming rig. And then there are people who are happy with integrated graphics.
Maybe I'm just lacking imagination here, I don't do anything fancy on my work and couch laptops and I have a proper gaming PC.
With SR-IOV* there is a low cost path for GPU in virtual machines. Until now this has (mostly) been a feature exclusive to costly "enterprise" GPUs. Combine that with the good encoders and some VDI software and you have VM hosted GPU accelerated 3D graphics to remote displays. There are many business use cases for this, and no small number of "home lab" use cases as well.
Linux is a first class citizen with Intel's display products, and B50/60 is no different, so it's a nice choice when you want a GPU accelerated Linux desktop with minimum BS. Given the low cost and power, it could find its way into Steam consoles as well.
Finally, Intel is the scrappy competitor in this space: they are being very liberal with third parties and their designs, unlike the incumbents. We're already seeing this with Maxsun and others.
* Intel has promised this for B50/60 in Q4
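Assuming Intel wires this up through the kernel's standard SR-IOV plumbing when that support lands (an assumption on my part; the PCI address below is a placeholder), carving the card into virtual functions would look roughly like:

```shell
# Find the card's PCI address first (placeholder used below):
lspci | grep -i -E 'vga|display'

# Ask the driver to create 4 virtual functions (VFs); each VF then
# appears as its own PCI device that can be passed through to a VM.
echo 4 | sudo tee /sys/bus/pci/devices/0000:03:00.0/sriov_numvfs
```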
Therefore I can install Proxmox VE and run multiple VMs, assigning a vGPU to each of them for video transcoding (IPCam NVR), AI, and other applications.
This reminds me a lot of the LLM craze and how they wanted to charge so much for simple usage at the start, until China released DeepSeek. Ideally we shouldn't rely on China, but do we have a choice? The entire US economy has become reliant on monopolies to keep their insanely high stock prices and profit margins.
(A half-height single-slot card would be even smaller, but those are vanishingly rare these days. This is pretty much as small as GPUs get unless you're looking more for a "video adapter" than a GPU.)
All current Intel Flex cards seem to be based on the previous gen "Xe".
[1] https://www.maxsun.com/products/intel-arc-pro-b60-dual-48g-t...
I would happily buy 96 GB for $3490, but this makes very little sense.
It clocks in at 1503.4 samples per second, behind the NVidia RTX 2060 (1590.93 samples / sec, released Jan 2019), AMD Radeon RX 6750 XT (1539, May 2022), and Apple M3 Pro GPU 14 cores (1651.85, Oct 2023).
Note that this perf comparison is just ray-tracing rendering, useful for games, but might give some clarity on performance comparisons with its competition.
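For scale, the quoted figures work out to these ratios (same numbers as above, just normalized):

```python
# Samples/sec figures quoted above, normalized to the B50's result.
scores = {
    "Arc Pro B50": 1503.4,
    "NVidia RTX 2060 (2019)": 1590.93,
    "AMD RX 6750 XT (2022)": 1539.0,
    "Apple M3 Pro 14-core (2023)": 1651.85,
}
base = scores["Arc Pro B50"]
for name, sps in scores.items():
    print(f"{name}: {sps / base:.2f}x")
```

So the gap to that six-year-old 2060 is about 6%, which matters less than the power and VRAM story for this segment.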
Intel has many, many solid customers at the government, enterprise and consumer levels.
They will be around.
Were they really? I don't think Intel is going anywhere any time soon either, but damn do they seem in bad shape. AMD, didn't they just have lackluster products for a few years and they were kind of the scrappy budget underdogs? I don't recall their fate seeming so...hopeless.
I have this cool and quiet fetish so 70 W is making me extremely interested. IF it also works as a gaming GPU.