Thursday, July 4th 2024
by btarunr
Intel very much does intend to make discrete gaming GPUs based on its Xe2 "Battlemage" graphics architecture, which made its debut with the Core Ultra 200V "Lunar Lake-MX" processor as an iGPU. With its next generation, Intel plans to capture an even bigger share of the gaming graphics market, on both the notebook and desktop platforms. "Battlemage" will be crucial for Intel, as it will be able to make its case with Microsoft and Sony for semi-custom chips for their next-generation consoles. Intel has all the pieces of the console SoC puzzle that AMD does. A Xe2 "Battlemage" discrete GPU sample, codenamed "Churchill Falls," has been spotted in transit in and out of locations known for Intel SoC development, such as Bangalore in India and Shanghai in China.
Such shipping manifests tend to be incredibly descriptive, and speak of Arc "Battlemage" X3 and Arc "Battlemage" X4 SKUs, each with 448 execution units (EUs) across 56 Xe cores. Assuming an Xe core continues to have 128 unified shaders in the "Battlemage" architecture, you're looking at 7,168 unified shaders for this GPU, a staggering 75% increase in shader count alone, before accounting for IPC increases and other architecture-level features. The descriptions also speak of a 256-bit-wide memory bus, although they don't specify memory type or speed. Given that at launch the Arc A770 "Alchemist" was a 1440p-class GPU, we predict Intel might take a crack at a 4K-class GPU. Besides raster 3D performance, Intel is expected to significantly improve the ray tracing and AI performance of its Xe2 discrete GPUs, making them powerful options for creative professionals.
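A quick sanity check of that arithmetic, as a sketch; the 128-shaders-per-core figure is the assumption made above, not a confirmed "Battlemage" spec:

```python
# Back-of-the-envelope check of the shader-count figures above.
# ASSUMPTION: each Xe2 core keeps Alchemist's 128 unified shaders;
# Intel has not confirmed this for "Battlemage".
XE_CORES = 56              # from the shipping manifests (448 EUs)
SHADERS_PER_XE_CORE = 128  # assumed carry-over from "Alchemist"
A770_SHADERS = 4096        # Arc A770: 32 Xe cores x 128 shaders

battlemage_shaders = XE_CORES * SHADERS_PER_XE_CORE
uplift = battlemage_shaders / A770_SHADERS - 1

print(f"Battlemage shaders: {battlemage_shaders}")  # 7168
print(f"Uplift over A770:   {uplift:.0%}")          # 75%
```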
I am very interested to see how Battlemage pans out; Intel has come a hell of a long way since launching Arc, driver-wise. If the hardware and, importantly, price are appealing, this could be a winner for many.
Specs look promising for a big uplift at the top end of the stack.
Hopefully they have improved a lot.
The A770 was comparable to a 4060, I believe, but lower priced.
I'm looking forward to what Intel can do with Battlemage in the discrete GPU market. I was very close to purchasing an A770, but I held back for this.
wolf said: "I am very interested to see how Battlemage pans out; Intel has come a hell of a long way since launching Arc, driver-wise. If the hardware and, importantly, price are appealing, this could be a winner for many. Specs look promising for a big uplift at the top end of the stack."
Guys like Redtechgaming keep saying they are aiming for 4070 to 4070 Ti performance. If they can deliver that at, say, 60% of the price, it would be great news. But it would be up against an RDNA4 8800 XT said to be at 7900 XT/XTX raster level with much better RT for around $500. XeSS seems better than FSR, so Battlemage would have its fans for that alone. Still, Intel has to hit the ground running with drivers to really succeed.
Minus Infinity said: "Guys like Redtechgaming keep saying they are aiming for 4070 to 4070 Ti performance. If they can deliver that at, say, 60% of the price, it would be great news. But it would be up against an RDNA4 8800 XT said to be at 7900 XT/XTX raster level with much better RT for around $500. XeSS seems better than FSR, so Battlemage would have its fans for that alone. Still, Intel has to hit the ground running with drivers to really succeed."
Interesting indeed. I certainly don't expect them to deliver 4090 performance, but whatever it delivers, if it's VRAM-rich, priced to undercut, and keeps delivering on features like XeSS, I'd very much consider one for at least one of my rigs.
As you say, drivers are likely the biggest make-or-break here, but I'd predict a significantly stronger launch (driver-wise) than Alchemist had, assuming Battlemage builds upon Alchemist.
@Minus Infinity
4070 Ti performance at, say, 350-400 bucks would be a decent offering. Regular 4070 level, not so much; that would have to be more in the 250-300 ballpark. RT-wise they are already in a decent place, so good pricing is what will be necessary for them to succeed; that, and, as you mentioned, getting drivers right out of the gate. This time around it won't be excusable to have them at Alchemist launch level.
4070 performance? Prepare for disappointment
Caring1 said: "Hopefully they have improved a lot. The A770 was comparable to a 4060, I believe, but lower priced."
Yes, even in the latest benchmarks it trails the 4060 by about 9% at 1080p, but it's about the same performance at 1440p and 4K.
usiname said: "4070 performance? Prepare for disappointment"
When an A770 is only just behind or equal to a 4060, I don't think it's a stretch for a chip with 75% more cores, plus any other improvements to IPC, clocks, etc., to at least match a 4070.
wolf said: "When an A770 is only just behind or equal to a 4060, I don't think it's a stretch for a chip with 75% more cores, plus any other improvements to IPC, clocks, etc., to at least match a 4070."
It's a bizarrely large step for just a single Nvidia model difference.
The difference in average 1080p frame rates between the A770 and the Nvidia RTX 4070 is 80%! (75.8 FPS vs. 136.5 FPS, an 80.1% gap.)
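For what it's worth, the arithmetic behind that figure (the FPS numbers are taken from the post above, not independently verified):

```python
# Frame-rate gap quoted above; the FPS figures come from the post itself.
a770_fps = 75.8       # Arc A770, 1080p average
rtx_4070_fps = 136.5  # GeForce RTX 4070, 1080p average

gap = rtx_4070_fps / a770_fps - 1
print(f"RTX 4070 leads the A770 by {gap:.1%} at 1080p")  # 80.1%
```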
Minus Infinity said: "Guys like Redtechgaming keep saying they are aiming for 4070 to 4070 Ti performance. If they can deliver that at, say, 60% of the price, it would be great news. But it would be up against an RDNA4 8800 XT said to be at 7900 XT/XTX raster level with much better RT for around $500. XeSS seems better than FSR, so Battlemage would have its fans for that alone. Still, Intel has to hit the ground running with drivers to really succeed."
I guess this shines some light on AMD's decision to focus on the cheap midrange next. They feel the pressure on their console business.
Bwaze said: "It's a bizarrely large step for just a single Nvidia model difference. The difference in average 1080p frame rates between the A770 and the Nvidia RTX 4070 is 80%! (75.8 FPS vs. 136.5 FPS, an 80.1% gap.)"
I'd say 1080p isn't the most relevant resolution to compare; that'd either be 1440p or, at a stretch, 4K. TechPowerUp GPU specs show the 4070 as being 65% faster, but I agree that varies, and it is a large step. Still, if the new top Arc chip comes out and can't even match a 4070, I'd call that disappointing. My 3080 was that from almost four years ago, and the 4070 is soon to be superseded.
Generative AI performance for Intel Arc at home is just on paper, because any potential user has to jump through hoops for hours on end to get shit to work, and tutorials are sparse compared to Nvidia-land or even AMD. The only thing that works out of the box (or is supposed to; it didn't for me, I had to troubleshoot) is AI Quickset from ASRock, which gets you a single build of Stable Diffusion with a GUI. Everything else - good luck!
However, I'm still excited to see how Battlemage will do in games. I found the experience alright, even in older titles.
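For anyone attempting the same on Arc, a minimal sketch of the usual first troubleshooting step: verifying that PyTorch can see the GPU through Intel's XPU backend. This assumes the intel-extension-for-pytorch package and current Arc drivers are installed; the API shown is as of the time of writing and may change:

```python
# Minimal check that PyTorch sees an Arc GPU via Intel's "xpu" backend.
# ASSUMPTION: intel-extension-for-pytorch and Arc drivers are installed;
# this is only a sanity check, not a full Stable Diffusion setup.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401 - registers "xpu"

if torch.xpu.is_available():
    print(f"Found {torch.xpu.device_count()} XPU device(s):",
          torch.xpu.get_device_name(0))
else:
    print("No XPU device found; check GPU driver and oneAPI runtime")
```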
wolf said: "I'd say 1080p isn't the most relevant resolution to compare; that'd either be 1440p or, at a stretch, 4K. TechPowerUp GPU specs show the 4070 as being 65% faster, but I agree that varies, and it is a large step. Still, if the new top Arc chip comes out and can't even match a 4070, I'd call that disappointing. My 3080 was that from almost four years ago, and the 4070 is soon to be superseded."
Maybe there are lots of users with higher-resolution monitors, but I think performance at 1080p is still very relevant, because gamers will often use upscaling at those higher resolutions. Some new games require really high GPU power even without ray tracing, so you have no choice with these "low-end" 300€+ cards (the Nvidia RTX 4060 is now actually cheaper in Europe than the Intel A770)...
They honestly need to release it before the AMD 8700 XT, because last time it was just too little, too late, and the market had already bought the cards it wanted.
If Nvidia and AMD start releasing their cards in November and this gets released at the same time or later, then it will fail.
So they need to push and get it released ASAP, before September.
And it will still only compete with the current, old generations from AMD and Nvidia, with new lines of cards at the doorstep? The only people who buy GPUs right before a new generation comes out are people who have very little idea what they are buying (but they usually buy prebuilt), and people who have no other choice for various reasons (old card died, etc.), but even they would rather turn to the second-hand market.
It's difficult to extract more IPC from GPU architectures, so realistically speaking, 70% is about as much of an uplift as you can expect from this increase in shader count; far from enough for these things to be competitive.
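To illustrate that point, a toy model of sub-linear scaling; the efficiency factor below is a made-up number for illustration, not a measurement:

```python
# Toy model: GPU performance rarely scales 1:1 with shader count.
# ASSUMPTION: scaling_efficiency is hypothetical, chosen only to
# illustrate why a 75% shader increase yields less than 75% more FPS.
a770_shaders = 4096
battlemage_shaders = 7168
scaling_efficiency = 0.9  # hypothetical: 90% of the raw gain realized

raw_gain = battlemage_shaders / a770_shaders - 1  # 0.75
estimated_uplift = raw_gain * scaling_efficiency
print(f"Estimated uplift from shaders alone: {estimated_uplift:.1%}")  # 67.5%
```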
V2 should be a big step up, since V1 had some significant hardware limitations that the poor driver team had to manually overcome, game-by-game. A more complete hardware set should make it all the more exciting and practical for everyday use.
Darmok N Jalad said: "V2 should be a big step up, since V1 had some significant hardware limitations that the poor driver team had to manually overcome, game-by-game. A more complete hardware set should make it all the more exciting and practical for everyday use."
Yes, Alchemist was half product, half proof of concept. I'm curious to see what Battlemage will turn out to be.
wolf said: "I am very interested to see how Battlemage pans out; Intel has come a hell of a long way since launching Arc, driver-wise. If the hardware and, importantly, price are appealing, this could be a winner for many. Specs look promising for a big uplift at the top end of the stack."
I'm looking forward to Battlemage as well. The A750 and A770 used too much power. Hopefully Intel will release something similar to the RX 7600 or RTX 4060 with better performance and a similar TDP.
Intel's possible success in the dGPU market WON'T be decided by their ability to innovate; it'll be decided by the same means Intel has used to stay relevant in the x86 space despite making inferior products compared to AMD for the last few years (say what you want, the fact that Intel has an R&D budget over 3x larger than AMD's, and AMD has basically been able to edge out Intel with better products, is a clear victory)... they'll use their HUGE superiority in capital (money) over AMD to manipulate the market. **I'm purposely leaving Nvidia out because any success Intel has in the dGPU market will come almost entirely at the cost of AMD's limited market share, NOT Nvidia's, at least not in the beginning.
Intel has been selling some of their products at cost or even cheaper; this has been documented and it's an undeniable fact. They've also basically bribed OEMs with "joint development funds" to keep AMD's chips out of the best laptops... this is EXACTLY the type of stuff Intel will do in the dGPU market. Again, they're not gunning for Nvidia like some of you have deluded yourselves into believing, they're gunning for AMD, and the mention of the console business in this article reminds us of that. Any success in dGPUs at AMD's expense will be further used by Intel to disadvantage AMD in x86 (i.e., Intel will just tell OEMs that they'll get the dGPU at cost or even less if they use Intel CPUs and not AMD CPUs in their prebuilts).
I know a lot of you think that Intel's entry into dGPUs will somehow usher in a new era that's better for consumers... but that's just not going to happen. Basically, what's happening is that another company with a proven history of anti-competitive and anti-consumer practices (that it still engages in at present) is entering the market, not some brand-new startup committed to upending the status quo; Intel IS the status quo. I'm sure some of you will try and say "it's better than nothing", but even that cannot be assumed, and you can basically justify anything, no matter how bad, with such thinking. Mark my words... this will NOT be a net gain for consumers, enthusiasts, or the dGPU market as a whole.
Battleimage, now with Fast Z-clear!!!1
AnarchoPrimitiv said: "Intel's possible success in the dGPU market WON'T be decided by their ability to innovate; it'll be decided by the same means Intel has used to stay relevant in the x86 space despite making inferior products compared to AMD for the last few years... they'll use their HUGE superiority in capital (money) over AMD to manipulate the market. **I'm purposely leaving Nvidia out because any success Intel has in the dGPU market will come almost entirely at the cost of AMD's limited market share, NOT Nvidia's, at least not in the beginning. Intel has been selling some of their products at cost or even cheaper; this has been documented and it's an undeniable fact. They've also basically bribed OEMs with "joint development funds" to keep AMD's chips out of the best laptops... this is EXACTLY the type of stuff Intel will do in the dGPU market. Again, they're not gunning for Nvidia like some of you have deluded yourselves into believing, they're gunning for AMD, and the mention of the console business in this article reminds us of that. Any success in dGPUs at AMD's expense will be further used by Intel to disadvantage AMD in x86 (i.e., Intel will just tell OEMs that they'll get the dGPU at cost or even less if they use Intel CPUs and not AMD CPUs in their prebuilts).
I know a lot of you think that Intel's entry into dGPUs will somehow usher in a new era that's better for consumers... but that's just not going to happen. Basically, what's happening is that another company with a proven history of anti-competitive and anti-consumer practices (that it still engages in at present) is entering the market, not some brand-new startup committed to upending the status quo; Intel IS the status quo. I'm sure some of you will try and say "it's better than nothing", but even that cannot be assumed, and you can basically justify anything, no matter how bad, with such thinking. Mark my words... this will NOT be a net gain for consumers, enthusiasts, or the dGPU market as a whole."
That's a fact, especially the part about the laptop market. Selling the AMD equivalent with 'disabilities': only available in single-channel, low-speed RAM even if the CPU/APU can handle much higher speeds, a low-resolution screen, etc.
At the end of the day, they are selling parts and want to make a profit out of them; the reason they sell their first gen cheap is that it's underperforming even with superior specs (256-bit memory bus, 16 GB VRAM, etc.). AMD should've put more effort into the consumer GPU market now rather than CPUs. We all know now that RDNA3 is a flop; the upcoming RDNA4 is what RDNA3 was supposed to be, but even so, we haven't seen any RDNA4 parts today. I just hope that with Intel getting better in the consumer GPU market, it will accelerate AMD's effort to make theirs better too. Lately they only focus on AI bullshit.
Apocalypsee said: "That's a fact, especially the part about the laptop market. Selling the AMD equivalent with 'disabilities': only available in single-channel, low-speed RAM even if the CPU/APU can handle much higher speeds, a low-resolution screen, etc. At the end of the day, they are selling parts and want to make a profit out of them; the reason they sell their first gen cheap is that it's underperforming even with superior specs (256-bit memory bus, 16 GB VRAM, etc.). AMD should've put more effort into the consumer GPU market now rather than CPUs. We all know now that RDNA3 is a flop; the upcoming RDNA4 is what RDNA3 was supposed to be, but even so, we haven't seen any RDNA4 parts today. I just hope that with Intel getting better in the consumer GPU market, it will accelerate AMD's effort to make theirs better too. Lately they only focus on AI bullshit."
Trouble is, AMD has its sights set on the AI market, and it's miles ahead of Intel in that area, with a more widely accepted software stack. AMD will make many more billions from that market than from us pleb desktop consumers. They have no intention of competing against Nvidia, it would appear, as they have stood by and watched market share decline and kept prices stupidly high. They'll make an effort but will always be the second choice for most people. Furthermore, they need to offer a standout AI upscaling solution with FSR4 and improve ray tracing by at least 200% for RDNA5 to truly compete, and even then it'll need to be 30% cheaper than Nvidia, not 10%.
Minus Infinity said: "Trouble is, AMD has its sights set on the AI market, and it's miles ahead of Intel in that area, with a more widely accepted software stack."
In what way? ROCm versus oneAPI? I haven't heard talk of oneAPI, but I've heard mainly negative talk about ROCm: either that it's unavailable on hardware, or unavailable in software you want to use. Or cases where AMD is forced to do operations at 32-bit width instead of 16, and therefore needs twice the VRAM to store them.
Intel and AMD have both scored supercomputer deals recently. But I think such computers tend to work around software limitations that would burden other market segments, so I can't tell whether AMD or Intel is really meaningfully outdoing the other relative to Nvidia.
On the consumer side, www.tomshardware.com/pc-components/gpus/stable-diffusion-benchmarks showed Intel had surprisingly competitive performance compared to AMD. But there are a few caveats. They were using Windows, and all the AMD SD users I know claim the Windows performance is a disaster compared to using AMD + Linux + SD. I haven't heard if the same is true for Intel, because I don't know anyone who has even tried. On Nvidia it doesn't really matter; the performance difference between Linux and Windows for SD isn't that significant.