Intel Arc "Battlemage" Xe2 GPUs with 448 EUs (56 Xe cores) Spotted in Transit (2024)


Thursday, July 4th 2024

Intel Arc "Battlemage" Xe2 GPUs with 448 EUs (56 Xe cores) Spotted in Transit (1)

by btarunr
Discuss (30 Comments)

Intel very much does intend to make discrete gaming GPUs based on its Xe2 "Battlemage" graphics architecture, which made its debut with the Core Ultra 200V "Lunar Lake-MX" processor as an iGPU. With its next generation, Intel plans to capture an even bigger share of the gaming graphics market, on both notebook and desktop platforms. "Battlemage" will be crucial for Intel, as it will let the company make its case with Microsoft and Sony for semi-custom chips for their next-generation consoles. Intel has all the pieces of the console SoC puzzle that AMD does. A Xe2 "Battlemage" discrete GPU sample, codenamed "Churchill Falls," has been spotted making transit in and out of locations known for Intel SoC development, such as Bangalore in India and Shanghai in China.

Such shipping manifests tend to be incredibly descriptive, and they speak of Arc "Battlemage" X3 and Arc "Battlemage" X4 SKUs, each with 448 execution units (EUs) across 56 Xe cores. Assuming an Xe core continues to have 128 unified shaders in the "Battlemage" architecture, you're looking at 7,168 unified shaders for this GPU, a staggering 75% increase in just the numerical count of shaders, not accounting for IPC increases and other architecture-level features. The descriptions also speak of a 256-bit wide memory bus, although they don't specify memory type or speed. Given that the Arc A770 "Alchemist" was a 1440p-class GPU at launch, we predict Intel might take a crack at a 4K-class GPU. Besides raster 3D performance, Intel is expected to significantly improve the ray tracing and AI performance of its Xe2 discrete GPUs, making them powerful options for creative professionals.
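
For a quick sanity check on those numbers, here is a back-of-the-envelope sketch in Python. The 128-shaders-per-Xe-core figure and the A770's 32 Xe cores are assumptions carried over from "Alchemist," so treat the totals as indicative rather than confirmed:

```python
# Back-of-the-envelope shader math for the rumored "Battlemage" part.
XE_CORES_BMG = 56           # from the shipping manifests (448 EUs / 8 EUs per Xe core)
SHADERS_PER_XE_CORE = 128   # assumption: same per-core layout as "Alchemist"
XE_CORES_A770 = 32          # Arc A770 "Alchemist" used as the baseline

bmg_shaders = XE_CORES_BMG * SHADERS_PER_XE_CORE    # 7,168
a770_shaders = XE_CORES_A770 * SHADERS_PER_XE_CORE  # 4,096
uplift = bmg_shaders / a770_shaders - 1             # 0.75 -> +75% raw shader count

print(f"Battlemage: {bmg_shaders} shaders vs. A770: {a770_shaders} shaders "
      f"(+{uplift:.0%} raw shader count)")
```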

Source: TweakTown

#1
wolf

Better Than Native

I am very interested to see how Battlemage pans out; Intel has come a hell of a long way since launching ARC driver-wise. If the hardware and, importantly, price are appealing, this could be a winner for many.

Specs look promising for a big uplift at the top end of the stack.

#2
Caring1

Hopefully they have improved a lot.
The A770 was comparable to a 4060 I believe, but lower priced.

#3
Apocalypsee

I'm looking forward to what Intel could do with Battlemage in the discrete GPU market. I was very close to purchasing an A770, but I held back for this.

#4
Minus Infinity
wolf: I am very interested to see how Battlemage pans out; Intel has come a hell of a long way since launching ARC driver-wise. If the hardware and, importantly, price are appealing, this could be a winner for many.

Specs look promising for a big uplift at the top end of the stack.

Guys like Redtechgaming keep saying they are aiming for 4070 to 4070 Ti performance. If they can deliver that at, say, 60% of the price, it would be great news. But it would be up against an RDNA4 8800XT, said to be at 7900XT/XTX raster level with much better RT, for around $500. XeSS seems better than FSR, so Battlemage would have its fans for that alone. Still, Intel has to hit the ground running with drivers to really succeed.

#5
wolf

Better Than Native

Minus Infinity: Guys like Redtechgaming keep saying they are aiming for 4070 to 4070 Ti performance. If they can deliver that at, say, 60% of the price, it would be great news. But it would be up against an RDNA4 8800XT, said to be at 7900XT/XTX raster level with much better RT, for around $500. XeSS seems better than FSR, so Battlemage would have its fans for that alone. Still, Intel has to hit the ground running with drivers to really succeed.

Interesting indeed. I certainly don't expect them to deliver 4090 performance, but whatever it delivers, if it's VRAM-rich, priced to undercut, and keeps delivering on features like XeSS, I'd very much consider one for one of my rigs at least.

As you say, drivers are likely the biggest make or break here, but I'd predict a significantly stronger launch (driver-wise) than Alchemist had, assuming Battlemage builds upon Alchemist.

#6
Onasi

@Minus Infinity
4070 Ti performance at, say, 350-400 bucks would be a decent offering. Regular 4070 level, not so much; that would have to be more in the 250-300 ballpark. RT-wise they are already in a decent place, so good pricing is what will be necessary for them to succeed; that, and, as you mentioned, getting drivers right out of the gate. This go-around it won't be excusable to have them at Alchemist launch level.

#7
usiname

4070 performance? Prepare for disappointment

#8
Bwaze
Caring1: Hopefully they have improved a lot.
The A770 was comparable to a 4060 I believe, but lower priced.

Yes, even in the latest benchmarks it trails it by about 9% at 1080p, but it's about the same performance at 1440p and 4K.

#9
wolf

Better Than Native

usiname: 4070 performance? Prepare for disappointment

When an A770 is only just behind or equal to a 4060, I don't think it's a stretch for a chip with 75% more cores, plus any other improvements to IPC, clocks, etc., to at least match a 4070.

#10
Bwaze
wolf: When an A770 is only just behind or equal to a 4060, I don't think it's a stretch for a chip with 75% more cores, plus any other improvements to IPC, clocks, etc., to at least match a 4070.

It's a bizarrely large step for just a single Nvidia model difference.

In average 1080p frame rates, the difference between the A770 and the Nvidia RTX 4070 is 80%! 75.8 FPS vs. 136.5 FPS is an 80.1% gap.

#11
Vayra86
Minus Infinity: Guys like Redtechgaming keep saying they are aiming for 4070 to 4070 Ti performance. If they can deliver that at, say, 60% of the price, it would be great news. But it would be up against an RDNA4 8800XT, said to be at 7900XT/XTX raster level with much better RT, for around $500. XeSS seems better than FSR, so Battlemage would have its fans for that alone. Still, Intel has to hit the ground running with drivers to really succeed.

I guess this shines some light on AMD's decision to focus on the cheap midrange next. They feel the pressure on their console business.

#12
wolf

Better Than Native

Bwaze: It's a bizarrely large step for just a single Nvidia model difference.

In average 1080p frame rates, the difference between the A770 and the Nvidia RTX 4070 is 80%! 75.8 FPS vs. 136.5 FPS is an 80.1% gap.

I'd say 1080p isn't the most relevant resolution to compare; that'd either be 1440p or, at a stretch, 4K. TechPowerUp's GPU specs show the 4070 as being 65% faster, but I agree that varies, and it is a large step. Still, if the new top Arc chip comes out and can't even match a 4070, I'd call that disappointing. My 3080 delivered that almost 4 years ago, and the 4070 is soon to be superseded.

#13
Dristun

Generative AI performance for Intel Arc at home is just on paper, because any potential user has to jump through hoops for hours on end to get anything to work, and tutorials are sparse compared to Nvidia-land or even AMD. The only thing that works out of the box (it's supposed to, anyway - it didn't for me, I had to troubleshoot) is AI Quickset from ASRock, which gets you a single build of Stable Diffusion with a GUI. Everything else - good luck!

However, I'm still excited to see how Battlemage will do in games. I found the experience alright, even in older titles.

#14
Bwaze
wolf: I'd say 1080p isn't the most relevant resolution to compare; that'd either be 1440p or, at a stretch, 4K. TechPowerUp's GPU specs show the 4070 as being 65% faster, but I agree that varies, and it is a large step. Still, if the new top Arc chip comes out and can't even match a 4070, I'd call that disappointing. My 3080 delivered that almost 4 years ago, and the 4070 is soon to be superseded.

Maybe there are lots of users with higher-resolution monitors, but I think performance at 1080p is still very relevant, because gamers will often use upscaling for those higher resolutions. Some new games require really high GPU power even without ray tracing, so you have no choice with these "low-end" 300€+ cards (the Nvidia RTX 4060 is now actually cheaper in Europe than the Intel A770)...

#15
Quicks

They honestly need to release it before the AMD 8700XT, because last time it was just too little, too late, and the market had already bought the cards it wanted.

If Nvidia and AMD start releasing their cards in November and this gets released at the same time or later, then it will fail.

So they need to push and get it released ASAP, before September.

#16
Bwaze

And it will still only compete with the current, old generations from AMD and Nvidia, with new lines of cards on the doorstep? The only people who buy GPUs right before a new generation comes out are people who have very little idea what they are buying (but they usually buy pre-built), and people who have no other choice for various reasons (an old card died, etc.), but even they would rather turn to the second-hand market.

#17
Vya Domus

It's difficult to extract more IPC from GPU architectures, so realistically speaking, 70% is about as much of an uplift as you can expect from this increase in shader count; far from enough for these things to be competitive.
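
To put rough numbers on that, here is a small sketch using the 1080p averages quoted earlier in the thread (75.8 FPS for the A770, 136.5 FPS for the RTX 4070); the scaling-efficiency values are assumptions purely for illustration:

```python
# Rough scaling estimate: core-count uplift times an assumed scaling efficiency,
# applied to the 1080p averages quoted earlier in the thread.
a770_fps = 75.8            # Arc A770, 1080p average cited above
rtx4070_fps = 136.5        # RTX 4070, 1080p average cited above
core_ratio = 56 / 32       # 1.75x the Xe cores of the A770

for efficiency in (1.0, 0.9, 0.8):   # fraction of extra hardware that turns into FPS
    projected = a770_fps * (1 + (core_ratio - 1) * efficiency)
    print(f"{efficiency:.0%} scaling efficiency: ~{projected:.0f} FPS "
          f"(RTX 4070: {rtx4070_fps} FPS)")
```

Even at a perfect 100% scaling, the projection only just approaches the 4070's average, which is the point about needing IPC and clock gains on top of the extra shaders.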

#18
Darmok N Jalad

V2 should be a big step up, since V1 had some significant hardware limitations that the poor driver team had to manually overcome, game-by-game. A more complete hardware set should make it all the more exciting and practical for everyday use.

#19
Kyan
Darmok N Jalad: V2 should be a big step up, since V1 had some significant hardware limitations that the poor driver team had to manually overcome, game-by-game. A more complete hardware set should make it all the more exciting and practical for everyday use.

Yes, Alchemist was half product, half proof of concept. I'm curious to see what Battlemage will turn out to be.

#20
tommo1982
wolf: I am very interested to see how Battlemage pans out; Intel has come a hell of a long way since launching ARC driver-wise. If the hardware and, importantly, price are appealing, this could be a winner for many.

Specs look promising for a big uplift at the top end of the stack.

I'm looking forward to Battlemage as well. The A750 and A770 used too much power. Hopefully Intel will release something similar to the RX 7600 or RTX 4060 with better performance and a similar TDP.

#21
AnarchoPrimitiv

Intel's possible success in the dGPU market WON'T be decided by their ability to innovate; it'll be decided by the same means Intel has used to stay relevant in the x86 space despite making inferior products compared to AMD for the last few years (say what you want, but the fact that Intel has an R&D budget over 3x larger than AMD's and AMD has still basically been able to edge out Intel with better products is a clear victory)... they'll use their HUGE superiority in capital (money) over AMD to manipulate the market. I'm purposely leaving Nvidia out because any success Intel has in the dGPU market will come almost entirely at the cost of AMD's limited market share, NOT Nvidia's, at least not in the beginning.

Intel has been selling some of their products at cost or even cheaper; this has been documented and it's an undeniable fact. They've also basically bribed OEMs with "joint development funds" to keep AMD's chips out of the best laptops... this is EXACTLY the type of stuff Intel will do in the dGPU market. Again, they're not gunning for Nvidia like some of you have deluded yourselves into believing, they're gunning for AMD, and the mention of the console business in this article reminds us of that. Any success in dGPUs at AMD's expense will then be used by Intel to disadvantage AMD further in x86 (i.e., Intel will just tell OEMs that they'll give them the dGPU at cost or even less if they use Intel CPUs and not AMD CPUs in their prebuilts).

I know a lot of you think that Intel's entry into dGPUs will somehow usher in a new era that's better for consumers... but that's just not going to happen. Basically, what's happening is that another company with a proven history of anti-competitive and anti-consumer practices (which it still engages in at present) is entering the market, not some brand-new startup committed to upending the status quo; Intel IS the status quo. I'm sure some of you will try and say "it's better than nothing", but even that cannot be assumed, and you can basically justify anything, no matter how bad, with such thinking. Mark my words... this will NOT be a net gain for consumers, enthusiasts, or the dGPU market as a whole.

#22
pavle

Battleimage, now with Fast Z-clear!!!1

#23
Apocalypsee
AnarchoPrimitiv: Intel's possible success in the dGPU market WON'T be decided by their ability to innovate; it'll be decided by the same means Intel has used to stay relevant in the x86 space despite making inferior products compared to AMD for the last few years (say what you want, but the fact that Intel has an R&D budget over 3x larger than AMD's and AMD has still basically been able to edge out Intel with better products is a clear victory)... they'll use their HUGE superiority in capital (money) over AMD to manipulate the market. I'm purposely leaving Nvidia out because any success Intel has in the dGPU market will come almost entirely at the cost of AMD's limited market share, NOT Nvidia's, at least not in the beginning.

Intel has been selling some of their products at cost or even cheaper; this has been documented and it's an undeniable fact. They've also basically bribed OEMs with "joint development funds" to keep AMD's chips out of the best laptops... this is EXACTLY the type of stuff Intel will do in the dGPU market. Again, they're not gunning for Nvidia like some of you have deluded yourselves into believing, they're gunning for AMD, and the mention of the console business in this article reminds us of that. Any success in dGPUs at AMD's expense will then be used by Intel to disadvantage AMD further in x86 (i.e., Intel will just tell OEMs that they'll give them the dGPU at cost or even less if they use Intel CPUs and not AMD CPUs in their prebuilts).

I know a lot of you think that Intel's entry into dGPUs will somehow usher in a new era that's better for consumers... but that's just not going to happen. Basically, what's happening is that another company with a proven history of anti-competitive and anti-consumer practices (which it still engages in at present) is entering the market, not some brand-new startup committed to upending the status quo; Intel IS the status quo. I'm sure some of you will try and say "it's better than nothing", but even that cannot be assumed, and you can basically justify anything, no matter how bad, with such thinking. Mark my words... this will NOT be a net gain for consumers, enthusiasts, or the dGPU market as a whole.

That's a fact, especially the part about the laptop market: selling the AMD equivalent with 'disabilities' like only being available in single-channel, putting in low-speed RAM even if the CPU/APU can handle much higher speeds, a low-resolution screen, etc.

At the end of the day, they are selling parts and want to make a profit out of them; the reason they sell their first gen cheap is because it's underperforming even with the superior specs (256-bit memory bus, 16 GB VRAM, etc.). AMD should have put more effort into the consumer GPU market rather than CPUs. We all know now that RDNA3 is a flop, and the upcoming RDNA4 is what RDNA3 was supposed to be, but even so we haven't seen any RDNA4 parts today. I just hope that with Intel getting better in the consumer GPU market, it will accelerate AMD's effort to make theirs better too. Lately they only focus on AI bullshit.

#24
Minus Infinity
Apocalypsee: That's a fact, especially the part about the laptop market: selling the AMD equivalent with 'disabilities' like only being available in single-channel, putting in low-speed RAM even if the CPU/APU can handle much higher speeds, a low-resolution screen, etc.

At the end of the day, they are selling parts and want to make a profit out of them; the reason they sell their first gen cheap is because it's underperforming even with the superior specs (256-bit memory bus, 16 GB VRAM, etc.). AMD should have put more effort into the consumer GPU market rather than CPUs. We all know now that RDNA3 is a flop, and the upcoming RDNA4 is what RDNA3 was supposed to be, but even so we haven't seen any RDNA4 parts today. I just hope that with Intel getting better in the consumer GPU market, it will accelerate AMD's effort to make theirs better too. Lately they only focus on AI bullshit.

Trouble is, AMD has its sights set on the AI market and it's miles ahead of Intel in that area, with a more widely accepted software stack. AMD will make many more billions from that market than from us pleb desktop consumers. They have no intention of competing against Nvidia, it would appear, as they have stood by and watched their market share decline while keeping prices stupidly high. They'll make an effort but will always be the second choice for most people. Furthermore, they need to offer a standout AI upscaling solution with FSR4 and improve ray tracing by at least 200% for RDNA5 to truly compete, and even then it'll need to be 30% cheaper than Nvidia, not 10%.

#25
bitsandboots
Minus Infinity: Trouble is, AMD has its sights set on the AI market and it's miles ahead of Intel in that area, with a more widely accepted software stack.

In what way? ROCm versus oneAPI? I haven't heard much talk of oneAPI, but I've heard mainly negative talk about ROCm: either that it's unavailable on the hardware or unavailable in the software you want to use, or cases where AMD is forced to do operations at 32-bit width instead of 16-bit and therefore needs twice the VRAM to store them.
Intel and AMD have both scored supercomputer deals recently. But I think such computers tend to work around software limitations that would burden other market segments, so I can't tell whether AMD or Intel is really meaningfully outdoing the other relative to Nvidia.

On the consumer side, www.tomshardware.com/pc-components/gpus/stable-diffusion-benchmarks showed Intel had surprisingly competitive performance compared to AMD. But there are a few caveats. They were using Windows, and all the AMD SD users I know claim the Windows performance is a disaster compared to using AMD + Linux + SD. I haven't heard if the same is true for Intel because I don't know anyone who has even tried. On Nvidia it doesn't really matter; the performance difference between Linux and Windows for SD isn't that significant.
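
For anyone wanting to try Stable Diffusion on an Arc card outside of ASRock's AI Quickset, one documented path goes through Intel Extension for PyTorch and its "xpu" device. The following is a minimal sketch only; the package versions, the example checkpoint, and the setup steps are assumptions that vary by driver and library release, and it is not the method any particular benchmark above used:

```python
# Minimal sketch: Stable Diffusion on an Intel Arc GPU via the "xpu" device
# that intel-extension-for-pytorch registers with PyTorch.
# Assumes torch, intel_extension_for_pytorch, and diffusers are installed,
# and that the Arc driver / oneAPI runtime is set up.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401  (registers the "xpu" device)
from diffusers import StableDiffusionPipeline

assert torch.xpu.is_available(), "No XPU device found; check driver/oneAPI setup"

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example SD 1.5 checkpoint; any should do
    torch_dtype=torch.float16,
)
pipe = pipe.to("xpu")

image = pipe("a photo of a graphics card on a workbench").images[0]
image.save("arc_sd_test.png")
```

Whether this runs better under Windows or Linux on Arc is exactly the open question raised above.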


FAQs

What are Intel Arc GPUs for?

Intel® Arc™ GPUs unlock powerful AI capabilities, from advanced creation to immersive gaming. Get the most out of today's advanced creation software, including text-to-image generation and AI-accelerated video editing.

What is the highest-end Intel Arc GPU?

Intel Arc (fabrication process: N6) cards:
  • Entry-level: Arc 3
  • Mid-range: Arc 5
  • High-end: Arc 7

Do Intel Arc GPUs have ray tracing?

Intel Arc Graphics feature hardware-level ray tracing units intended to improve the performance of this algorithm. Any game or app developer looking to take advantage of this feature will need to integrate it using Microsoft's DirectX 12 or the Vulkan API.

What are the specs of Intel Battlemage?

Firstly, the new Battlemage GPUs are supposedly equipped with 20 Xe cores (160 compute units) and 24 Xe cores (192 compute units), respectively. Both GPUs also feature a base clock of 1.8 GHz, 12 GB of video memory, and 8 MB of L2 cache.

Is Intel Arc discontinued?

In a rather unexpected move, Intel this week discontinued its Arc A770 Limited Edition graphics card, which was its flagship discrete graphics offering for desktops.

Is Intel Arc better than the RTX 3060?

New Remnant 2 benchmarks show the Intel Arc A770 failing miserably against the RX 6600 / RTX 3060.

Does Intel Arc increase FPS?

Some of the most impressive jumps are a 65% framerate increase in Call of Duty: Infinite Warfare and Assassin's Creed Syndicate, and a massive framerate increase of 155% in Just Cause 4. It's important to note that this driver covers DirectX 11 titles at 1080p resolution with 'Medium' settings enabled.

Is an Intel Arc GPU better than Nvidia?

Value for money: Intel Arc GPUs often offer competitive performance at lower price points than comparable Nvidia models. XeSS upscaling: Intel's XeSS technology provides image upscaling similar to Nvidia's DLSS, improving performance without sacrificing visual quality.

How strong is Intel Arc?

Exceptional performance: the Intel Arc A750 Limited Edition GPU posts impressive performance metrics, especially in 1080p games. As a whole, it is your go-to option if you are looking for a good, inexpensive graphics card capable of delivering smooth gameplay and excellent frame rates.

Can I use an Intel Arc GPU with an AMD CPU?

Which AMD CPUs can you use with Intel Arc GPUs? In reality, any AMD CPU should work with Intel Arc GPUs, as there isn't some lock or fundamental compatibility issue that means only certain CPUs work with the new Intel cards.

Does Intel XeSS work on any GPU?

GPU COMPATIBILITY

Intel Xe Super Sampling works with all graphics cards supporting Shader Model 6.4+ and DP4a instructions. Naturally, XeSS performs best on Intel's Arc graphics processors, but it will also work with Nvidia and AMD graphics hardware.

What GPU do you need for ray tracing?

NVIDIA RTX™ is the most advanced platform for ray tracing and AI technologies that are revolutionizing the ways we play and create. Over 500 top games and applications use RTX to deliver realistic graphics, incredibly fast performance, and new cutting-edge AI features like NVIDIA DLSS 3.5 with Ray Reconstruction.

How good will Intel Battlemage be?

Finally, the leaks allege that Battlemage should be 30% faster, up to twice as fast as the Arc Alchemist line. This also means that Battlemage would be roughly equal to Nvidia's recently released RTX 4070 Super graphics card.

How much will Battlemage cost?

Based on the specs, there's a good chance Intel is targeting the RTX 4070 with its flagship Battlemage GPU. That's probably a good call, too. There are half a dozen GPUs in the price bracket between $400 and $600, making it the most competitive part of the graphics card market right now.

Will Intel make another generation of Arc GPUs?

Intel has confirmed that its Arc Xe2 Battlemage graphics cards will be released in late 2024. The first batch of Battlemage silicon is already being tested in labs, with the software being worked on by 30% of the Arc GPU team.

What is the purpose of Intel Arc Control?

Intel® Arc™ Control is the only all-in-one software that easily/automatically updates drivers, adjusts game/system performance, and enables users to stream/broadcast gameplay. See below for more information on Intel Arc Control settings and features.

Are Intel Arc GPUs worth it?

It's clear from both of these sets of tests that, while Intel Arc IGPs are way better than their predecessors and can legitimately play today's games at decent resolution and detail, we still see an ocean between the Arc IGP and even the lowest-end discrete GPU from the latest generation.

What are Intel GPUs good for?

GPUs can process many pieces of data simultaneously, making them useful for machine learning, video editing, and gaming applications. GPUs may be integrated into the computer's CPU or offered as a discrete hardware unit.
