The day we have all been waiting for has finally arrived. The review embargo for Intel's Arc A7-series GPUs has lifted, and the cards' strengths and shortcomings are exposed for all to see. As expected, it's a mercurial product that shows glimmers of hope alongside weird bugs and inconsistent behavior. That's not exactly shocking for a brand-new GPU architecture, but curious upgraders should tread carefully.
Our sister site PCMag received the flagship GPU for testing: the Arc A770 Limited Edition. Note that the A770 will be offered in both 8GB and 16GB versions, but the card Intel builds itself is the 16GB model. It also won't be especially "limited," according to Intel, which plans to produce the $349 GPU for the foreseeable future rather than in a small batch. Partner boards, which will cost $20 less and come with 8GB of RAM, are notably absent at launch. With Intel's own card offering twice the memory for only a little more cash, it's easy to see why no partners have been announced: the 8GB card is a poor value by comparison.
For basic specs, it's a 225W GPU with one eight-pin and one six-pin power connector. It sports three DisplayPort 2.0 ports and a lone HDMI 2.1 connector. The memory runs at 17.5Gb/s on a 256-bit bus, for 560GB/s of bandwidth. Cooling is handled by two 90mm fans, a large heatsink, and a copper vapor chamber with four heat pipes. There's even some tasteful "Intel blue" lighting thrown in.
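Those memory numbers are internally consistent: bandwidth is simply the per-pin data rate multiplied by the bus width, divided by eight bits per byte. A quick sanity check in Python (the variable names are just for illustration):

```python
# Back-of-the-envelope check of the A770's memory bandwidth spec.
data_rate_gbps = 17.5   # effective data rate per pin, in Gb/s
bus_width_bits = 256    # memory bus width, in bits

# Bandwidth (GB/s) = per-pin rate (Gb/s) * bus width (bits) / 8 (bits per byte)
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # prints "560 GB/s"
```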
The A770 was marketed as a challenger to Nvidia's RTX 3060 and AMD's RX 6600, and it largely succeeds in that effort, with several huge caveats. The first is that Resizable BAR is required, a feature supported only on newer platforms: Ryzen 3000 CPUs, Intel 10th Gen, and later. If you're on an older platform, fuggetaboutit. This is an odd "feature" for Intel to rely on, as AMD and Nvidia GPUs don't need it to perform as expected.
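If you're unsure whether your system exposes Resizable BAR, it's worth checking before buying. Here's a minimal sketch for Linux, assuming `lspci` from pciutils is available; the exact capability string can vary between versions, and full capability output generally requires root, so treat this as a starting point rather than a definitive check:

```python
# Minimal sketch: list PCI devices that advertise a Resizable BAR
# capability on a Linux system. Assumes `lspci` (pciutils) is installed;
# run with elevated privileges so capabilities are fully reported.
import subprocess

def rebar_devices() -> list[str]:
    out = subprocess.run(
        ["lspci", "-vv"], capture_output=True, text=True, check=True
    ).stdout
    devices, current = [], None
    for line in out.splitlines():
        if line and not line[0].isspace():
            current = line           # a new device header line
        elif "Resizable BAR" in line and current:
            devices.append(current)  # this device advertises ReBAR
            current = None           # avoid duplicate entries
    return devices

if __name__ == "__main__":
    for dev in rebar_devices():
        print(dev)
```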
You also need to be playing games that run on the DirectX 12 or Vulkan APIs. Older games that use DX11 run much worse on Intel's GPU than on its competitors'. This was a conscious choice by Intel, which wants to focus "on the future" rather than on older games. The trouble is that a lot of popular games use DX11 and older APIs, and Arc simply isn't built for those gamers. As one example, Rainbow Six Siege ran at 93fps at 1080p in DX11, but 315fps in Vulkan, more than three times faster.
The good news is that in certain DX12 games Intel has optimized for, the A770 competes very well with its counterparts. In some scenarios, it even keeps pace with the Radeon RX 6700 XT, which is almost twice as expensive. For example, it hit 63fps at 1440p in Red Dead Redemption 2, a notorious GPU killer. It also matched the RTX 3060 at 1080p in Shadow of the Tomb Raider, with both cards delivering over 121fps. Enabling Intel's XeSS super-sampling technology boosted frame rates considerably too; unfortunately, few games support it yet. Overall, this GPU can perform quite well, in certain games.
PCMag also discovered two bizarre behaviors in its testing: abnormally high scores in synthetic tests, and high overall power draw. The synthetic scores could point to untapped potential waiting on better drivers, or Intel may simply have optimized its drivers heavily for those tests. Either way, it's odd that the A770 outperformed the RTX 3090 in FurMark; something is clearly not right with that result.
As for power consumption, the Intel test system drew more than twice the wattage of competing systems at idle: 113W, where the AMD- and Nvidia-based systems sipped about 60W. Under load in games, it also drew more total system power than the competing cards. PCMag reports the card itself was consuming around 50W at idle, with nothing open, which is just weird. GPU usage also hovered between 10 and 20 percent with nothing running. This could be a driver bug, as it's certainly unexpected behavior.
Intel's first GPU certainly has one advantage in the current market: 16GB of RAM on a $349 card. Neither AMD nor Nvidia can match that combination right now, and they likely won't have to; the competing cards put up a good fight without the same caveats. That leaves Intel's GPU as a curiosity. The decision boils down to whether you play the games this GPU excels at, and whether you need to upgrade now. Neither AMD nor Nvidia is likely to launch next-gen midrange GPUs until sometime next year, which means Intel will be competing in the upper-midrange market against last-gen cards for the foreseeable future. Assuming it keeps improving its drivers, it might have a fighting chance.
Still, it will be a long time before Intel gains enough market share to become a real threat to AMD and Nvidia. Despite the long road ahead, we're sure its competition is watching the situation closely. Regardless of Intel's fumbles out of the gate, it's exciting to finally see a third player enter the game. Whether that excitement (and curiosity) translates into actual sales remains to be seen. Intel's driver team has a lot of work to do in the coming months, and it will be interesting to see what the company comes up with for its next-gen architecture, dubbed Battlemage.
For now, it’s a decent first offering, with the promise of better things to come. Intel says it’s committed to Arc for the long haul, and that most of its engineers are already working on Battlemage. At least now Intel has a product it can iterate on that’s in consumers’ hands. Sure, those consumers will be doing some real-world driver testing for the company. However, that’s always been part of the “early adopter tax.” If you’re one of them, you’ll be able to buy an Arc GPU on Oct. 12.
Source: www.extremetech.com