H[ARC]! A New Challenger [Intel] Enters The Arena

Last updated on 2021/08/19

Today Intel took the wraps off its long-awaited GPU platform.

Arc.

Technology is great, wonderful, and amazing, and all of that jazz. The best news here is that there is more competition in the gaming space.

Once we get beyond that aspect of this announcement, I find myself just shrugging my shoulders and moving on with my day.

Don’t get me wrong – I have spent more than my fair share of time reading articles about transistor counts and CUDA cores and ray tracing.

Even after doing more research and reading than most, what stands out here is watching a younger gamer turn on ray tracing on a brand-new Nvidia RTX 2060, expecting something absolutely amazing. The telling comment came not long afterward.

Is it working?

Then I had to explain to a 14-year-old what ray tracing actually is and then point out the small reflections in a few areas of the screen that were, indeed, showing that it was working.

The problem here is very simple. Computer graphics processing has fallen into the same trap as cell phones.

Every year Apple builds and releases new phones. But going back to the first iPhone in 2007, there hasn't been much fundamental change in them. They have been maddeningly iterative.

Sure, the cameras get slightly better every year, the processors get slightly faster. The memory bumps up every few years. The screens get gradually bigger and sharper.

GPUs are mired in the same morass of slogging along with higher and higher transistor counts and more and more buzzwords heaped atop larger and larger silicon dies.

Nobody is doing anything revolutionary here. I don’t think that Intel is even trying to tackle that topic.

Yes, Intel has already answered the big question that is basic table stakes for the way we have gamed for years now.

Their GPUs can run Crysis.

For ages that has been a rallying cry that became the de facto, memed litmus test for quality hardware: Can it run Crysis?

We are 14 years beyond when this question was first posed. That is an incredibly long time in computing years. Isn’t it time that we start looking for the next revolution in this space?


Having managed to score an overclocked Nvidia 3070 last year after coming from a Radeon RX 580, sure, the graphics looked a little better. However, once the initial excitement of new hardware faded, the bigger, beefier, and supposedly more impressive GPU didn't change my gaming experience much at all.

So back to Intel. The fact that there is another big-name player in the graphics space is pretty awesome. Anything that provides more options and more competition is hugely welcome in a space that has for years suffered from lack of product, price hikes, and overinflated hype.

But don’t expect me to go stand in line somewhere to grab one of these new Intel GPUs.

The coolest thing about the Intel announcement is that they have chosen some awesome codenames for their upcoming lineups of cards.

The first in the lineup has been tagged with the “Alchemist” moniker.

Down the road, we should see “Battlemage,” “Celestial,” and “Druid” versions come out.

From a marketing aspect, those are pretty sweet.

However, there isn’t much other substance to the announcement beyond the usual fanfare that accompanies whatever next-gen cards Nvidia and AMD tease, the ones we might be able to buy in a few years once they learn how to make enough chips to go around.

After thinking about it for a while, it all feels kind of like Intel is late to a game that is going to follow Moore’s Law into obsolescence at some point.

But we should all take a step back here. Computer graphics have been iterative for a long time now. The Crysis question is 14 years old.

Even as amazing as AI super-sampling is, it is still only an iterative step forward in this space.

And, yes, Intel has its own variant, XeSS, to contend with Deep Learning Super Sampling (DLSS) from Nvidia and FidelityFX Super Resolution (FSR) from AMD.

Yes, it is cool and allows for higher output from lower spec graphics cards, which is a good thing.

However, it is still showing us the same graphics, just sharper and with more polygons.

Where is the tech pushing immersive gaming or pushing to break through the uncanny valley?

Our imaginations have already been piqued by the prospect of virtual reality gaming.

Hollywood has made sure that this concept is firmly rooted in our minds. Movies like Ready Player One, Gamer, and others have laid this groundwork for what next-gen gaming can look (and feel) like.

If we want to go deeper down the white rabbit hole, then we can go back even further to classics like Tron, Total Recall, and even the Matrix movies to see the groundwork that has been built up over time.

If Intel wants to convert someone like me, then tackle the generational problem and show me what is next – not what is now.

Show me what this thing can do to run multiple 4k displays in a VR headset to completely remove any hint of a screen door effect.

Work with the teams writing the revolutionary new platforms that will sweep away the DirectX, Vulkan, and OpenGLs of the world.

Find ways to take gaming to the next level and create ways for game makers to create new, immersive worlds that we can get lost in.


The entry of a new GPU player to the gaming space is a welcome addition, hands down.

Will they continue nudging the ball forward on what feels like a linear track of growth and new functionality?

Sure.

Will they present something absolutely amazing and wonderful and make people forget about looking for that elusive RTX 3090 and jump ship to “Battlemage”?

Probably not.

When it is all said and done, I am rooting for Intel. You just won’t find me grabbing one of their cards to slap into my gaming rig anytime soon.

Thanks for reading!