
AMD just revealed a game-changing feature for your graphics card


AMD is set to present a research paper on its technique for neural texture block compression at the Eurographics Symposium on Rendering (EGSR) next week. It sounds like technobabble, but the idea behind neural compression is pretty simple. AMD says it's using a neural network to compress the massive textures in games, which cuts down on both the download size of a game and its demands on your graphics card.
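AMD's paper isn't out yet, so the sketch below is not its method; it just illustrates the general idea behind learned texture compression: instead of storing every texel, you store the parameters of a small fitted model that can reproduce the texture. Here, random Fourier features plus a least-squares fit stand in for a small neural network, and the toy 64x64 texture is made up for the example.

```python
import numpy as np

# Illustrative sketch only, NOT AMD's (unpublished) technique: fit a
# small learned model to a tiny 64x64 greyscale "texture", then compare
# the number of stored parameters to the raw texel count.
rng = np.random.default_rng(0)
size = 64
u, v = np.meshgrid(np.linspace(0, 1, size), np.linspace(0, 1, size))
texture = 0.5 + 0.5 * np.sin(8 * np.pi * u) * np.cos(6 * np.pi * v)  # toy texture

# Random Fourier features: a crude stand-in for a small neural network
# that maps UV coordinates to a texel value.
n_features = 256
B = rng.normal(scale=8.0, size=(2, n_features))
coords = np.stack([u.ravel(), v.ravel()], axis=1)                # (4096, 2)
phi = np.concatenate([np.sin(coords @ B), np.cos(coords @ B)], axis=1)

# Least-squares fit of the feature weights to the texel values.
w, *_ = np.linalg.lstsq(phi, texture.ravel(), rcond=None)
recon = (phi @ w).reshape(size, size)

raw_bytes = texture.size * 1           # 1 byte per 8-bit texel
model_bytes = (B.size + w.size) * 2    # fp16 weights for B and w
print(f"raw: {raw_bytes} B, model: {model_bytes} B, "
      f"ratio: {raw_bytes / model_bytes:.1f}x")
```

The payoff in a real system is that the "decompressor" (the network) runs on the GPU on demand, so the full-size texture never has to live in memory or in the game download.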


We’ve heard about similar tech before. Nvidia introduced a paper on Neural Texture Compression last year, and Intel followed up with a paper of its own that proposed an AI-driven level of detail (LoD) technique that could make models look more realistic from farther away. Nvidia’s claims about Neural Texture Compression are particularly impressive, with the paper asserting that the technique can store 16 times the data in the same amount of space as traditional block-based compression.
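To put that 16x claim in perspective, it helps to run the numbers against today's block compression formats. The block sizes below are the standard BCn figures (BC1 packs a 4x4 RGB block into 8 bytes; BC7 packs a 4x4 RGBA block into 16 bytes); the "neural" line simply applies Nvidia's claimed 16x factor to the BC7 size.

```python
# Back-of-the-envelope sizes for a 4K texture under traditional block
# compression versus Nvidia's claimed 16x improvement.
def bc_size(width, height, bytes_per_block):
    """Size of a BCn-compressed texture built from 4x4 texel blocks."""
    return (width // 4) * (height // 4) * bytes_per_block

w = h = 4096                      # a 4096x4096 texture
raw = w * h * 4                   # uncompressed RGBA8, 4 bytes/texel
bc7 = bc_size(w, h, 16)           # BC7: 16 bytes per 4x4 block (1 B/texel)
bc1 = bc_size(w, h, 8)            # BC1: 8 bytes per 4x4 block (0.5 B/texel)
neural = bc7 // 16                # the paper's claimed 16x over block compression

print(f"raw RGBA8: {raw >> 20} MiB")      # 64 MiB
print(f"BC7:       {bc7 >> 20} MiB")      # 16 MiB
print(f"BC1:       {bc1 >> 20} MiB")      # 8 MiB
print(f"16x claim: {neural >> 20} MiB")   # 1 MiB
```

At those ratios, a texture set that fills an 8GB card under BC7 would fit in a few hundred megabytes, which is why all three GPU vendors are chasing this.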

AMD hasn't released its paper yet, so there aren't many details about how its method works. The key to Nvidia's approach is that it leverages the GPU to decompress textures in real time, meaning the smaller compressed textures take up less space in video memory. VRAM pressure has been an issue in several games released in the past couple of years, from Halo Infinite to The Last of Us Part I to Redfall. In all of these games, you'll notice low-quality textures if you run out of VRAM, which is particularly common on 8GB graphics cards like the RTX 4060 and RX 7600.

One detail AMD did reveal is that its method should be easier to integrate. The tweet announcing the paper reads, "unchanged runtime execution allows easy game integration." Nvidia hasn't said whether its technique is particularly hard to integrate, nor whether it will require specific hardware to work (though the latter is probably a safe bet). AMD hasn't mentioned any particular hardware requirements, either.

We'll present "Neural Texture Block Compression" @ #EGSR2024 in London.

Nobody likes downloading huge game packages. Our method compresses the texture using a neural network, reducing data size.

Unchanged runtime execution allows easy game integration. https://t.co/gvj1D8bfBf pic.twitter.com/XglpPkdI8D

— AMD GPUOpen (@GPUOpen) June 25, 2024

At this point, neural compression for textures isn’t a feature available in any game. These are just research papers, and it’s hard to say if they’ll ever turn into features on the level of something like Nvidia’s DLSS or AMD’s FSR. However, the fact that we’re seeing AI-driven compression from Nvidia, Intel, and now AMD suggests that this is a new trend in the world of PC gaming.

It makes sense, too. Features like DLSS have become a cornerstone of modern graphics cards, serving as an umbrella for a large swath of performance-boosting features. Nvidia's CEO has said the company is looking into more ways to leverage AI in games, from generating objects to enhancing textures. As features like DLSS and FSR continue to become more prominent, it makes sense that AMD, Nvidia, and Intel would look to expand their capabilities.

If neural texture compression does become a marketable feature, it will likely show up with the next generation of graphics cards. Nvidia is expected to reveal its RTX 50-series GPUs in the second half of the year, AMD could showcase its next-gen RDNA 4 GPUs in a similar time frame, and Intel's Battlemage architecture is arriving in laptops in a matter of months through Lunar Lake CPUs.

Jacob Roach
Former Digital Trends Contributor