
Quoting: Kimyrielle
They may have hoped for that 50xx refresh that ended up getting postponed for at least a year.
Well, typically you don't design games for hardware that's not even out yet. You go for stuff that has been out for 3-4 years, so there's an installed base. Even IF the upgraded 50XX GPUs become a thing (there are rumors that they might get canceled), the game will likely launch before any of them has reached a customer.
You'd think that... When was the last time you tried cranking up a game to max settings? A 5080, arguably the top consumer card right now, straight up crashes if you try that at 1440p in Indiana Jones, a game from 2024, released prior to the paper launch of this generation of GPUs. And that's after a year of patching and some optimisation. Not enough VRAM. 007 was supposed to release roughly at the point where nVidia typically releases (or would've been close to releasing) its refreshes. 12GB VRAM for 1080p60 sends a clear message here, from my point of view.

I mean, just a peek at raster performance a notch above 1080p (a reminder that nVidia was advertising Turing GPUs as 8k-capable) tells you where the current ceiling is, and it's not at 16GB VRAM where the top consumer cards are, which also cost well past what's reasonable to spend on graphics.

007 will function on 8GB, I don't doubt that, but if you need a current xx90 GPU to play a current game at max settings, that game isn't designed with current-gen hardware in mind on the high-end. GPUs that used to be intended basically for professionals are now needed for silly games that don't even run that well at resolutions matching the current TV standards?

Basically, looking at those requirements I don't see how IOI wasn't counting on the industry to finally move on from 8GB to 12GB, and consequently from 16GB to whatever nVidia would deem us consumer peasants worthy of...
