Sponsor: Montech HyperFlow 360 Cooler on Amazon https://geni.us/dWBIbF6
NVIDIA killed 32-bit CUDA support on the 50-series, including PhysX 32-bit on the RTX 5090, 5080, 5070 Ti, and everything else on the 50 series. This means that we get to dust off the GTX 580 and GTX 980 and pit them vs. the RTX 5080 in Mirror's Edge, Mafia II, Borderlands, Batman, and Metro in what feels like the ultimate throwback to about 15 years ago. Although this may feel like an isolated issue for a small list of old games, we think it speaks to a potentially broader concern of intercompatibility and vendor lock-in. We also get an opportunity for a PhysX and physics GPU history lesson.
The best way to support our work is through our store: https://store.gamersnexus.net/
Grab a PC building Modmat! https://store.gamersnexus.net/products/large-modmat-gn15-anniversary
And check out our high heat resistant silicone project and soldering mat! https://store.gamersnexus.net/products/gn-project-soldering-mat
Or grab our PC case magnets and badges! https://store.gamersnexus.net/products/gn-3d-multi-level-pc-case-magnets-amd-ryzen-intel-gn-logo
Like our content? Please consider becoming our Patron to support us: http://www.patreon.com/gamersnexus
RELATED CONTENT
Watch our video about Fake MSRPs: https://www.youtube.com/watch?v=SPE95_RnL_Q
And our video about the missing ROPs: https://www.youtube.com/watch?v=PEXYZgVfOBM
TIMESTAMPS
00:00 - GTX 580 is Back and Better Than Ever
02:39 - NVIDIA Hasn't Fully Killed PhysX... Yet
03:58 - History of PhysX
06:31 - This Was Predicted
10:14 - Test Methods and Choices
13:27 - Benchmarks: Mafia II
17:27 - Benchmarks: Metro Last Light
20:58 - Accelerator GPU Utilization
21:36 - Benchmarks: Mirror's Edge
23:42 - Benchmarks: Borderlands 2
25:22 - Benchmarks: Batman Arkham City
28:14 - Conclusion
** Please like, comment, and subscribe for more! **
Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
Follow us in these locations for more gaming and hardware updates:
t: http://www.twitter.com/gamersnexus
f: http://www.facebook.com/gamersnexus
w: http://www.gamersnexus.net/
Our policies, processes, and ethics statements relating to review samples, advertising, travel, errors, and more are transparently and publicly available on this page: https://gamers.nexus/ethics-statements
Steve Burke: Host, Writing
Patrick Lathan: Testing, Writing
Vitalii Makhnovets: Editing
Tim Phetdara: Editing
TRANSCRIPT
00:00This new GTX 580 is up to 81% better than NVIDIA's technologically outdated RTX 5080.
00:09And this is, it's not new, it's from 2010.
00:13But it's suddenly relevant again. We even brought back this GTX 980.
00:17Here it is running against the $1,000 plus RTX 5080 from 2025.
00:21The 980 is a better gaming experience than the 5080 in some of these tests today.
00:25We even dug Mirror's Edge out of the grave for this, surviving an onslaught of EA Origin pop-ups to do so.
00:30You can see something similar here. The 5080 struggles versus older hardware.
00:34In fact, we even put the RTX 5080 with the GTX 980 together as an accelerator card to improve the performance.
00:40And all of that is because of what's on the screen now.
00:43The way it's meant to be played. NVIDIA's old slogan.
00:46Where the company said that PhysX was a graphics feature that would change the way the games looked and felt.
00:53And they were right. Especially when they stopped supporting it.
00:57A couple weeks ago, a user responded to an NVIDIA driver feedback thread.
01:00Saying that, quote, PhysX still isn't working on the 50 series cards.
01:03Attempting to force it on via a config file edit in Borderlands 2 and turning on the PhysX indicator shows it's running on the CPU. End quote.
01:10In response, NVIDIA stated that, quote, this is expected behavior as 32-bit CUDA applications are deprecated on the GeForce RTX 50 series GPUs. End quote.
01:19Citing a post from January 17th that had, up until then, drawn little attention from the gaming community.
01:25This is an insight into what happens when vendor-specific solutions are abandoned.
01:29And even though the affected games are ancient, it breathes some healthy skepticism back into topics like vendor-specific graphics improvements.
01:38So, today, we have a lot of those. DLSS and its many sub-features. Things like FG, frame generation, multi-frame generation, DLAA, Reflex, any number of these special features that they're pushing.
01:50And the same goes for other companies like AMD.
01:53Even ray tracing. There's no guarantee that ray tracing continues to work the way it does today with the fixed hardware that we have in GPUs today.
02:01That stuff could change.
02:03And what we're looking at today with PhysX 32-bit support on these ancient games is an insight into that world.
02:10So, let's get into the PhysX situation.
02:12Before that, this video is brought to you by the Montech HyperFlow series of liquid coolers.
02:16The Montech HyperFlow aims to compete with other affordable liquid coolers by fighting on price to performance.
02:22In our testing, cooler performance was comparable to others in its class.
02:26The ARGB HyperFlow has an illuminated pump block with RGB LED fans that have quality rubber bumpers for vibration damping.
02:33The cooler has a 6-year warranty and supports all modern Intel and AMD sockets.
02:37And you can learn more at the link in the description below.
02:40Rumors of its demise are slightly exaggerated. Not greatly.
02:43NVIDIA, at this point, hasn't fully killed PhysX. Not yet, anyway.
02:50And what they have killed is 32-bit CUDA support.
02:54And by extension, that means that 32-bit PhysX games are affected.
02:59A user on the ResetEra forum has compiled a list of 32-bit PhysX games based on a Wikipedia article.
03:05And big thanks to those guys for rescuing lists from rules-obsessed Wikipedia editors and their crusade to convert every list into a useless categories page.
03:13Yes, this bothers us.
03:16It'll be a separate video.
03:19With enough likes, we'll do that video.
03:21PhysX has had a long, long history, and it's not actually gone.
03:25For example, newer games with 64-bit PhysX shouldn't be affected.
03:28We were particularly confused when researching this story by the February 18th Tom's Hardware article,
03:34which claims in the leading statements, quote,
03:37As far as we know, there are no 64-bit games with integrated PhysX technology, end quote.
03:42But it then mentions Metro Exodus and The Witcher 3, both of which are 64-bit games.
03:47Further adding to the confusion, PhysX doesn't have to be accelerated on the GPU.
03:52It could be done on the CPU.
03:54So removing GPU acceleration doesn't necessarily completely kill the feature in games.
03:58In order to better explain the situation, we first need to redefine what PhysX is, or what it was trying to be.
04:03It was part of NVIDIA's SDK suite for developers.
04:07And what NVIDIA said it was trying to do with it was make it easier to integrate high-quality physical effects in games.
04:15The downside was that it'd run either exclusively on, or at minimum just better on, NVIDIA hardware.
04:20This is a familiar story even to today.
04:22NVIDIA with these features has historically also driven integration by providing engineering resources to developers.
04:28So back with the GameWorks stuff, one of the things NVIDIA was doing was sending engineers out to different game developer campuses
04:35to help them program and integrate features in their games.
04:38And while they were there, sometimes they'd even optimize the drivers.
04:41So this was all a part of NVIDIA's relationships with developers.
04:44It was really focusing on them back then.
04:46To paraphrase Wikipedia, PhysX was originally developed by NovodeX AG,
04:51which was acquired by AGEIA in 2004, which was acquired by NVIDIA in 2008.
04:57It originally ran on a discrete PPU, or physics processing unit.
05:01These were accelerator cards.
05:03But NVIDIA adapted the tech to run on CUDA, and specifically CUDA cores.
05:07We're only concerned with NVIDIA PhysX titles here.
05:10Pre-2008 games from the AGEIA era already required the PhysX legacy installer,
05:15and we're not going to go far enough down the rabbit hole to test whether that still works.
05:20PhysX is a physics engine SDK, usually associated with destructible environments, ragdolls,
05:25fluid, fabric, interactive fog, and particles in general.
05:29NVIDIA's tagline for games it collaborated on in the heyday of PhysX was,
05:33the way it's meant to be played.
05:34And PhysX was the poster child for that philosophy.
05:37Games like Mafia 2 looked dramatically different with PhysX enabled.
05:41They even put NVIDIA's name on the cars that get shot to pieces and blown up in the benchmarking scene.
05:47That's an interesting message to send.
05:49NVIDIA worked with the developers to turn the NVIDIA experience into literally the way it's meant to be played.
05:56If you turn off PhysX in games like Mirror's Edge, it straight up removes objects from scenes.
06:01If you were using AMD, or God forbid, Intel graphics in 2010,
06:06you were getting inferior versions of the highest profile AAA games.
06:09And that was before they invented quadruple A games.
06:14Yeah, I'm never going to forget when they said that.
06:16PhysX can be run on CPUs, but typically not well, which gives NVIDIA a big advantage with PhysX.
06:21And all of this is kind of an old topic, but again, it becomes relevant today for all the reasons we're talking about,
06:27plus some big ones in the conclusion.
06:29David Kanter, who has appeared on the channel many times as a technical analyst
06:33and helped us with our Intel Fab Tour deep dive, actually wrote about what is happening today 15 years ago.
06:41And so David, you finally get your I told you so moment for NVIDIA.
06:46It took 15 years, but you do finally get to say I told you so.
06:50As David Kanter said in 2010, quote,
06:52the sole purpose of PhysX is a competitive differentiator to make NVIDIA's hardware look good and sell more GPUs.
06:58He complains that, quote, PhysX uses an exceptionally high degree of X87 code and no SSE,
07:04which is a known recipe for poor performance on any modern CPU.
07:07Even the highest performance CPUs can only execute two X87 operations per cycle, end quote,
07:13meaning that performance will always remain bad, even with 2025 era CPUs.
07:19NVIDIA responded to Kanter back then by stating that PhysX 2.X, quote,
07:24dates back to a time when multi-core CPUs were somewhat of a rarity, end quote.
07:29NVIDIA noted that it was still possible to implement multithreading and seems to place the blame on the developers.
07:34That hasn't changed at all, actually.
07:38That's actually just the same.
07:40Further stating that SSE and multithreading support would be improved.
07:44PhysX 2.8.4, quote, with the SSE2 option to improve performance in CPU modes, end quote,
07:50was released a month after Kanter's post.
07:53So he can claim an accomplishment there.
07:55And PhysX 3.0 was released in 2011 with, quote, effective multithreading.
08:00Our understanding is that David Kanter's post back then created quite the firestorm inside of NVIDIA
08:05as they tried to figure out how to deal with the bad publicity that was starting to pop up around PhysX
08:10and how it might kind of restrict these graphics features behind what was a suboptimal implementation.
08:18All of that puts the 32-bit games from the era between a rock and a hard place.
08:23And these are games that are now losing GPU acceleration.
08:27And as a bonus, they run like absolute shit, even on modern CPUs.
08:32We've seen games as late as 2014, like Borderlands: The Pre-Sequel, that appear to use PhysX 2.X.
08:38In contrast, some more modern post 2.X games like The Witcher 3 don't offer GPU acceleration at all,
08:45instead forcing PhysX to always run on the CPU.
08:48So if PhysX really did start to prioritize just running on CPUs post 3.0,
08:53that would explain why NVIDIA suddenly lost interest.
08:56With the release of 3.0, quote,
08:58GPU hardware acceleration in SDK 3.0 is only available for particles or fluids,
09:03which helps explain why The Witcher 3, with its PhysX clothing, isn't GPU accelerated,
09:07although NVIDIA had originally claimed that it would be.
09:10And that also hasn't changed.
09:15They're still claiming stuff that sometimes just doesn't happen.
09:17The last time we thought about PhysX was with NVIDIA Flex in 2014,
09:22most notable in our minds for being completely broken in Fallout 4 since the 20 series.
09:27There are still engineers toiling away within the NVIDIA Omniverse framework, though,
09:32with the most recent PhysX release at the time of writing being 5.5.1 from early February this year.
09:40We agree with Kanter's analysis of the time and see many parallels to today.
09:43It's just that lock-in can become really problematic,
09:46and if the hardware vendor suddenly finds out that their solution runs better on other devices,
09:52they might just lose interest.
09:54We sort of saw this with Intel's APO when it first came out,
09:56where Intel only supported the CPUs that needed it the least,
10:00presumably because those that needed it the most,
10:03which were within the same generations and same architecture family,
10:07could actually benefit from it in a way that might reduce interest in sales of the higher-end CPUs,
10:13if it ever took off.
10:14Getting to performance testing, we selected five 32-bit games for testing.
10:18Batman: Arkham City, Borderlands 2, Mafia 2, Metro Last Light, and Mirror's Edge.
10:22We rejected Assassin's Creed 4 Black Flag because it has an FPS cap.
10:27But more importantly, as AMD owners may already know,
10:30you aren't given the option to use PhysX at all without GPU acceleration in that one.
10:34This is especially galling because the big PhysX update for Black Flag in 2014
10:39appears to have brought the PhysX version up to 3.3.0,
10:43so there's a chance it might not run completely terribly on a CPU if it were allowed to do so.
10:49And that brings us to another point,
10:50which is that none of the stuff in this video today is news to AMD users, especially of that era.
10:55AMD users are used to getting shafted for things like PhysX,
10:59where it's just not an option.
11:01So for those of you who've been on AMD,
11:04you get to now say to owners of $2,000 RTX 5090s,
11:08welcome to my world.
11:10Maybe a bit of vindication for you.
11:12Although it is for games that are 15 years old, but, you know, the point stands.
11:16So, methodologically, for determining the PhysX version:
11:18judging by the admittedly not very scientific method of inspecting for PhysX
11:23by opening the DLLs and Ctrl-F'ing for the different PhysX versions
11:27in all the different games,
11:28the ones we're testing all appear to be using PhysX 2.X,
11:31some version of 2.something.
11:33And remember, the pre-3.0 PhysX games
11:37are the ones that are likely to run poorly without GPU acceleration.
11:42Those are the ones that'll be the most affected.
11:44Mirror's Edge uses 2.8.0.
11:45Mafia 2 and Metro Last Light use 2.8.3.
11:48Borderlands 2 and Batman Arkham City use 2.8.4.
11:51These last two could therefore have better CPU performance
11:54due to the added SSE2 option,
11:56depending on the developer's implementation.
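If you want to replicate that DLL check on your own installs without eyeballing each file, a rough script equivalent of the Ctrl-F method looks like the sketch below. This is our illustration rather than anything from the video: the folder path is a placeholder, the DLL name filter and version-string pattern are heuristics, and some games may embed version info differently, so treat missing output as inconclusive rather than proof.

import re
import sys
from pathlib import Path

# Heuristic: look inside PhysX/APEX-related DLLs for anything shaped like a
# 2.x.x or 3.x.x version string sitting near a "PhysX" or "APEX" marker.
VERSION_RE = re.compile(rb"(?:PhysX|APEX)[^\x00]{0,64}?(\d\.\d\.\d)", re.IGNORECASE)

def scan(game_dir: str) -> None:
    for dll in Path(game_dir).rglob("*.dll"):
        name = dll.name.lower()
        if "physx" not in name and "apex" not in name:
            continue  # only bother with PhysX/APEX runtime DLLs
        hits = sorted({m.group(1).decode() for m in VERSION_RE.finditer(dll.read_bytes())})
        if hits:
            print(f"{dll.name}: {', '.join(hits)}")

if __name__ == "__main__":
    scan(sys.argv[1])  # e.g. python scan_physx.py "C:/Games/Mafia II"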
11:58We used the latest compatible driver for each GPU,
12:00which was 572.47 for the 5080 and the 980,
12:04and 391.35 March 2018 for the 580.
12:07We also used the PhysX system software version
12:09that shipped with those drivers for each of these cards,
12:12so that'd be 9.23.1019 for the newer cards
12:16and 9.17.0524 for the 580.
12:20All games were run at 1080p on our 9800X3D GPU test bench.
12:23Nothing has changed there.
12:24Same hardware. G-Sync universally disabled.
12:26ReBAR enabled everywhere possible.
12:28The 580, for example, doesn't support it.
12:30And then CSM was also enabled by necessity for the 580.
12:34Installing the 980 as a secondary GPU pulls eight lanes
12:37from the 5080 in the first slot,
12:39so that also could have a small impact.
12:42And then the NVIDIA control panel was used
12:44to manually align the PhysX processing tasks
12:47with the secondary card in those situations
12:50and was also used to assign the processing to each device.
12:53And we confirmed that PhysX processing with the 5080
12:55was always performed on the CPU,
12:57regardless of assignment in these specific games.
13:0064-bit PhysX games should run normally on the 50 series.
13:04That should not be affected.
13:05That's not what we're testing, though.
13:07We also didn't test the 980 with CPU PhysX.
13:10It was tempting, but outside of the scope.
13:12We did it with the 580, though.
13:13Comparing PhysX performance is a little tricky
13:15because performance depends on the scene.
13:17Mirror's Edge runs fine on 50 series cards
13:19until broken glass shows up, for example.
13:21The benchmarks we've selected here are worst case scenarios
13:24that feature heavy PhysX, since that's what we're testing.
13:27Let's get into it.
13:28Mafia II is up first,
13:29mostly because the canned benchmark scene
13:31starts with the explosion of two cars
13:33plated with NVIDIA after they're shot to pieces.
13:36It's like they knew this was going to happen.
13:38Mafia II has pros and cons as a test candidate.
13:41It was a flagship PhysX title with prominent effects,
13:44but on Steam, the original version
13:46is now only available as a bundle with the 2020 remaster,
13:49which makes it less likely that players will revisit the original.
13:52Not everyone was happy with the remake, though.
13:54Mafia II really went all in with NVIDIA at the time.
13:57It uses NVIDIA Apex,
13:58which included modular PhysX components
14:00like clothing and particles.
14:02NVIDIA noted that, quote,
14:03when the Apex PhysX setting is at medium or high,
14:06the game will utilize both your CPU
14:08and a suitable NVIDIA GPU,
14:10which doesn't include the 5080, and, quote,
14:12clothing effects will always run on the CPU
14:15unless you have a second PhysX-capable NVIDIA GPU
14:18dedicated solely to PhysX.
14:20And finally, quote,
14:21Apex clothing effects are the most strenuous aspect of the game,
14:24end quote,
14:25that could mean that PhysX on the CPU has a chance here.
14:28We maxed out the settings
14:29and set Apex PhysX to high when enabled.
14:32The Apex clothing effects are obvious
14:34if you're looking for them,
14:35but really this benchmark is a showcase
14:37for rubble, chunks of concrete,
14:39glass being blown apart,
14:40and wood scattered across the scene with PhysX on.
14:43And there's no replacement for them with PhysX off.
14:45They just disappear.
14:47This is another game where the,
14:48we'll call it real, experience requires PhysX.
14:51Here's the chart.
14:52It's insane.
14:53With PhysX on, it's actually kind of comical,
14:56the RTX 5080 is the worst performer on the chart.
14:59That's right, over $1,000 to get worse performance
15:03than the GTX 580.
15:05Even more insulting than the chart,
15:07Mafia 2's benchmark concludes with a graded result.
15:11You get a letter grade on it.
15:13The 5080 got a D.
15:16Quote,
15:23And this is coming from a game so old
15:26that it thinks that Windows 11 is Windows 7.
15:29Believe us, Mafia 2,
15:31we'd all be happier if that were the case.
15:33The 5080 averaged 30.3 FPS
15:36with wild variation between individual benchmark passes
15:39showing behavior nearly identical to the GTX 580
15:42when PhysX processing was assigned to the CPU with that card,
15:46including the variation between passes.
15:48With PhysX processing done on the GPU,
15:50the GTX 580 is 85% ahead of the 5080
15:55with improved lows as well.
15:57Just to reiterate, this card is 15 years old
16:01and it's outperforming the 5080.
16:03The card is old enough to get a driving permit in the US.
16:06Now all we need is a card old enough to drink
16:09and we'll finally understand NVIDIA's decision-making
16:11for January of this year.
16:13On the 580, GPU PhysX is 81% ahead of CPU PhysX
16:17at 56 to 31 FPS,
16:20and CPU PhysX sucks.
16:22So far, that lines up with our theory
16:24about 2.8.4 onwards being more CPU favorable
16:27since Mafia 2 appears to be on 2.8.3.
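For anyone double-checking these numbers, the uplift percentages used throughout are just standard relative change, nothing GN-specific:

\[
\text{uplift} \;=\; \left(\frac{\text{FPS}_{\text{new}}}{\text{FPS}_{\text{baseline}}} - 1\right) \times 100\%,
\qquad
\frac{56}{31} - 1 \approx 0.81 \;\Rightarrow\; \text{roughly } 81\%.
\]

The same arithmetic applied to the PhysX-off results below is where figures like the 1,267% improvement come from.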
16:30Turning PhysX off gives the 5080 a 1,267% improvement
16:35to 414 FPS.
16:38So it may be worth taking that route
16:40if you're stuck with a low-performing 50 series card
16:43and you can't afford to upgrade to a GTX 580 anytime soon.
16:47Of course, if you can upgrade from a 5080 to a GTX 980,
16:50there's also room there.
16:51It's a 198% improvement.
16:53Alternatively, you could travel back to 2006
16:56and add a dedicated PhysX accelerator
16:58paired with the 5080.
17:00We used a newer GTX 980 as the accelerator in this case,
17:03which gave a respectable 441% uplift to 164 FPS average.
17:07The 0.1% lows were at 60 FPS
17:09and made it the best balance of performance and visuals on this chart.
17:12NVIDIA has finally achieved what it always wanted.
17:15You now need a dedicated accelerator GPU
17:18for PhysX processing.
17:20It took them 15 years to do it,
17:22but that's what they've always wanted:
17:23more attachment to the board.
17:252013's Metro Last Light
17:27is technically the most recent release on this list,
17:29although it's still a 32-bit application.
17:31Or, as NVIDIA spun it back in 2013,
17:35quote,
17:36thanks to a highly efficient streaming system,
17:38Last Light's world uses less than 4 gigabytes of memory.
17:42That's one way to put it.
17:43Advanced PhysX is an on or off toggle
17:46for all PhysX effects,
17:47which NVIDIA categorized as debris and destruction,
17:50explosive enhancements, fog volumes, and cloth.
17:53We've spent many, many hours benchmarking Metro Last Light,
17:57so for old time's sake,
17:58we used our old VHH graphics preset
18:01or very high quality, AF 4X texture filtering,
18:04low motion blur, no SSAA, and high tessellation.
18:08This was one of the benchmarks that built GN in the early days.
18:11Last Light includes an unusually user-friendly standalone benchmark
18:15that lasts three minutes without loading screens,
18:17which is why we used it so frequently back then.
18:20Initially, there's not a dramatic difference with PhysX on versus off
18:24until about two minutes in.
18:25You can see that in the footage here,
18:27when explosions and gunfire start to blast
18:29chunks of concrete across the scene.
18:31Visually, the developers did a fairly good job
18:33of making up for the lack of PhysX
18:35when the option is disabled, at least in this scene.
18:38PhysX helps, but disabling it won't ruin the game.
18:41On a GTX 980, it looks great.
18:43It's also not bad on a 580.
18:45The 5080 is a different story.
18:48It's bad.
18:49In all tests with the 5080, including the 5080 plus 980 combined test,
18:53the animation is bugged out for the train
18:55at the beginning of the scene as it comes to a stop.
18:57So another of the bugs here for the 50 series.
19:00As we can see on this frame time plot,
19:01things are okay until the end.
19:03Performance ruins the experience
19:05and completely tanks on the 5080 and 9800X3D
19:08as soon as the PhysX effects kick in.
19:11This makes it unplayable.
19:12If you watch captured footage of the benchmark,
19:14you can see that the sharp spike at the end of the plot
19:16makes up a full third of the benchmark chronologically,
19:19with frame rates dropping down below 10 FPS
19:21for the last minute straight.
19:22It doesn't look that way on the chart
19:24because of how the frame time chart plots,
19:27where the X axis is frame and the Y axis is milliseconds,
19:32but this is a significant portion of the test time.
19:35It's just that the time is stretched when it goes this vertical.
19:38Note that we're logging the entire benchmark run here,
19:40so these bar chart results aren't comparable to our ancient GPU charts.
19:44Also, they're probably not comparable
19:46because we're not using like a 4790K anymore.
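To make that frame-count-versus-wall-clock point concrete, here's a minimal sketch of how averages and 1%/0.1% lows can be pulled from a frame time log. It follows the common community convention (average of the slowest 1% or 0.1% of frames, converted to FPS) and is not necessarily GN's exact pipeline; the toy data at the bottom is made up purely to show how a short, slow tail barely registers on a per-frame X axis while eating most of the elapsed time.

def summarize(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst_first = sorted(frametimes_ms, reverse=True)

    def low(fraction):
        # Average the slowest N% of frames, then convert back to FPS.
        count = max(1, int(n * fraction))
        slowest = worst_first[:count]
        return 1000.0 / (sum(slowest) / len(slowest))

    return {
        "avg_fps": round(avg_fps, 1),
        "1%_low": round(low(0.01), 1),
        "0.1%_low": round(low(0.001), 1),
    }

# Toy data: ~36 s of 4 ms frames, then ~60 s stuck at 120 ms per frame.
# The slow tail is only ~5% of the frame count but ~63% of the elapsed time,
# which is why the average can still look survivable while the lows collapse.
print(summarize([4.0] * 9000 + [120.0] * 500))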
19:49The early non-PhysX part of the benchmark
19:51helped the 5080 to score relatively high
19:53in terms of average frame rate,
19:55but the exceptionally poor 1% and 0.1% lows
19:57are due to the performance completely falling apart
20:00as soon as explosions and gunfire start up.
20:02It's totally unplayable, which is unfortunate
20:04because Nvidia said PhysX is the way it's meant to be played.
20:08Even the GTX 580, with GPU acceleration,
20:11coped with this section of the bench better,
20:13as indicated by its superior 22.3 FPS 1% and 19.2 FPS 0.1% lows.
20:20Watching the benchmark in action
20:21makes it clear that the 5080 completely failed this test.
20:24The best playable frame rate scored with PhysX
20:27was with the 5080 and the 980 working together,
20:30paying homage to AGEIA as it was meant to be.
20:33The two averaged 252 FPS with much more reasonable lows.
20:37Disabling PhysX on the 5080 vastly improves performance as usual,
20:41but that's not a reasonable option in this game.
20:43Although this one copes better than Mafia 2,
20:46it does still totally change the experience.
20:48AMD players have lived with this reality for over a decade now,
20:52but now Nvidia's own customers, for those who play older games,
20:55get to experience vendor lockout.
20:57Out of curiosity, we also ran an HWiNFO log
21:00during a pass of the Last Light benchmark
21:02with the paired 5080 and 980.
21:04For the 5080, core load and HWiNFO's D3D usage metric
21:08had a near 1-to-1 correlation,
21:10and for the 980, core load and the Compute0 usage metric
21:14had a similar correlation.
21:16Plotting those two against each other
21:18reveals the point at which the PhysX effects kick in,
21:21with the 5080's usage dropping over the course of the benchmark
21:25as the workload bottlenecks elsewhere,
21:27while the 980's usage rises at the same time.
21:29The 5080's bus load was maxed out during the whole bench run,
21:32which makes some sense given the reduction in PCIe lanes.
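If you want to reproduce that kind of plot from your own sensor logs, a bare-bones version looks like this. The CSV name and column headers are placeholders we made up for illustration; match them to whatever your logging tool actually exports.

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV export of a sensor log captured during the benchmark pass.
log = pd.read_csv("last_light_5080_plus_980.csv")

fig, ax = plt.subplots()
ax.plot(log["GPU1 D3D Usage [%]"], label="RTX 5080 (D3D engine)")       # render card
ax.plot(log["GPU2 Compute_0 Usage [%]"], label="GTX 980 (Compute_0)")   # dedicated PhysX card
ax.set_xlabel("Log sample")
ax.set_ylabel("Usage [%]")
ax.set_title("Render vs. PhysX accelerator load over the benchmark")
ax.legend()
plt.show()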
21:36Mirror's Edge is the oldest game on our list,
21:38actually one of my favorite games from that era.
21:40I played a lot of the time trials.
21:42There aren't many graphics options in this one,
21:44and PhysX is a simple on or off toggle.
21:47But the graphics load in Mirror's Edge was absolutely brutal for its time,
21:51with glass breaking in particular causing major problems
21:54for hardware that couldn't handle it.
21:56Natively, the game is capped at 62 FPS.
21:59We were forced to edit the config file to uncap the framerate for testing,
22:03which would reportedly lead to a soft lock later in the game.
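For reference, the cap they're describing is the stock Unreal Engine 3 frame smoothing. The keys below are the generic UE3 ones; the exact file (TdEngine.ini under your Documents folder for Mirror's Edge) and section layout can vary, so treat this as an assumption to verify against a resource like PCGamingWiki, and back the file up before editing.

; Generic UE3-style frame smoothing settings -- 62 is the usual default cap.
[Engine.GameEngine]
bSmoothFrameRate=TRUE
MinSmoothedFrameRate=22
MaxSmoothedFrameRate=62   ; raise this value, or set bSmoothFrameRate=FALSE, to uncap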
22:06This is another game where just turning PhysX off is a potential argument,
22:10even though we don't agree with it.
22:12We'd argue that the PhysX version of the game is superior,
22:15with lots of broken glass, fog, and fabric
22:18that's frequently completely missing with the setting off.
22:21But the PC version with added PhysX
22:23was only released a couple months after the original console launch.
22:26Here's the chart.
22:27The built-in Mirror's Edge flyby bench was inadequate,
22:29so we selected a simple bench path that includes breaking glass,
22:33which seems to be the major trigger for poor performance in this title.
22:36The 580 outperformed the 5080 when comparing GPU-accelerated PhysX
22:39on the 580 to CPU PhysX on the 5080.
22:42Although the average framerate was roughly tied,
22:44the lows on the 5080 FE indicate huge problems
22:47with frametime pacing that ruin the experience.
22:50Just like Mafia 2, the run-to-run variance was extremely high
22:53with CPU PhysX on both the 5080 and 580.
22:56The solo GTX 980 beat the 5080 by 36%,
22:59which isn't the way those numbers are supposed to go.
23:02The 5080 and 980 pairing beat the solo 5080 by 276%,
23:07with significantly improved lows, and the experience is a lot better.
23:11The retired press from the PhysX era can all finally say,
23:15I told you so.
23:16Of the charted results, the paired cards scored the best.
23:20The 5080 with PhysX disabled showed 475% uplift,
23:24but that's at the cost of removing PhysX items from the scenery.
23:27And even with PhysX off, the 5080's lows were weak.
23:30As with Mafia 2, GPU-accelerated PhysX performed significantly better
23:34on the 580 than CPU PhysX.
23:36That's another possible indication of kneecapped CPU performance
23:39in the older PhysX versions.
23:42Borderlands 2 is up next.
23:43We've confirmed that the low setting for PhysX in Borderlands 2
23:46turns off all visible effects,
23:47so we're labeling that setting as PhysX off on the chart.
23:50PhysX effects include debris, cloth, particles,
23:53and smoothed particle hydrodynamics, or SPH fluid.
23:57We maxed out the settings and set PhysX to high when enabled.
24:00Borderlands 2 appears to gray out the PhysX option
24:03when no compatible card is detected,
24:05but this doesn't actually affect the setting.
24:07It just stops you from adjusting it without manually editing the config file,
24:11so great job, developers.
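If you do need to touch it, the setting people usually point to is a single line in WillowEngine.ini (under Documents\My Games\Borderlands 2\WillowGame\Config). We're citing the commonly documented key and value mapping here rather than anything confirmed in the video, so double-check against your own file and keep a backup.

; Commonly documented Borderlands 2 tweak -- find the existing line and edit it in place.
; Reported mapping: 0 = low (effectively off), 1 = medium, 2 = high.
PhysXLevel=2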
24:13Borderlands 2 contains a built-in PhysX test scene
24:15that's accessible with some light modding.
24:17This offers a good overview of how the game copes with the lack of PhysX.
24:21Some liquid and cloth is replaced with flat textures,
24:24some completely disappears,
24:25and sparks and rubble are completely eliminated.
24:28As with the other titles we've tested,
24:30"just turn PhysX off" isn't acceptable advice.
24:33It's a huge part of the game's visuals,
24:35and it's what NVIDIA and this game used in their marketing,
24:38and the users of the era once again can finally feel heard and vindicated.
24:43Here's the chart.
24:44The 5080 avoided landing at the absolute bottom of the chart here,
24:47but this is a more CPU-friendly PhysX 2.8.4 title,
24:50as we can see from the fact that CPU PhysX on the 580
24:53scores higher than GPU Acceleration on the same card.
24:55Still, the 980 with GPU Acceleration outperformed the 5080 by 7%,
24:59and we even used the 5080 with all of its ROPs there.
25:02The 5080 plus 980 combination failed to launch at all in this title,
25:06and isn't included on this chart.
25:08Turning PhysX off gained 447% performance for the 5080 with an average of 518 FPS,
25:13but, again, that's not an option anyone should recommend.
25:16You should be able to play a game from 13 years ago
25:19on a 5080 without disabling core features.
25:22Batman is up next.
25:23Other Arkham games would have worked,
25:25but we picked Arkham City since it's a little newer than Asylum.
25:28In Arkham City, PhysX handles newspapers, flags, ropes, fog,
25:31sparks, glass, and general destruction.
25:33We maxed out the settings with PhysX set to high when enabled.
25:36When we tried this with our 5080,
25:38the game noted that our performance would be reduced.
25:40It suggested installing a nice GTX 570 instead,
25:44with a GTX 460 as a dedicated PhysX card,
25:47which it accurately says would be an upgrade from our 5080.
25:50And you could just do that.
25:52You could go buy a used, older card
25:54if you're an enthusiast of these games but want 50 series hardware.
25:58The game has a built-in benchmark that features heavy use of PhysX in several scenes,
26:02and we logged performance for the entire duration across all of them.
26:05Unfortunately, the scenes are relatively short and have black loading screens in between,
26:08which normally would artificially inflate the frame rates.
26:11Examining our frametime plots, though, which we won't bother showing here,
26:14showed us that the black screens caused frametime spikes rather than dips,
26:17but they were relatively short.
26:19For a serious review, not that a benchmark using a 980 and a 5080
26:23in the same system to test a 14-year-old game isn't serious,
26:26we'd never log across loading screens like this,
26:29but for this one, it just didn't really matter.
26:31The benchmark makes it very clear that PhysX is necessary
26:34to see the game as it was intended to be seen,
26:37especially the segment where display cases break and shower the floor with glass
26:41that's completely absent without PhysX.
26:43There was serious stuttering on all GPUs other than the GTX 580.
26:47This could either be because the frame rates are far above what the developers planned for,
26:51or because the 580 used a different driver than the rest.
26:54The 5080 is off to a promising start here with a 152 FPS average,
26:58giving it a 73% advantage over the GTX 980's GPU-accelerated result.
27:02However, we can also see that going from PhysX on to off with the 5080
27:07resulted in a 248% performance boost to 529 FPS,
27:11while the same change on the 980 only increased performance by 59%,
27:16up to 140 FPS.
27:18That's because the 980 is processing PhysX onboard,
27:21while the 5080 necessitates running it on our 9800X3D instead,
27:25which is a huge bottleneck.
27:26The 580 results are also interesting,
27:28with the PhysX CPU result actually 12% ahead of the PhysX GPU result.
27:32As we mentioned, this is a PhysX 2.8.4 title
27:35that may make use of the more CPU-friendly SSE2 instruction set,
27:39so it makes some sense that the CPU performance isn't completely abysmal.
27:42The Arkham City graphics menu recommends running a secondary card
27:45as a dedicated PhysX processor,
27:47which is exactly what we did with our 5080 and 980 tests.
27:50Ignoring the stuttering issues, this was the best result on the chart,
27:53allowing us to enable PhysX effects while averaging 215 FPS.
27:58It shouldn't be a surprise that, of all the cards dating from 2025 to 2010,
28:02the 2025 card is the best.
28:04But it is, and only when paired with the 2014 card.
28:07That's a 41% uplift gained by adding in an 11-year-old graphics card.
28:12In terms of frame time consistency, the winner is the 580.
28:14So now, a bit of a reality check.
28:16It's not the biggest deal in the world that games from 12 to 15 years ago
28:21don't run on these modern GPUs.
28:23That's not really the core complaint here.
28:26It's just something we wanted to explore.
28:27Certainly a lot of people care about that sort of thing and should know about it,
28:31but that's not the big problem.
28:34I mean, there are a lot worse controversies right now for NVIDIA's 50 series
28:37than that list of games.
28:39The real problem comes from everything that's sort of attached to it,
28:43and the concern we have is the broader issue,
28:46which is that NVIDIA has a habit of trying to come up with
28:49some kind of exclusive graphics tech for games,
28:52and it's the exclusive part that's key there.
28:54For example, RTX-specific ray tracing features and other stuff,
28:57like DLSS, MFG, 3D Vision, Ansel, TXAA, Waveworks, MFAA, DLAA,
29:01HBAO+, Reflex, and Hairworks.
29:03Actually, you know what?
29:06That one's okay.
29:08We're okay with that one.
29:10But you get the point.
29:11This drags the industry along in NVIDIA's wake,
29:13and then when they get bored of the technology and abandon it,
29:16the customer's left holding the bag.
29:18That's NVIDIA's prerogative as the eternal market leader,
29:21but it doesn't feel any better for those of us suckers
29:24with 3D Vision monitors and glasses in the attic.
29:27And I actually did like 3D Vision.
29:29But it's interesting, because NVIDIA's making decisions on graphics.
29:32That's not all that controversial.
29:34They're a graphics company.
29:35There's things that they know how to do better than game developers,
29:37so some of this makes sense.
29:39Like embedding engineers with game developers
29:41to teach them how to optimize things or bake in settings.
29:44So that's not necessarily the problem.
29:46The problem is, once again, the implementation
29:48and kind of what this PhysX 32-bit situation gives us
29:52as a warning against all this other special technology
29:56going forward.
29:58And it is not necessarily an antitrust issue
30:02to decide to use power and resources and know-how
30:06to then try and force improvements in graphics
30:09or make game development easier
30:11in a way that benefits the technology provider
30:14that's in that position.
30:15However, it can become one,
30:17depending on how NVIDIA wants to execute on that vision.
30:20And that needs to be carefully monitored
30:22as NVIDIA maintains that 90-10 in the market.
30:25The bigger problem today is just that
30:27this move came as an unexpected side effect
30:30of the removal of 32-bit CUDA support in the new architecture,
30:33which appears to have blindsided everyone
30:35and could have other downstream effects beyond PhysX.
30:38For example, Passmark recently noted
30:40that 5090 and 5080 compute performance
30:42was unexpectedly low in benchmarks
30:44due to OpenCL 32-bit support breaking without warning.
30:48This is also an issue related to games preservation,
30:50kind of like you see with Stop Killing Games,
30:53except it's for a different reason than that.
30:55Same kind of idea, though.
30:57It's very likely that some of these games
30:58could be tweaked to run better with CPU PhysX
31:01by swapping DLLs or otherwise modding them.
31:03But we can't take it for granted
31:05that some genius on a forum
31:06will always be there to clean things up,
31:08and that's not the point anyway.
31:10You could spend thousands of dollars
31:11on a brand-new NVIDIA GPU,
31:13and in the best-case scenario,
31:14still have to do extra work
31:16to get those games to play properly
31:19with NVIDIA's own abandoned feature.
31:21We don't have a decisive call to action here.
31:23NVIDIA could probably fix PhysX, we think.
31:26There's probably a lot of reasons
31:27that would be difficult,
31:28a waste of time to them, or undesirable,
31:31but we're willing to bet
31:33a $3 trillion market cap
31:35that NVIDIA has an engineer
31:36who can figure out how to get Borderlands 2
31:38to run at a stable frame rate on a 50-series GPU.
31:41In the long term,
31:42this feels like a grim omen
31:44for any NVIDIA special feature
31:46that's integrated into a game,
31:48especially if it's core to how that game feels,
31:51like what we were talking about in these games.
31:52They become different games.
31:54They're visually and experientially different
31:56when you play them without PhysX.
31:58It's a big feature.
31:59And even with newer PhysX games,
32:02we seriously doubt that Fallout 4
32:05is the only half-assed implementation
32:07of 64-bit PhysX.
32:09So, games that are 12 or 15 years old,
32:11maybe not the biggest deal
32:12in sort of the microscopic situation
32:15that they're in,
32:16except for the people who really care
32:17about those games.
32:18There's ways to work around it.
32:19You can do accelerator cards.
32:20Kind of sucks, though,
32:21to go buy, like someone on Reddit did,
32:23a 5090 and then buy an RTX 3050
32:27so you can play old games.
32:29But that's what was happening.
32:31And this is one of those where
32:3410, 15 years in the future,
32:36we don't know if features like Reflex,
32:38MFG, again, even ray tracing
32:40will work the way they do today
32:41on the hardware that will be current
32:43in the future versus the stuff
32:45from the era they were developed.
32:46That's just sort of the nature of hardware sometimes.
32:48Compatibility modes
32:49have been a thing for ages.
32:50And so we're trying to keep perspective here
32:52on just how big of an issue is this.
32:54But it's really, it does come back
32:56to NVIDIA and how much
32:57they push these specific technologies.
32:59And ray tracing in particular
33:01is an interesting one
33:02because it seems like it'll probably
33:03go through a paradigm shift
33:04that's major at some point
33:05just because it is still relatively new.
33:06But who knows?
33:07Anyway, that's it.
33:08That's the 32-bit PhysX feature.
33:10Hopefully, you had fun
33:12getting some of the history on it.
33:13And if nothing else,
33:15it is kind of fun to see the GTX 580
33:17and 980 come out of retirement.
33:19That's it for this one.
33:20Thanks for watching.
33:21Subscribe for more.
33:22Go to store.gamersnexus.net
33:23to support us directly
33:24or go to patreon.com
33:25slash gamersnexus
33:26to throw a few bucks our way each month
33:27as we continue to research
33:29and test all these different topics
33:30that are emerging right now.
33:32And we'll see you all next time.