


Dr. Ian Cutress

@IanCutress
Consultant, Chief Analyst, Influencer. Substack: https://t.co/yEtnDropHp (@MoreThanMoore2x) Youtube: https://t.co/1t9pRrV860 (@TechTechPotato)










@TheEggman64
> rather than more powerful hardware
> look inside
> 2x 5090s to run











@IanCutress And yet NVIDIA themselves showed it required two 5090s and could only generate slop AI filter garbage.






If you ever get a chance to talk to researchers at major labs about GPU failure rates, you should! They are extremely low, but even a 0.5% failure rate over hundreds of thousands of GPUs ends up being a lot of failed GPUs. Heard some interesting things about how it's managed.




"Hey, here's a demo that's still early and unoptimized, coming out end of year hopefully. We're still getting it to work and this is an early alpha." "We shoved it onto two separate GPUs to ensure the base game runs fine and the DLSS works well enough for the demo." "We have 5090s, so why not run it on those. We could have used two GT730s for the lolz, and you would have asked why, but we had 5090s on the desk." Everyone starts screaming ๐ถOMG ๐ถ ๐ถ IT REQUIRES ๐ถ ๐ถ TWO 5090s!!1!๐ถ ๐ถ WONT SOMEONE PLEASE THINK OF THE CHILDREN ๐ถ I really can't fathom the dumb. You know console games are made on dev kits that are vastly faster than the consoles they run on. It's well documented, especially in early cycle design. To anyone that claims I'm pro NVIDIA or I'm not taking people's complaints seriously, go look at my previous NVIDIA tweets. I don't get passes for events. I find issues with their messaging all the time. This is just one where seemingly most of the TechTube space couldn't be bothered to apply the most basic critical thinking, or didn't ask a basic question or two. Instead, let's jump straight to drama and have it as the cornerstone of the outrage and content. Because clicks, I guess. There are a thousand other things to mention about DLSS5. I gave them a ton of feedback when I spoke to the engineers at the demo in person (most of the time it looks like a repositioning of the light source more than anything else, it was shown on games that aren't that high in graphics fidelity (I'm actually disappointed in modern AAA if this is what we have), half the time it looks simply like the additional tesselation we saw in DX11, and it can ignore a number of doubly-relected shadows). Here's the kicker though. NVIDIA has thousands of employees working on its gaming portfolio. I've seen so many people complain 'why did they work on this when they could have worked on other stuff'. They ARE working on other stuff. You might be able to only think about one feature at a time (make sure you remember to breathe), but these things are all developed in parallel with other features. If anyone has enough money to have teams invested in researching and developing a bucket load of features, it's NVIDIA. Also, you can turn the feature off. Imagine complaining about a TV channel you don't watch, just because it's on the guide and the cable service promoted it and needed two 8K TVs to do so because it was early 3D. I'll say it again. I really can't fathom the dumb. Or rather I can, and I'm just amazingly disappointed on this hardware critique. I'll get backlash from other media just for this post, sure, that much is certain (and thanks for obliging, much appreciated). But perhaps I should just bring @ctnzr on the show for an interview later in the year. He's a really cool dude anyway.


If GTA6 launches as a DLSS5 native title...
it'll still sell 50 million copies
half the people will have DLSS5 running
most of them won't realise it
here's betting most of them enjoy the game and marvel at the next-gen graphics




"Hey, here's a demo that's still early and unoptimized, coming out end of year hopefully. We're still getting it to work and this is an early alpha." "We shoved it onto two separate GPUs to ensure the base game runs fine and the DLSS works well enough for the demo." "We have 5090s, so why not run it on those. We could have used two GT730s for the lolz, and you would have asked why, but we had 5090s on the desk." Everyone starts screaming ๐ถOMG ๐ถ ๐ถ IT REQUIRES ๐ถ ๐ถ TWO 5090s!!1!๐ถ ๐ถ WONT SOMEONE PLEASE THINK OF THE CHILDREN ๐ถ I really can't fathom the dumb. You know console games are made on dev kits that are vastly faster than the consoles they run on. It's well documented, especially in early cycle design. To anyone that claims I'm pro NVIDIA or I'm not taking people's complaints seriously, go look at my previous NVIDIA tweets. I don't get passes for events. I find issues with their messaging all the time. This is just one where seemingly most of the TechTube space couldn't be bothered to apply the most basic critical thinking, or didn't ask a basic question or two. Instead, let's jump straight to drama and have it as the cornerstone of the outrage and content. Because clicks, I guess. There are a thousand other things to mention about DLSS5. I gave them a ton of feedback when I spoke to the engineers at the demo in person (most of the time it looks like a repositioning of the light source more than anything else, it was shown on games that aren't that high in graphics fidelity (I'm actually disappointed in modern AAA if this is what we have), half the time it looks simply like the additional tesselation we saw in DX11, and it can ignore a number of doubly-relected shadows). Here's the kicker though. NVIDIA has thousands of employees working on its gaming portfolio. I've seen so many people complain 'why did they work on this when they could have worked on other stuff'. They ARE working on other stuff. You might be able to only think about one feature at a time (make sure you remember to breathe), but these things are all developed in parallel with other features. If anyone has enough money to have teams invested in researching and developing a bucket load of features, it's NVIDIA. Also, you can turn the feature off. Imagine complaining about a TV channel you don't watch, just because it's on the guide and the cable service promoted it and needed two 8K TVs to do so because it was early 3D. I'll say it again. I really can't fathom the dumb. Or rather I can, and I'm just amazingly disappointed on this hardware critique. I'll get backlash from other media just for this post, sure, that much is certain (and thanks for obliging, much appreciated). But perhaps I should just bring @ctnzr on the show for an interview later in the year. He's a really cool dude anyway.





