Adrian Sanchez 🎮

3.3K posts

@SunsetLearn

Senior Technical Engineer @ Squanch Games ⚡️https://t.co/ejNReXFIh2 - https://t.co/nWHW3GpMjq ⚡️Using electricity to produce light @SunsetStudioCo

Currently building
Joined November 2023
166 Following · 459 Followers
Pinned Tweet
Adrian Sanchez 🎮@SunsetLearn·
Current state of the Sundown renderer: (cheapish) DDGI + Stochastic SSR + GT-VBAO + SVSMs + EEVEE-style Bloom + Deferred Lighting + Other Stuff. Running in a pseudo-bindless pipeline, as much as JS + WebGPU allows. My laptop was overheating from all the dev and open apps at this point, so it started power throttling and cycling between the laptop 3070 and the iGPU. Still a few artifacts and quality touchups to get to, but currently trying to push performance as much as possible to fit within a decent 30fps budget for mobile and iGPUs.
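For context on the bloom stage mentioned above: EEVEE-style bloom typically starts with a soft-knee bright pass before the downsample/upsample chain. A minimal JS sketch, assuming a plain RGB-array color and illustrative `threshold`/`knee` parameters (these names are not Sundown's actual API):

```javascript
// Soft-knee bright pass of the kind EEVEE/Unity-style bloom prefilters use.
// Pixels below `threshold` fade out smoothly over a quadratic knee instead of
// hard-clipping, which avoids flicker on borderline-bright pixels.
function brightPass(color, threshold, knee) {
  // Max component keeps saturated colors intact better than luminance here.
  const brightness = Math.max(color[0], color[1], color[2]);
  // Quadratic curve in the [threshold - knee, threshold + knee] band.
  const soft = Math.min(Math.max(brightness - threshold + knee, 0), 2 * knee);
  const softContrib = (soft * soft) / (4 * knee + 1e-5);
  // Above the knee band, this reduces to the usual (brightness - threshold).
  const contribution =
    Math.max(softContrib, brightness - threshold) / Math.max(brightness, 1e-5);
  return color.map((c) => c * Math.max(contribution, 0));
}
```

In a real pipeline this runs in a shader on the half-resolution first bloom mip; the CPU version is just to show the curve.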
Adrian Sanchez 🎮@SunsetLearn·
@andkalysh LPV mostly just gives you coarse diffuse though, right? Never tried it, but I'm curious to use it if it's performant on mobile
AK@andkalysh·
I have been working on a BIG update for my #UE5 cheap LPV plugin and believe me, it's tremendous. Separate LPV volumes you can place directly in your levels. Static or dynamic, your choice.
Planned:
🔜 Multiple volumes per level
🔜 Forward rendering support
🔜 DX11 & DX12 support
As far as my research goes, light bounces are not possible on low-end hardware without RSM. #UE5 #UnrealEngine #gamedev #indiedev
Adrian Sanchez 🎮@SunsetLearn·
@alightinastorm Like trying to fill in the really small gaps, crevices and creases with higher-quality occlusion. Maybe a combination of GTAO + contact shadows + really good AO maps.
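One conservative way to combine the occlusion sources suggested above (GTAO + contact shadows + baked AO maps) is to take the minimum, so whichever term found the most occlusion wins and baked micro-detail survives creases that screen-space AO is too coarse to catch. A sketch with illustrative names, using the usual convention of 1 = unoccluded, 0 = fully occluded:

```javascript
// Conservative combine of several occlusion estimates in [0, 1].
// min() keeps the darkest (most occluded) estimate per pixel; multiplying the
// terms instead would over-darken wherever the sources overlap, which is why
// min is the common conservative choice.
function combineOcclusion(gtao, contactShadow, bakedAo) {
  return Math.min(gtao, contactShadow, bakedAo);
}
```

In practice you would evaluate this per pixel in the lighting shader, with the baked AO sampled from the material's AO map.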
robot@alightinastorm·
@SunsetLearn tell me more adrian, what is micro shading and how do we slap AO on this
robot@alightinastorm·
Building an Open Source Metahuman with AI: Pores and Oil
All textures created with gpt image 2.
The hairs are just placeholders and won't be useful anyway, since I will be porting the shading to threejs TSL soon and I'll have to figure out my own (and cheaper) hair system at some point.
Still a long way to reach Metahuman realism, but at least there's progress (almost) daily.
And they say AI bros have no skill? I say we do, friends
robot@alightinastorm

testing skin details with gpt image 2 + upscaling (4k). It is really good, but even the tiniest displacement between textures creates very visible artifacts in 3D, as you can see on his ear.
Textures generated with GPT:
- Color
- AO (discarded, useless)
- SS (discarded)
- Normal base layer
- Normal detail layer (mixed both together)
- Roughness
Overall very interesting, and if the normal layer displacements weren't bleeding into the surface so hard, it'd be a great result and I could upscale it to 8k. I think skinning with AI is becoming really good now

Adrian Sanchez 🎮@SunsetLearn·
If current trends continue, software engineering as a whole will morph into software design, which might be bittersweet but necessary.
Adrian Sanchez 🎮@SunsetLearn·
Curious why you're building an HZB for CSMs. I assume you want some sort of shadow-view occlusion culling, but the moment you rasterize depth for a light-view cascade you already pay for a full depth raster pass. So why spend a full depth raster on the HZB when you're going to raster depth again (even if not for all light-visible geo) once you use the HZB result to do the actual depth raster?
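For reference, the HZB in question is just a mip chain over rasterized depth where each texel keeps a conservative bound of its children. A CPU sketch, assuming standard-Z convention (larger = farther), so each mip texel stores the MAX depth of its 2x2 footprint and an occlusion test against any mip is conservative:

```javascript
// Build a hierarchical-Z (HZB) mip chain from a depth buffer on the CPU.
// Standard Z: each mip texel = max depth of its 2x2 children, i.e. the
// farthest surface in that region. An object whose nearest depth is greater
// than this value is provably occluded.
function buildHzb(depth, width, height) {
  const mips = [{ data: depth, w: width, h: height }];
  let { data, w, h } = mips[0];
  while (w > 1 || h > 1) {
    const nw = Math.max(1, w >> 1);
    const nh = Math.max(1, h >> 1);
    const next = new Float32Array(nw * nh);
    for (let y = 0; y < nh; y++) {
      for (let x = 0; x < nw; x++) {
        // Clamp so odd dimensions still read valid texels.
        const x0 = Math.min(2 * x, w - 1), x1 = Math.min(2 * x + 1, w - 1);
        const y0 = Math.min(2 * y, h - 1), y1 = Math.min(2 * y + 1, h - 1);
        next[y * nw + x] = Math.max(
          data[y0 * w + x0], data[y0 * w + x1],
          data[y1 * w + x0], data[y1 * w + x1]
        );
      }
    }
    mips.push({ data: next, w: nw, h: nh });
    data = next; w = nw; h = nh;
  }
  return mips;
}
```

On the GPU this is a chain of compute or fullscreen reduction passes, one per mip, which is exactly the VRAM-bandwidth cost being discussed in the thread.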
Alex Goldring@SoftEngineer·
Reworking the rasterizer for Shade. After some testing, I noticed that HZB rebuilds take up ~20% of the frame time on integrated GPUs and mobiles. The reason is the VRAM speed. I decided to rework the occlusion culling, going from 20 rebuilds per frame down to 12, which should bring frame times down by ~8%.
More powerful platforms don't care; on my RTX 4090 the HZB barely registers in the profiler, so it's something that has an outsized benefit for less powerful GPUs.
Doing this kind of work, you disable everything unrelated to rasterization. Here is a frame that I see during debug, versus what the engine does when you re-enable all of the other features: shadows + AO + exposure + bloom + TAA.
For those interested, the rework only does 2-pass occlusion culling on opaque geometry now; masked alpha and transparencies go after the full-frame HZB for opaque is already built. Another thing: the HZB rebuild no longer happens per rasterizer state. We build a partial HZB with all opaque rasterizer states, then re-filter and finish the frame. A bit of extra efficiency at the cost of some added engineering complexity.
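The two-pass occlusion culling mentioned above usually works like this: draw everything visible last frame, build the HZB from that depth, then re-test the remaining objects so newly disoccluded geometry is drawn late instead of popping a frame later. A simplified CPU simulation, where the `occludedByHzb` flag stands in for a real test against the freshly built HZB:

```javascript
// Two-pass occlusion culling flow, simulated with plain objects.
// prevFrameVisible: Set of object ids drawn last frame.
function twoPassCull(objects, prevFrameVisible) {
  // Pass 1: draw last frame's visible set without any occlusion test;
  // these are very likely still visible and give us depth to build the HZB.
  const pass1 = objects.filter((o) => prevFrameVisible.has(o.id));
  // ... build the HZB from pass-1 depth here ...
  // Pass 2: re-test everything else against the fresh HZB and draw survivors.
  const pass2 = objects.filter(
    (o) => !prevFrameVisible.has(o.id) && !o.occludedByHzb
  );
  return { pass1, pass2 };
}
```

The win is that only one HZB build sits between the two draw passes, instead of rebuilding after every rasterizer state change.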
Adrian Sanchez 🎮@SunsetLearn·
I think this would work even better in a virtual shadow maps implementation. You can coarsely cull tiles using a similar method and only use the resulting list for processing, so you end up paying raster cost only for objects within dirty tiles of those frustum planes. Even better if you can then cull the resulting meshes by cluster against the tile bounding planes.
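The dirty-tile idea above might look something like this on the CPU side: project each caster's bounds into shadow-map space and keep only the casters that touch a dirty page tile. Names, the string tile keys, and the AABB representation are all illustrative:

```javascript
// Gather shadow casters overlapping dirty virtual-shadow-map tiles.
// obj.bounds = { minX, minY, maxX, maxY } in shadow-map texel space;
// dirtyTiles = Set of "tx,ty" keys for pages invalidated this frame.
function gatherDirtyTileWork(objects, dirtyTiles, tileSize) {
  const work = [];
  for (const obj of objects) {
    const tx0 = Math.floor(obj.bounds.minX / tileSize);
    const ty0 = Math.floor(obj.bounds.minY / tileSize);
    const tx1 = Math.floor(obj.bounds.maxX / tileSize);
    const ty1 = Math.floor(obj.bounds.maxY / tileSize);
    // An object is rasterized if any tile it overlaps is dirty.
    outer: for (let ty = ty0; ty <= ty1; ty++) {
      for (let tx = tx0; tx <= tx1; tx++) {
        if (dirtyTiles.has(`${tx},${ty}`)) {
          work.push(obj);
          break outer;
        }
      }
    }
  }
  return work;
}
```

In a real VSM this test runs on the GPU per instance (and then per cluster), but the overlap logic is the same.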
Sebastian Aaltonen@SebAaltonen·
New cascaded shadows GPU time = 68%. More precise culling helps a lot. It's still a single render pass. 2048x2048 texture atlas, with three 1024x1024 cascades in it (the last 1024x1024 region is saved for local lights). Let's talk about the correct way to cull shadows...
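The atlas layout described (a 2048x2048 texture split into four 1024x1024 quadrants, three for cascades plus one reserved for local lights) maps to a trivial viewport calculation. A sketch with an assumed row-major quadrant order:

```javascript
// Viewport for cascade `index` inside a square shadow atlas split into
// 2x2 quadrants. Indices 0-2 are cascades; index 3 is the local-lights region.
function cascadeAtlasViewport(index, atlasSize = 2048) {
  const half = atlasSize / 2; // 1024 for a 2048 atlas
  const x = (index % 2) * half;
  const y = Math.floor(index / 2) * half;
  return { x, y, width: half, height: half };
}
```

Keeping all cascades in one atlas is what allows the single render pass: each cascade just gets its own viewport and scissor into the shared target.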
Adrian Sanchez 🎮@SunsetLearn·
Had meshlets already set up via an offline cooking step with meshoptimizer for a while, but then procrastinated on that to work on some other optimizations, a CVAR-like config system and some other QoL stuff 😅 Finally took some time to start hooking the imported meshlets into a new viz. buffer pipeline. Still gotta work on frustum/occlusion culling for meshlets, setting up a CLAS for ray traversal and leveraging meshlets during the SVSM stages. But this effectively collapses a ton of draw calls (per material + mesh) into a single viz. buffer raster pass and a single GBuffer resolve (from the viz. buffer), since Sundown already has a pseudo-bindless data setup for materials and textures. It's not quite there, but it's coming along 🤓
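A visibility buffer like the one described typically stores one packed meshlet + triangle ID per pixel, which the GBuffer resolve pass unpacks to fetch the actual geometry and material data. A sketch with an assumed 25/7 bit split (meshoptimizer meshlets commonly cap at 124 or fewer triangles, which fits in 7 bits); this is not Sundown's actual layout:

```javascript
// Pack/unpack a 32-bit visibility-buffer ID: high bits = meshlet, low bits =
// triangle within the meshlet. >>> 0 forces an unsigned 32-bit result.
const TRI_BITS = 7; // enough for meshoptimizer's typical <=124-triangle meshlets

function packVisId(meshletId, triangleId) {
  return ((meshletId << TRI_BITS) | triangleId) >>> 0;
}

function unpackVisId(packed) {
  return {
    meshletId: packed >>> TRI_BITS,
    triangleId: packed & ((1 << TRI_BITS) - 1),
  };
}
```

The resolve pass then uses `meshletId` to index the bindless meshlet/material tables and `triangleId` to re-fetch the three vertices and recompute barycentrics for attribute interpolation.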
Adrian Sanchez 🎮@SunsetLearn·
This is one of the best rising accounts on here. We need more completely delusional positivity in a world that is shattering from negativity on all sides. Being a realist doesn't mean you need to be pessimistic about everything. You can still be a realist and a complete optimist at the same time.
Digi (Delusional)@digiii

x.com/i/article/2028…

Adrian Sanchez 🎮@SunsetLearn·
Not necessarily a cut at this, but it's weird how there's always this discourse around technology about what makes you "more human", like it's a fucken contest and if you somehow lose out at the bottom you're subhuman and deserve to rot. No wonder religion exists. We're all human. It's not a "more" or "less" thing. It doesn't matter if you're developing technology or creating art or running a political campaign. You're as human as anyone else, regardless.
Kekius Maximus@Kekius_Sage

🚨 Anthropic CEO says STEM is losing its edge, and what makes you “more human” will decide your future

Adrian Sanchez 🎮@SunsetLearn·
@martinmbauer It's too bad it requires billions of dollars to run these experiments. Early physics and mathematics discoveries were largely made by small groups and lone individuals using pretty inexpensive methods.
Adrian Sanchez 🎮@SunsetLearn·
@SebAaltonen At this point we would've pretty much solved real-time computer graphics. Then we'd only have animations, physics and general gameplay to try and keep improving.
Adrian Sanchez 🎮@SunsetLearn·
@pmarca There's a difference between introspection and just being humble and not dwelling on success for too long or getting too prideful. I think Steve was talking about the latter. But they're both important for different reasons.
Adrian Sanchez 🎮@SunsetLearn·
The amount of passion and work engineers put into developing these technologies just to get vilified in any public square they walk into. These tools are completely optional. You don't have to implement them in your game. It's just engineers and scientists trying to give you tools, that's all. They're not trying to pillage your hometown and raid your wares.
Adrian Sanchez 🎮@SunsetLearn·
This is now obvious in hindsight, because history shows that any new mass ideology or way of thinking begins to create massive divides in human populations over time, but we have reached a point where there are massive extremes of anti-technology and pro-technology people.
Technology used to be about the promise of new utility and an optimistic future for humankind. It worked exclusively under those constraints. Unfortunately, widespread adoption, nihilism and extreme pessimism have turned technology into a political battlefield, and we are in danger of becoming a stagnant civilization that no longer seeks to improve its outcomes. For every optimist developing a promising technology, there are dozens of pessimists ready to tear it down.
In the context of modern times, the real salient point is this: the arts are important for human expression, communication and general well-being, but technology is the key driver behind civilizational progress. When you support a massive campaign to tear down either of these things, you are indirectly putting the human race at risk. If either of them (arts or technology) goes bust indefinitely, we are all royally fucked.
Adrian Sanchez 🎮@SunsetLearn·
@GamingSinceNES If you're in the comments thinking lighting can't do this, you should watch this. The two side-by-sides are also taken at slightly different times in the game, so you can't rule out that timestamp difference as the source of any inconsistency. youtube.com/watch?v=6sgpeo…
Compusemble@compusemble·
The DLSS 5 on/off comparisons below are a good demonstration of how the assets/geometry do not change. It's the improved lighting that makes such a massive difference. Source: forum.beyond3d.com/threads/nvidia…
Adrian Sanchez 🎮@SunsetLearn·
From a rendering tech POV, this is absolutely insane. I wish it wasn't a black box but it does kind of push the uncanny valley a bit further. I just hope there's some kind of steerability tools that come with this that help art direction not look like some generic AI generated video.
NVIDIA GeForce@NVIDIAGeForce

Announcing NVIDIA DLSS 5, an AI-powered breakthrough in visual fidelity for games, coming this fall. DLSS 5 infuses pixels with photorealistic lighting and materials, bridging the gap between rendering and reality. Learn More → nvidia.com/en-us/geforce/…

Adrian Sanchez 🎮@SunsetLearn·
I do have to admit, all the controversy and generative-art push-back aside, LLMs might actually turn out to be one of the most important innovations of the century. I'm at the point where pretty much everyone I encounter in daily life (family, friends, even people well outside the tech sector) is constantly delegating questions and queries to ChatGPT/Gemini/Claude/etc. And I don't even have to mention how useful they've become for cooperative software engineering. We thought it was a fad, but it turns out it was bigger than that. And if you love engineering software and creating, then you know the tide is turning, but not at all in a bad way.
levelsio@levelsio

I think this collective feeling of "I don't enjoy coding anymore because it's so easy with AI" is good to talk about and realize, and I have it too.
I miss going to bed with a coding challenge I have to get through, and then waking up, getting the answer in the shower and screaming EUREKA!!!!!
But then you quickly just have to accept that the world has permanently changed now and it's just not going back, because letting AI code for you is simply so much faster and more effective, and will only get better with every passing year.
So the better mental approach for me is to just aggressively embrace it and change myself instead. If the fun in solving the challenges is gone, where else can I find the fun?
I'm a bit lucky, because for me the fun has always been building new things in general, not so much the coding part, although the coding challenges were fun for me too. But having ideas and just building new things was always the most fun. So I have to double down on that now: making more things, making better things, and making them much faster than before.
Especially now that literally everyone in the world has access to the same coding skill as everyone else (which is AI), the focus will have to aggressively be on what remains as a differentiator for me as a creator, which is my ideas and the way I execute them, not coding them. So that's what I will try to focus on from now on, I think.
