Alex Vlachos

185 posts

@AlexVlachos

Real-Time Rendering / Games / Mixed Reality (Retired). Formerly at Valve, Meta, Microsoft, Naughty Dog, ATI, Spacetec.

Seattle, WA, USA · Joined March 2009
270 Following · 3.3K Followers
Alex Vlachos@AlexVlachos·
I joined Meta's AR/VR team yesterday. Can’t wait to get my hands on some new prototypes! Super excited about our product roadmap!

Alex Vlachos@AlexVlachos·
Valve artist @rich_lord gave a great talk at SIGGRAPH HIVE 2023 that every rendering dev should watch. Tons of inspirational moments for shader development. The fractal sequence starting 29 minutes in is impressive. youtu.be/R-S9d-qFRHM?si…

Alex Vlachos@AlexVlachos·
@swainrob No license. Zero patents. Originally written by Iestyn Bleasdale-Shepherd for Portal 2 with some minor tweaks for VR (original code was aimed at dithering Xbox 360). We shared it with the industry as part of our VR brain dump in 2015 at GDC. Free to use any way you want.

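The dither being discussed here is the screen-space noise Valve published in its GDC 2015 VR talk slides. A minimal Python sketch of that formula follows — the constants are taken from the public slides as best remembered, so verify them against the talk before relying on this:

```python
# Sketch of the per-pixel RGB dither Valve shared at GDC 2015,
# originally Iestyn Bleasdale-Shepherd's Portal 2 / Xbox 360 dither.
# Constants are assumed from the public slides; verify before use.

def screen_space_dither(x, y):
    """Return an RGB offset of roughly +/- half an 8-bit step for pixel (x, y)."""
    d = 171.0 * x + 231.0 * y              # dot(float2(171, 231), screen_pos)
    rgb = []
    for divisor in (103.0, 71.0, 97.0):    # decorrelates the three channels
        v = d / divisor
        rgb.append((v - int(v)) - 0.5)     # frac(v) - 0.5, zero-centered noise
    return [c / 255.0 for c in rgb]        # scale to one 8-bit quantization step

def quantize_with_dither(value, x, y, channel):
    """Quantize a [0, 1] value to 8 bits with dither applied first."""
    v = value + screen_space_dither(x, y)[channel]
    return max(0, min(255, round(v * 255.0)))
```

Adding this noise before quantizing to 8 bits per channel trades visible banding in smooth gradients for high-frequency noise the eye filters out.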
Alex Vlachos@AlexVlachos·
@EricWescott1 Only one frame needs to be critical to drop 2 levels. Max is used to decide to drop 2 levels when we're extrapolating frames, but we only extrapolate when a frame is above the extrapolation threshold. But you'll need someone at Valve to verify.

Eric Wescott@Wescott_v3·
@AlexVlachos I saw your GDC talk on adaptive quality and why you use linear extrapolation. I wasn't sure how exactly a parameter (a 0-1 real number) would factor into that calculation. As far as critical vs max, do you think they ended up having the same behavior of just -2 quality levels?

Eric Wescott@Wescott_v3·
@AlexVlachos Any chance you could shed some light on what exactly these dynamic resolution parameters for HLA, outlined in this Reddit post, do? reddit.com/r/OculusQuest/… Specifically: vr_fidelity_threshold_frame_percent_critical and vr_fidelity_threshold_frame_percent_extrapolation

Alex Vlachos@AlexVlachos·
@EricWescott1 Not sure if that's helpful. I wrote that code over 5 years ago. Remembering the details is difficult at this point, especially because I can't just look at the code.

Alex Vlachos@AlexVlachos·
@EricWescott1 Extrapolation is about guessing what the cost of the next frame will be. I used linear extrapolation to guess at the next frame's cost to preemptively drop one quality level when the extrapolated estimate exceeded that limit. Again, assuming I'm remembering correctly.

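The adaptive-quality behavior described in this thread can be sketched as follows. This is a toy illustration of the logic as stated in the tweets, not Valve's actual code: the budget, threshold names, and values here are illustrative assumptions, and the real Half-Life: Alyx parameters may differ.

```python
# Toy sketch of the adaptive-quality logic described in the thread:
# linearly extrapolate the next frame's GPU cost from the last two
# frames and preemptively drop quality levels. All constants below are
# illustrative assumptions, not the shipped Half-Life: Alyx values.

FRAME_BUDGET_MS = 11.1           # ~90 Hz frame budget
EXTRAPOLATION_THRESHOLD = 0.85   # only extrapolate above this fraction of budget
CRITICAL_THRESHOLD = 0.90        # one frame above this is enough to drop 2 levels

def quality_adjustment(prev_ms, last_ms):
    """Return how many quality levels to drop (0, 1, or 2) before the next frame."""
    # A single "critical" frame drops 2 levels immediately.
    if last_ms > CRITICAL_THRESHOLD * FRAME_BUDGET_MS:
        return 2
    # Only extrapolate when the last frame exceeded the extrapolation threshold.
    if last_ms > EXTRAPOLATION_THRESHOLD * FRAME_BUDGET_MS:
        predicted_ms = last_ms + (last_ms - prev_ms)  # linear extrapolation
        if predicted_ms > FRAME_BUDGET_MS:
            return 1                                  # preemptive single-level drop
    return 0
```

The point of extrapolating rather than reacting is to drop quality before a frame actually misses vsync, since a missed frame in VR is far more noticeable than a brief resolution drop.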
Alex Vlachos@AlexVlachos·
This is a quarantine gift from the gods! Can’t wait to make my way through all of the developer commentary in Half-Life: Alyx! Jason hired me at ATI 22 years ago and then recruited me to Valve 8 years later. Technical writing is his superpower. This is going to be amazing.

Alex Vlachos retweeted
UploadVR@UploadVR·
A new software update will increase the visual fidelity on the Reverb G2 and other WMR headsets, by improving corrections made for chromatic aberration and more. uploadvr.com/reverb-g2-visu…

Alex Vlachos@AlexVlachos·
My career started in 1997. I remember when we started putting heat sinks on GPUs before anyone called them GPUs. And let’s not forget about the QA guy who ran out of thermal paste and used a half-chewed Tootsie Roll instead. Spoiler alert: it didn’t work. Good times at ATI!
Universal Curiosity@UniverCurious
Graphics card evolution.

Alec Jacobson@_AlecJacobson·
Sure you can compute shadows in screen space or using ray tracing, but why not do it the hard way? Extrude all of the silhouette and contour edges into a shadow volume triangle mesh, mesh Boolean that against the original mesh, and finally shadow triangles via winding number.

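The final step in the pipeline above classifies geometry by winding number. As a minimal illustration of that test (the actual approach operates on triangles of a 3D shadow-volume mesh, not a 2D polygon), here is the standard 2D point-in-polygon winding-number computation:

```python
# Minimal 2D illustration of the winding-number test the tweet leans on:
# sum the signed angles subtended at a query point by consecutive
# polygon edges; a nonzero winding number means the point is enclosed.
import math

def winding_number(point, polygon):
    """Winding number of `polygon` (list of (x, y) vertices) around `point`."""
    px, py = point
    total = 0.0
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i][0] - px, polygon[i][1] - py
        x2, y2 = polygon[(i + 1) % n][0] - px, polygon[(i + 1) % n][1] - py
        # Signed angle between the two vectors via atan2(cross, dot).
        total += math.atan2(x1 * y2 - y1 * x2, x1 * x2 + y1 * y2)
    return round(total / (2.0 * math.pi))

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
# winding_number((1, 1), square) -> 1 (inside)
# winding_number((3, 3), square) -> 0 (outside)
```

Unlike an even-odd crossing test, the winding number stays robust for self-intersecting boundaries, which is why it pairs well with the messy output of mesh Booleans.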
Alex Vlachos@AlexVlachos·
Very excited to share that I joined Microsoft today as a Partner Architect in Mixed Reality (HoloLens, VR, etc.)! Been looking forward to this for a while! More details later.

Alex Vlachos@AlexVlachos·
After seeing Animusic's Pipe Dream at @siggraph 2001, they offered my team at ATI all their art assets to build a real-time demo that we shipped one year later for Radeon 9700. Now I learn this! @Intel built a fully functional robotic replica! Amazing! youtube.com/watch?v=JLdB0W…

Alex Vlachos@AlexVlachos·
@CMDannCA My other test scene was high res versions of left 4 dead zombies in their idle animation loops. I had about 10 characters in a circle with the player in the middle in a small room with no doors or windows. Some coworkers were not pleased.

Dann Blair ᯅ@CMDannCA·
@AlexVlachos I would have rendered a bunch of these guys. They used to scare me when I was younger haha.

Alex Vlachos@AlexVlachos·
Half-Life VR, fun fact. I was the first person to import and render a Half-Life 2 model in VR. It was Alyx’s giant robot dog in his idle animation. April 2014. I fell asleep that day with my HMD on. Woke up to a giant robot dog aggressively staring at me. Nearly crapped my pants!

Alex Vlachos@AlexVlachos·
@MattFiler @matttwood My kids were very young and still not sleeping through the night. I was sleep deprived and mildly hallucinating most of the time. Involuntary naps at my desk were the norm for a while.

Alex Vlachos@AlexVlachos·
@matttwood Yes I did. And it was only a few weeks after having an HMD at my desk. VR was still a very new concept. Waking up to that was frighteningly realistic at that point in time.