Alex Vlachos
185 posts

Alex Vlachos
@AlexVlachos
Real-Time Rendering / Games / Mixed Reality (Retired). Formerly at Valve, Meta, Microsoft, Naughty Dog, ATI, Spacetec.
Seattle, WA, USA · Joined March 2009
270 Following · 3.3K Followers

Valve artist @rich_lord gave a great talk at SIGGRAPH HIVE 2023 that every rendering dev should watch. Tons of inspirational moments for shader development. The fractal sequence starting 29 minutes in is impressive. youtu.be/R-S9d-qFRHM?si…

@swainrob No license. Zero patents. Originally written by Iestyn Bleasdale-Shepherd for Portal 2 with some minor tweaks for VR (original code was aimed at dithering Xbox 360). We shared it with the industry as part of our VR brain dump in 2015 at GDC. Free to use any way you want.
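For reference, the dithering code in question is the screen-space gradient-noise dither from Valve's GDC 2015 "Advanced VR Rendering" slides. The shipped version is a few lines of HLSL; the Python sketch below mirrors that style of dither. The constants are from memory of the published slide, so treat them as illustrative and verify against the deck:

```python
def screen_space_dither(x, y):
    """Per-pixel gradient-noise dither in the style of Valve's GDC 2015
    "Advanced VR Rendering" talk (slide 49). Constants are from memory of
    the published slide; treat them as illustrative.

    Returns a tiny (r, g, b) offset to add to a color *before* it is
    quantized to 8 bits, turning visible banding into unnoticeable noise."""
    d = 171.0 * x + 231.0 * y                  # cheap hash of the screen position
    offsets = []
    for divisor in (103.0, 71.0, 97.0):        # decorrelate the three channels
        noise = (d / divisor) % 1.0 - 0.5      # pseudo-random value in [-0.5, 0.5)
        offsets.append(noise / 255.0 * 0.375)  # scale to well under one 8-bit step
    return tuple(offsets)
```

The offset is deliberately smaller than one 8-bit quantization step, so it only perturbs which side of a rounding boundary a pixel lands on.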

.@AlexVlachos what is the license of the dithering code in your 2015 GDC talk? Slide 49: media.steampowered.com/apps/valve/201…

@thatJaneNg @Invertednormals @plumplumcurse @ErikRobson The advice I always give family and friends is to start with @JLCollinsNH book, "The Simple Path to Wealth". The first step is to watch this video: youtu.be/T71ibcZAX3I

@EricWescott1 Only one frame needs to be critical to drop 2 levels. Max is used to decide to drop 2 levels when we're extrapolating frames, but we only extrapolate when a frame is above the extrapolation threshold. But you'll need someone at Valve to verify.

@AlexVlachos I saw your GDC talk on adaptive quality and why you use linear extrapolation. I wasn't sure exactly how a parameter (a real number from 0 to 1) would factor into that calculation.
As for critical vs. max, do you think they ended up having the same behavior of just dropping 2 quality levels?

@AlexVlachos Any chance you could shed some light on what exactly these dynamic resolution parameters for HLA do, as outlined in this Reddit post:
reddit.com/r/OculusQuest/…
Specifically:
vr_fidelity_threshold_frame_percent_critical
vr_fidelity_threshold_frame_percent_extrapolation

@EricWescott1 Not sure if that's helpful. I wrote that code over 5 years ago. Remembering the details is difficult at this point, especially because I can't just look at the code.

@EricWescott1 Extrapolation is about guessing what the cost of the next frame will be. I used linear extrapolation to guess at the next frame's cost to preemptively drop one quality level when the extrapolated estimate exceeded that limit. Again, assuming I'm remembering correctly.
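The heuristic described in these replies can be sketched as a small controller. Everything here is a reconstruction from the thread, not Valve's actual code: the function name, the threshold defaults, and the "raise quality when comfortably under budget" branch are illustrative, and the two threshold parameters loosely correspond to cvars like `vr_fidelity_threshold_frame_percent_critical` and `vr_fidelity_threshold_frame_percent_extrapolation` mentioned above:

```python
def next_quality_level(level, frame_times_ms, budget_ms,
                       extrapolate_threshold=0.9, critical_threshold=1.0):
    """Sketch of the adaptive-quality heuristic described in the thread.

    - If the last frame exceeded the critical threshold, drop 2 quality levels.
    - Otherwise, if it exceeded the extrapolation threshold, linearly
      extrapolate the next frame's cost from the last two frames and drop
      1 level if the estimate would exceed the frame budget.
    - Otherwise, if comfortably under budget, raise quality by 1 level.
    Thresholds are fractions of the frame budget and purely illustrative."""
    last, prev = frame_times_ms[-1], frame_times_ms[-2]
    if last > critical_threshold * budget_ms:
        return level - 2                      # one critical frame is enough
    if last > extrapolate_threshold * budget_ms:
        predicted = last + (last - prev)      # linear extrapolation of cost
        if predicted > budget_ms:
            return level - 1                  # preemptively drop one level
    elif last < 0.7 * budget_ms:
        return level + 1                      # headroom: try higher quality
    return level
```

Note the asymmetry: dropping is aggressive (one bad frame acts immediately) while extrapolation only runs once a frame is already above the extrapolation threshold, matching the replies above.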
Alex Vlachos retweeted

A new software update will increase the visual fidelity on the Reverb G2 and other WMR headsets, by improving corrections made for chromatic aberration and more.
uploadvr.com/reverb-g2-visu…


Here is a short write-up of some visual quality improvements I’ve been working on the last few months at Microsoft for the HP Reverb G2
Alex Kipman@akipman
During development of the #HPReverbG2, #WindowsMixedReality had major updates to our visual quality. Learn more about it here: techcommunity.microsoft.com/t5/mixed-reali…

My career started in 1997. I remember when we started putting heat sinks on GPUs before anyone called them GPUs. And let’s not forget about the QA guy who ran out of thermal paste and used a half-chewed Tootsie Roll instead. Spoiler alert: it didn’t work. Good times at ATI!
Universal Curiosity@UniverCurious
Graphics card evolution.

@mcnabbd @_AlecJacobson Yes. The Animusic demo. About 18 years ago. It was a combination of statically carved shadows and dynamic stencil shadow volumes. I described the carved shadows in this talk at around slide 22: alex.vlachos.com/graphics/Vlach…

welcome to the team. very excited to collaborate with you on this journey
Alex Vlachos@AlexVlachos
Very excited to share that I joined Microsoft today as a Partner Architect in Mixed Reality (HoloLens, VR, etc.)! Been looking forward to this for a while! More details later.

After seeing Animusic's Pipe Dream at @siggraph 2001, they offered my team at ATI all their art assets to build a real-time demo that we shipped one year later for Radeon 9700. Now I learn this! @Intel built a fully functional robotic replica! Amazing! youtube.com/watch?v=JLdB0W…

@CMDannCA My other test scene was high res versions of left 4 dead zombies in their idle animation loops. I had about 10 characters in a circle with the player in the middle in a small room with no doors or windows. Some coworkers were not pleased.

@AlexVlachos I would have rendered a bunch of these guys. They used to scare me when I was younger haha.

@MattFiler @matttwood My kids were very young and still not sleeping through the night. I was sleep deprived and mildly hallucinating most of the time. Involuntary naps at my desk was the norm for a while.

@AlexVlachos @matttwood The prototype HMD looks like it weighed a ton, exactly how tired were you?!

@gabsfrmarqs @ValveNewsNetwor This was early VR development that led to Aperture Robot Repair and The Lab.

@matttwood Yes I did. And it was only a few weeks after having an HMD at my desk. VR was still a very new concept. Waking up to that was frighteningly realistic at that point in time.



