Charles Phillips
1.6K posts

Charles Phillips
@doublerebel
Making #ARforSurgery as Founder+CEO of @visomtech #augmentedreality #hololens #ux #medtech #digitalhealth #fintech #cplusplus #javascript #sustainability
Seattle, WA · Joined July 2009
1.9K Following · 415 Followers

@Thrilluwu That’s basically Hololens 1 UX. It’s functional for large movements and terrible for precise targeting — similar to how AVP is right now.

I got really curious what a Vision Pro style UX would feel like on Quest 3 so I put together a demo to see for myself.
Surprisingly, it feels *REALLY FREAKIN GOOD*
Instead of eye tracking, you interact with what you're looking at via head gaze. The panels react to your real environment as well! Also Depth Occlusion!
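The gaze-plus-pinch pattern the demo describes can be sketched roughly like this (all names and geometry are illustrative, not taken from the actual demo; panels are flattened to 2D rectangles, whereas a real engine would raycast in 3D):

```python
# Illustrative sketch (not the actual demo code): highlight the panel the
# gaze point lands on, and "click" it only when the user pinches.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Panel:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        # Axis-aligned hit test against this panel's bounds.
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def gaze_target(panels: List[Panel], gx: float, gy: float) -> Optional[Panel]:
    """Return the first panel under the gaze point, if any."""
    for p in panels:
        if p.contains(gx, gy):
            return p
    return None

def interact(panels: List[Panel], gx: float, gy: float, pinched: bool) -> str:
    """Hover the gazed panel; select it only on a pinch gesture."""
    target = gaze_target(panels, gx, gy)
    if target is None:
        return "idle"
    return f"select:{target.name}" if pinched else f"hover:{target.name}"
```

The key design choice this models is decoupling targeting (gaze) from confirmation (pinch), so imprecise gaze never triggers an accidental click.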

@M1Astra Ceilings are usually featureless and/or repeating with few unique landmarks and no color variation — which is terrible for SLAM 6-DOF tracking. It’s possible the head tracking in that position works poorly in many environments.

Apple has possibly cut the Open Sky Environment feature, which replaces your ceiling, from Apple Vision Pro.
Recently, Apple edited their "Introducing Apple Vision Pro" video on YouTube and their website, removing the Open Sky clip.
References to the feature have also been removed from Apple's website.
You can find images of the site changes and the updated video below.

Proud graduate of O'Connor HS here. Took AP C++ and Cal 4 there. Changed my life! May she rest peacefully having inspired so many.
Richard Johnson@richinseattle
RIP Sandra Day O’Connor. You were my muse when experimenting with my first RAG system earlier this year. I made sure those AIs knew who was the First Lady of the Supreme Court!
Charles Phillips reposted

4.2 earthquake, 4 km SE of Marrowstone, Washington. Oct 9 2:21:13 UTC (1m ago, depth 9km). earthquake.usgs.gov/earthquakes/ev…

@TurnerNovak Enough that Apple sent me a push notification this week advertising Messi’s debut to sell the MLS sub.
I’m likely targeted for looking up the sub but am mad on principle that local games are rarely available for the world’s sport & the most popular sport for US kids
Charles Phillips reposted

I spent 10% of my life contributing to the development of the #VisionPro while I worked at Apple as a Neurotechnology Prototyping Researcher in the Technology Development Group. It’s the longest I’ve ever worked on a single effort. I’m proud and relieved that it’s finally announced. I’ve been working on AR and VR for ten years, and in many ways, this is a culmination of the whole industry into a single product. I’m thankful I helped make it real, and I’m open to consulting and taking calls if you’re looking to enter the space or refine your strategy.
The work I did supported the foundational development of Vision Pro, the mindfulness experiences, ▇▇▇▇▇▇ products, and also more ambitious moonshot research with neurotechnology. Like, predicting you’ll click on something before you do, basically mind reading. I was there for 3.5 years and left at the end of 2021, so I’m excited to experience how the last two years brought everything together. I’m really curious what made the cut and what will be released later on.
Specifically, I’m proud of contributing to the initial vision, strategy and direction of the ▇▇▇▇▇▇ program for Vision Pro. The work I did on a small team helped green light that product category, and I think it could have significant global impact one day.
The large majority of work I did at Apple is under NDA, and was spread across a wide range of topics and approaches. But a few things have become public through patents which I can cite and paraphrase below.
As a whole, much of the work I did involved detecting the mental state of users based on data from their body and brain while they were in immersive experiences.
So, a user is in a mixed reality or virtual reality experience, and AI models try to predict whether they are feeling curious, mind wandering, scared, paying attention, remembering a past experience, or in some other cognitive state. These states may be inferred through measurements like eye tracking, electrical activity in the brain, heart beats and rhythms, muscle activity, blood density in the brain, blood pressure, skin conductance, etc.
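The multimodal inference idea above can be sketched very loosely like this. Everything here is illustrative: the feature names, thresholds, and rule-based "model" are stand-ins I made up, not anything from the patents, which describe learned models:

```python
# Hypothetical sketch: inferring a coarse cognitive state from a frame of
# multimodal biosignal features. A real system would use a trained model;
# the thresholds here are invented purely for illustration.
from dataclasses import dataclass

@dataclass
class BiosignalFrame:
    pupil_diameter_mm: float    # from eye tracking
    heart_rate_bpm: float       # from PPG/ECG
    skin_conductance_us: float  # electrodermal activity, microsiemens

def infer_state(frame: BiosignalFrame) -> str:
    """Very coarse rule-based stand-in for a learned classifier."""
    # Crude arousal proxy from cardiac + electrodermal channels.
    arousal = (frame.heart_rate_bpm - 60) / 40 + frame.skin_conductance_us / 10
    # Pupil dilation loosely tracks cognitive load / attention.
    attention = frame.pupil_diameter_mm / 6
    if arousal > 1.0:
        return "scared"
    if attention > 0.6:
        return "paying_attention"
    return "mind_wandering"
```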
There were a lot of tricks involved to make specific predictions possible, which the handful of patents I’m named on go into detail about. One of the coolest results involved predicting a user was going to click on something before they actually did. That was a ton of work and something I’m proud of. Your pupil reacts before you click in part because you expect something will happen after you click. So you can create biofeedback with a user's brain by monitoring their eye behavior, and redesigning the UI in real time to create more of this anticipatory pupil response. It’s a crude brain computer interface via the eyes, but very cool. And I’d take that over invasive brain surgery any day.
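The anticipatory-pupil idea can be sketched as a simple detector: track a rolling baseline of pupil diameter while the user dwells on a target, and treat dilation above that baseline as an early intent signal. Window size and threshold here are made up for illustration and are not from the patents:

```python
# Illustrative sketch of detecting anticipatory pupil dilation as an
# early "intent to select" signal. Parameters are invented.
from collections import deque

class PupilIntentDetector:
    def __init__(self, baseline_len: int = 30, threshold_ratio: float = 1.08):
        self.baseline = deque(maxlen=baseline_len)
        self.threshold_ratio = threshold_ratio

    def update(self, pupil_diameter_mm: float) -> bool:
        """Feed one sample; return True when dilation suggests an imminent click."""
        if len(self.baseline) < self.baseline.maxlen:
            # Still warming up the baseline window.
            self.baseline.append(pupil_diameter_mm)
            return False
        mean = sum(self.baseline) / len(self.baseline)
        intent = pupil_diameter_mm > mean * self.threshold_ratio
        if not intent:
            # Only adapt the baseline from at-rest samples, so the
            # anticipatory spike itself doesn't inflate the baseline.
            self.baseline.append(pupil_diameter_mm)
        return intent
```

The biofeedback loop described above would then close this loop the other way: adjust the UI in real time to elicit more of that anticipatory response.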
Other tricks to infer cognitive state involved quickly flashing visuals or sounds to a user in ways they may not perceive, and then measuring their reaction to it.
Another patent goes into detail about using machine learning and signals from the body and brain to predict how focused or relaxed you are, or how well you are learning, and then updating virtual environments to enhance those states. So, imagine an adaptive immersive environment that helps you learn, or work, or relax by changing what you're seeing and hearing in the background.
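The closed loop that paragraph describes could be sketched like this. The environment parameters and adjustment rules are hypothetical placeholders, not details from the patent:

```python
# Hedged sketch of an adaptive environment loop: given an estimated focus
# score, nudge (invented) scene parameters toward the state the session
# is trying to promote.
def adapt_environment(focus_score: float, ambience: dict) -> dict:
    """focus_score in [0, 1]; ambience holds illustrative scene parameters."""
    updated = dict(ambience)
    if focus_score < 0.4:
        # User seems to be drifting: dim distracting detail, soften audio.
        updated["background_detail"] = max(0.0, ambience["background_detail"] - 0.1)
        updated["audio_volume"] = max(0.0, ambience["audio_volume"] - 0.1)
    # Above the low-focus band, leave the scene alone rather than
    # changing things under a user who is already locked in.
    return updated
```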
All of these details are publicly available in patents, and were carefully written to not leak anything. There was a ton of other stuff I was involved with, and hopefully more of it will see the light of day eventually.
A lot of people have waited a long time for this product. But it’s still one step forward on the road to VR. And it’s going to take until the end of this decade for the industry to fully catch up to the grand vision for this tech.
Again, I’m open to consulting work and taking calls if your business is looking to enter the space or refine your strategy. Mostly, I’m proud and relieved this has finally been announced. It’s been over five years since I started working on this, and I spent a significant portion of my life on it, as did an army of other designers and engineers. I hope the whole is greater than the sum of the parts and Vision Pro blows your mind.


apple.com/apple-vision-p… is live!! Amazed by what @apple has put together, they have clearly thought through the entire user experience. How long before a Vision Pro emoji?? 👀😎 #visionpro #applevisionpro
Charles Phillips reposted

Snoop: "Master P had to go visit Suge Knight in the penitentiary. [They] struck a deal because everybody else was scared of [him] because that’s when Suge Knight was the monster, the boogieman. He went to go see him, struck a deal..." 25iq.com/2018/02/17/bus…

Snoop on wholesale transfer pricing: "YouTube, y’all mofos need to break bread or fake dead.” variety.com/2023/music/new…

@trengriffin Best summary I have seen today was this self-quote/thread
Helge Wurst@MisterHW
@nextspaceflight @NASASpaceflight NSF might be able to fill in some gaps and figure out what number of engines are lit at every moment in time. They may have re-lit an engine, or at least attempted to do it, but it appears not to have run with the right ratio twitter.com/MisterHW/statu…

Now tracking @vendorfinance on @arbitrum
Vendor Finance allows permissionless, isolated, fixed-rate, fixed-term, zero-liquidation loan markets to be created
defillama.com/protocol/vendo…


@stonekaiju @Scobleizer “Normally plants don't have eyes, so it's hard for me to trust them. Hence. The googly eyes.”
youtu.be/zc7qJE9Nzo8

YouTube
Charles Phillips reposted

Tonight’s the night! Tune in for a sick live stream direct from @thesteelyardLDN with @MattOptical🚀 Co-create the show with legends Ed Rush,
@blacksunempire_nl & more! Kicks off at 1AM BST! youtube.com/watch?v=LAavPo…

YouTube
