Jeffrey Shih

3K posts

@shihzy

Bay Area Texan | Ex a bunch of places

San Jose, CA · Joined November 2008

228 Following · 899 Followers
PunchesBears ㅎ㉨ㅎ@punchesbears·
the graphics debug options in UE are pretty amazing. These are just for Nanite. I'm learning C++. What is happening to me.
6 replies · 4 reposts · 109 likes · 12.9K views
Andy Touch (Same on 🟦⛅)
I hope the next Unity CEO opens Unity at least once a week and tries to make something with it. 🙏
25 replies · 20 reposts · 395 likes · 32.4K views
Jeffrey Shih@shihzy·
@LinkedInHelp for some reason, when I try to add a company under Company Name, the LinkedIn pages don’t show up. Can you help?
[image attached]
4 replies · 0 reposts · 3 likes · 61 views
Jeffrey Shih@shihzy·
@Jakob_Wahlberg @willgoldstone I load up all my tasks as calendar events. Sunday is the backlog. Sunday night, I move everything to an actual time block so I actually complete the task. Anything left over or not completed goes to the following Sunday. It also helps to put tasks in the cal so no one books that time.
0 replies · 0 reposts · 1 like · 30 views
Jakob Wahlberg@Jakob_Wahlberg·
Ok, there must be more efficiency nerds out there. Show me your efficient & clever workflows!
11 replies · 4 reposts · 35 likes · 11.2K views
Jeffrey Shih retweeted
Tony Fadell@tfadell·
When it comes to products, the story doesn't exist just to sell. It's there to help you define what you're working towards, help you understand it, and help you understand your customers.
9 replies · 23 reposts · 152 likes · 14.7K views
Jeffrey Shih@shihzy·
DeAndre Jordan has a championship ring. Think about that…
0 replies · 0 reposts · 0 likes · 113 views
Jeffrey Shih retweeted
Sterling Crispin 🕊️@sterlingcrispin·
Vision Pro mega-thread 1/5: My advice for designing and developing products for Vision Pro. This thread includes a basic overview of the platform, tools, porting apps, general product design, prototyping, perceptual design, business advice, and more.

Disclaimer: I'm not an Apple representative. This is my personal opinion and does not contain non-public information.

Overview: Apps on visionOS are organized into "scenes", which are Windows, Volumes, and Spaces. Windows are a spatial version of what you'd see on a normal computer: bounded rectangles of content that users surround themselves with. These may be windows from different apps or multiple windows from one app. Volumes are things like 3D objects or small interactive scenes, like a 3D map or a small game that's not immersive. Spaces are fully immersive experiences where only one app is visible. That could be full of many Windows and Volumes from your app, or like VR games where the system goes away and it's all custom content. You can think of visionOS itself as a Shared Space where apps coexist together and you have less control, whereas Full Spaces give you the most control and immersiveness but don't coexist with other apps. Spaces have immersion styles: mixed, progressive, and full, which define how much or how little of the real world you want the user to see.

User Input: Users can look at the UI and pinch, like the demo videos show. But you can also reach out and tap on windows directly, sort of like it's actually a floating iPad, or use a Bluetooth trackpad or video game controller. You can also look and speak into search bars, but that's disabled by default for some reason on existing iPad and iOS apps running on Vision Pro. There's also a Dwell Control for eyes-only input, but that's really an accessibility feature. For a simple dev approach, your app can just use events like a TapGesture; in this case, you won't need to worry about where these events originate from.
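The Windows / Volumes / Spaces split described above maps onto SwiftUI scene declarations. A minimal sketch, assuming hypothetical view names (ContentView, GlobeView, ImmersiveView):

```swift
import SwiftUI

@main
struct ExampleVisionApp: App {
    var body: some Scene {
        // A Window: a bounded 2D panel that coexists with other
        // apps in the Shared Space.
        WindowGroup(id: "main") {
            ContentView()
        }

        // A Volume: a bounded 3D region, also in the Shared Space.
        WindowGroup(id: "globe") {
            GlobeView()
        }
        .windowStyle(.volumetric)

        // A Space: a fully immersive experience where only this app
        // is visible. The immersion style (mixed / progressive / full)
        // controls how much of the real world the user still sees.
        ImmersiveSpace(id: "immersive") {
            ImmersiveView()
        }
        .immersionStyle(selection: .constant(.mixed),
                        in: .mixed, .progressive, .full)
    }
}
```

This is a sketch of the scene structure only; a real app would open the ImmersiveSpace explicitly (e.g. via the openImmersiveSpace environment action) rather than rely on it launching by default.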
Spatial Audio: Vision Pro has an advanced spatial audio system that makes sounds seem like they're really in the room by considering the size and materials of your room. Using subtle sounds for UI interaction and taking advantage of sound design for immersive experiences is going to be really important. Make sure to take this topic seriously.

Development: If you want to build something that works across Vision Pro, iPad, and iOS, you'll be operating within the Apple dev ecosystem, using tools like Xcode and SwiftUI. However, if your goal is to create a fully immersive VR experience for Vision Pro that also works on other headsets like Meta's Quest or PlayStation VR, you have to use Unity.

Apple Tools: For Apple's ecosystem, you'll use SwiftUI to create the UI the user sees and the overall content of your app. RealityKit is the 3D rendering engine that handles materials, 3D objects, and light simulations. You'll use ARKit for advanced scene understanding, like if you want someone to throw virtual darts and have them collide with their real wall, or do advanced things with hand tracking. But those rich AR features are only available in Full Spaces. There's also Reality Composer Pro, a 3D content editor that lets you drag things around a 3D scene and make media-rich Spaces or Volumes. It's like a Diet Unity built specifically for this development stack. One cool thing about Reality Composer Pro is that it's already full of assets, materials, and animations. That helps developers who aren't artists build something quickly and should help create a more unified look and feel across everything built with the tool. There are pros and cons to that product decision, but overall it should be helpful.

Existing iOS Apps: If you're bringing an iPad or iOS app over, it will probably work unmodified as a Window in the Shared Space. If your app supports both iPad and iPhone, it'll look like the iPad version.
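The SwiftUI-plus-RealityKit split above can be sketched with visionOS's RealityView, which embeds RealityKit content inside a SwiftUI view. A minimal sketch; the sphere and its size are illustrative, not from the thread:

```swift
import SwiftUI
import RealityKit

struct GlobeView: View {
    var body: some View {
        // RealityView bridges SwiftUI (the UI layer) and RealityKit
        // (the 3D rendering engine that handles meshes, materials,
        // and lighting).
        RealityView { content in
            // Procedurally generate a simple sphere entity with a
            // metallic material; a real app would more likely load
            // a scene authored in Reality Composer Pro.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.15),
                materials: [SimpleMaterial(color: .blue, isMetallic: true)]
            )
            content.add(sphere)
        }
    }
}
```

To make the sphere respond to look-and-pinch taps, the entity would additionally need InputTargetComponent and CollisionComponent set on it; bare RealityKit entities are not hit-testable by default.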
You can use the Ornament API to make little floating islands of UI in front of, or beside, your app to make it feel more spatial. But that's not something all existing apps get automatically. Ironically, if your app uses a lot of ARKit features, you'll likely need to 'reimagine' it significantly, as ARKit has been upgraded a lot. If you're excited about building something new for Vision Pro, my personal opinion is that you should prioritize how your app will provide value across iPad and iOS too. Otherwise you're losing out on hundreds of millions of users.

Unity: You can build for Vision Pro with the Unity game engine, which is a massive topic. Again, you need to use Unity if you're building for Vision Pro as well as a Meta headset like the Quest or PSVR. Unity supports building Bounded Volumes for the Shared Space, which exist alongside native Vision Pro content, and Unbounded Volumes for immersive content that may leverage advanced AR features. Finally, you can also build more VR-like apps, which give you more control over rendering but seem to lack support for ARKit scene understanding like plane detection. The Volume approach gives RealityKit more control over rendering, so you have to use Unity's PolySpatial tool to convert materials, shaders, and other features. Unity support for Vision Pro allows for tons of interactions you'd expect to see in VR, like teleporting to a new location or picking up and throwing virtual objects.
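The Ornament API mentioned above attaches accessory UI just outside a window's bounds. A minimal sketch; the playback buttons are illustrative placeholders:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Main window content")
            .padding()
            // An ornament: a floating island of UI anchored to the
            // bottom edge of this window, slightly in front of it.
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("Play") { /* hypothetical action */ }
                    Button("Pause") { /* hypothetical action */ }
                }
                .padding()
                // Standard visionOS glass material for floating UI.
                .glassBackgroundEffect()
            }
    }
}
```

As the thread notes, existing iPad apps do not get ornaments automatically; this modifier has to be adopted per view.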
[image attached]
73 replies · 390 reposts · 2.3K likes · 1.2M views
Jeffrey Shih@shihzy·
I think the most impressive feature of the Apple Vision Pro is the Optic ID. Periocular authentication is such a great benefit for users especially for headsets. Insanely difficult to build.
0 replies · 0 reposts · 0 likes · 95 views
Jeffrey Shih@shihzy·
@HarriVayrynen Yeah, I think everyone is trying to create the "working in VR" use case. But they miss the bigger point that most people don't like to work 😂
0 replies · 0 reposts · 0 likes · 8 views
Harri Väyrynen@HarriVayrynen·
@shihzy Looking at my Mac screen with them. Sounds like a very stupid idea ;)
1 reply · 0 reposts · 0 likes · 16 views
Jeffrey Shih@shihzy·
Ok - let’s see how Apple Vision Pro works IRL before we all go bananas…
2 replies · 0 reposts · 0 likes · 136 views
Jeffrey Shih@shihzy·
Actually it looks pretty amazing. So much goodness (design, content strategy, sync with all devices)
0 replies · 0 reposts · 0 likes · 49 views
Jeffrey Shih@shihzy·
Current passport renewal time is 10-13 weeks, but they cashed my check in 1 day 🤔
0 replies · 0 reposts · 0 likes · 111 views