Expo config plugins = a clean way to customize your native projects without touching native code.
Works great with CNG — update AndroidManifest.xml, Info.plist, icons, and more via JS.
📘 Updated intro guide here: docs.expo.dev/config-plugins…
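The pattern itself is tiny: a plugin is just a function that takes the Expo config and returns a modified copy. A minimal sketch (the permission and file name here are made up for illustration; plugins that edit native files like AndroidManifest.xml directly would use mods such as `withAndroidManifest` from `expo/config-plugins`):

```javascript
// app.plugin.js -- hypothetical plugin that adds an Android permission.
// A config plugin is just (config) => config; Expo applies it during prebuild.
const withCameraPermission = (config) => {
  config.android = config.android ?? {};
  config.android.permissions = [
    ...(config.android.permissions ?? []),
    'android.permission.CAMERA',
  ];
  return config;
};

module.exports = withCameraPermission;
```

Reference it from app.config.js via `plugins: ['./app.plugin.js']` and the change lands in the generated native project on the next `npx expo prebuild`.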
🚀 Launching a new blog series sharing actionable solutions from my dev projects! First up: Building Scalable Design Systems with TanStack Forms.
Struggling with custom form fields that play nicely with your component library?
Hey, I've tried every kind of sport. Now I feel like taking up running, but my AirPods keep falling out, and the Sony ones I have say they're not made for exercise.
Any recommendations?
share contacts for your solar energy dealers in CDMX, because if I start researching how to solve it myself I won't be able to stop and I'll just give myself another addiction
@anamariasosam It's happened to me that the default folder has a huuuge number of files, which makes it load slowly. Also, if some program there has file previews enabled, that ends up slowing everything down. Even so, it sounds like it's caused by some extension that got installed.
Help, dev friends:
I have a component that lets the user add files, but in Edge the modal for selecting files (the browser's native one) is taking forever to open.
How can we test the fix? My Edge loads super fast 🥺
Want a chance to win 1 full bitcoin? Yeah, you do. Repost and reply with #WealthsimpleCrypto and #contest to be entered into our biggest giveaway ever.
Not every foundation model needs to be gigantic. We trained a 1.5M-parameter neural network to control the body of a humanoid robot. It takes a lot of subconscious processing for us humans to walk, maintain balance, and maneuver our arms and legs into desired positions. We capture this “subconsciousness” in HOVER, a single model that learns how to coordinate the motors of a humanoid robot to support locomotion and manipulation.
We trained HOVER in NVIDIA Isaac, a GPU-powered simulation suite that accelerates physics to 10,000x faster than real time. To put that number in perspective: the robots undergo 1 year of intense training in a virtual "dojo", yet it takes only ~50 minutes of wall-clock time on a single GPU. The neural net then transfers zero-shot to the real world without finetuning.
HOVER can be *prompted* for various types of high-level motion instructions that we call “control modes”. To name a few:
- Head and hand poses: captured by XR devices like Apple Vision Pro.
- Whole-body poses: from MoCap or an RGB camera.
- Whole-body joint angles: from an exoskeleton.
- Root velocity commands: from joysticks.
What HOVER enables:
- A unified interface for controlling the robot with whichever input devices are at hand.
- An easier way to collect whole-body teleoperation data for training.
- An upstream Vision-Language-Action model to provide motion instructions, which HOVER translates to low-level motor signals at high frequency.
HOVER supports any humanoid that can be simulated in Isaac. Bring your own robot, and watch it come to life!
It's a big team effort from NVIDIA GEAR Lab and collaborators: 🧵