The fact is that research papers are mostly tested on the very top tier of GPUs available the year they are published (and since publication takes time, the hardware is effectively even further ahead).
Is it because researchers only work with recent high-end #GPUs that everybody has to renew their hardware so fast, or is it the other way around?
Causality is hard to prove, but the correlation is a fact! And we have plenty more figures to show at our #SIGGRAPH2025 talk next summer!
Apply now to be part of the 2025 cohort of rising stars in computer graphics 🌟 This is a mentoring program with workshops co-located with SIGGRAPH, for students and post-docs of under-represented genders in CG research.
Deadline for submissions: Apr 18
wigraph.org/events/2025-ca…
Hi all, I’m on the job market for industry research scientist or TT faculty positions starting Summer 2025.
Interested in roles related to HCI, XR, Eye Tracking, Adaptive Interfaces, and Human-AI Interaction. Please reach out if hiring or aware of any positions!
RT appreciated!
Work done with @fanny_uoft, @karansher, and Adrien Bousseau.
This wouldn't have been possible without precious insights and time invested by our study participants! The @QuillSmoothstep community is lovely, thank you so much!
By letting artists work on a stack of appearance layers that define colors on top of a substrate layer that defines geometry, we can get many cool things!
✔️ non-destructive editing
✔️ color effects (blend modes, opacity gradients)
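To make the layered idea concrete, here's a much-simplified sketch of how appearance layers could composite over a substrate color. The function names, layer fields, and the two blend modes are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of non-destructive appearance layers over a substrate.
# Layer fields, blend modes, and compositing order are illustrative only.

def blend(base, layer_color, opacity, mode="normal"):
    """Composite one appearance layer over a base RGB color (0..1 floats)."""
    if mode == "normal":
        top = layer_color
    elif mode == "multiply":
        top = tuple(b * c for b, c in zip(base, layer_color))
    else:
        raise ValueError(f"unknown blend mode: {mode}")
    # Linear interpolation by opacity keeps the edit non-destructive:
    # the base color is never overwritten, only blended over.
    return tuple((1 - opacity) * b + opacity * t for b, t in zip(base, top))

def composite(substrate_color, layers):
    """Apply a stack of appearance layers bottom-to-top; editing or
    reordering a layer never touches the substrate geometry."""
    color = substrate_color
    for layer in layers:
        color = blend(color, layer["color"], layer["opacity"], layer["mode"])
    return color

# A red substrate stroke tinted by two independently editable layers:
stack = [
    {"color": (1.0, 1.0, 0.0), "opacity": 0.5, "mode": "normal"},
    {"color": (0.5, 0.5, 0.5), "opacity": 1.0, "mode": "multiply"},
]
print(composite((1.0, 0.0, 0.0), stack))  # → (0.5, 0.25, 0.0)
```

Because each layer is stored separately from the substrate, tweaking a blend mode or opacity later just re-runs the composite — nothing is baked in.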
That issue stems from the fact that brush strokes in Quill define both the 3D geometry and the color to be rendered.
Our key idea is to decouple geometry and color, such that these two components can be edited independently with stroke-based interactions.
A specific pain point we identified was how to edit the colors of 3D strokes.
In Quill, you have to either apply new colors as vertex colors -- a bit blurry and hard to control...
VR painting tools like Quill are great -- love that software! -- but they're still in the early days of defining what a painting experience in 3D could be.
We got the chance to chat with some brilliant Quill artists to understand VR painting workflows better. Turns out, in VR painting you can essentially paint all light and shadow effects, and it looks great.
Here's a beautiful example by @NickTheLadd: skfb.ly/owFLu
We could not release the full pipeline to convert your own video clips to video+cameras+depth, but we do describe our internal format pretty extensively. We also provide all preprocessed videos from our paper's results to try out the system!
Want to draw a face on inanimate objects, or arms on a flamingo? 🎨
I’m happy to share our latest research project, where we looked at how we can assist users in creating video doodles: hand-drawn animations added on top of videos 📽️
#SIGGRAPH2023
Time to share our #SIGGRAPH2024 paper "N-Dimensional Gaussians for Fitting of High Dimensional Functions"!
We show that N-Dimensional Gaussian mixtures can be optimized to handle high-dimensional inputs like MLPs can, at a fraction of the training time.
sdiolatz.info/ndg-fitting/
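For flavor, here's a heavily simplified sketch of the underlying idea: a mixture of Gaussians used as a function approximator for a higher-dimensional input. To keep it tiny, only the per-Gaussian weights are fit (a linear least-squares problem) with fixed centers and an isotropic width; the actual method optimizes full N-dimensional Gaussian parameters with gradient descent. Every name and number below is an illustrative assumption:

```python
import numpy as np

# Toy sketch: fit a weighted Gaussian mixture to a 4D function.
# Only the weights are solved for here (linear least squares);
# the paper's method optimizes the Gaussians themselves.

rng = np.random.default_rng(0)

def target(x):
    # A smooth 4D toy function standing in for a high-dimensional signal.
    return np.sin(x.sum(axis=1))

d, n_gauss, n_samples = 4, 64, 2000
X = rng.uniform(-1.0, 1.0, size=(n_samples, d))    # training inputs
means = rng.uniform(-1.0, 1.0, size=(n_gauss, d))  # fixed Gaussian centers
sigma = 0.5                                        # fixed isotropic std dev

def basis(X):
    # Evaluate every Gaussian at every input: shape (n_samples, n_gauss).
    sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dist / (2.0 * sigma**2))

# Solve for mixture weights minimizing the squared fitting error.
weights, *_ = np.linalg.lstsq(basis(X), target(X), rcond=None)

# Check generalization on held-out inputs.
X_test = rng.uniform(-1.0, 1.0, size=(500, d))
err = np.abs(basis(X_test) @ weights - target(X_test)).mean()
print(f"mean abs error: {err:.4f}")
```

Even this crude variant hints at why Gaussians are appealing here: evaluating the mixture is a handful of vectorized ops, with no deep network forward pass.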