
Congratulations to the research team at our sister company @eyelinestudios! Their latest research paper, 🌊 Go-with-the-Flow 🌊, will be presented at #CVPR2025!
Based on their research, we believe artists could one day use these techniques to direct the motion in generated videos, empowering creative control across a wide range of video applications: cut-and-drag animation, motion transfer between videos, first-frame editing, camera control via depth warping, and text-to-video 3D scene creation.
Kudos to the amazing team: @RyanBurgert, @Yuancheng_Xu0, @wenqi_xian, Oliver Pilarski, Pascal Clausen, Mingming He, @maleewahaha, @yitong_deng, Lingxiao Li, Mohsen Mousavi, @ryoo_michael, @debfx, @realNingYu, from @eyelinestudios, @Scanline_VFX, @netflix, Stony Brook University, University of Maryland, and @Stanford.
This is part of the ongoing research and development at @eyelinestudios, and we hope to see these techniques and workflows adopted soon.
Paper: arxiv.org/pdf/2501.08331
Web: eyeline-research.github.io/Go-with-the-Fl…
Code: github.com/Eyeline-Resear…
Models: huggingface.co/Eyeline-Resear…
#MachineLearning #video #VideoGeneration #DiffusionModels #VideoDiffusionModels #OpenSource