Nithya Nadig Shikarpur

58 posts

@NithyaIsMe

interested in music + technology research | Hindustani music student | PhD student MIT

Joined October 2019
727 Following · 538 Followers
Nithya Nadig Shikarpur @NithyaIsMe
This is happening tomorrow from 9 am to 12 pm and 1 pm to 4 pm in East Ballroom C! You can come and interact with our model too!! 🎤 Hope to see you there :) #NeurIPS2024
Nithya Nadig Shikarpur @NithyaIsMe

We studied some musicians interacting with GaMaDHaNi, a generative model for Hindustani vocal music 🎤! I am so excited to present this work at @NeurIPSConf @ML4CDworkshop this week! 📝arxiv.org/abs/2411.13846 🎧cedar-decade-974.notion.site/Example-Videos… An example interaction with the model:

Nithya Nadig Shikarpur @NithyaIsMe
While the model opens exciting directions for creative exploration and human-AI partnership, this work documents musicians' experiences to inform the model's future development! More information on the generative model: x.com/NithyaIsMe/sta…
Nithya Nadig Shikarpur @NithyaIsMe

We built a hierarchical generative model to sing Hindustani vocal melodies 🎤! We will be presenting this work at ISMIR 2024! @ISMIRConf 📝Paper: arxiv.org/abs/2408.12658 💻Code: github.com/snnithya/GaMaD… 👩🏽‍💻Demo: huggingface.co/spaces/snnithy… 🎧Samples: snnithya.github.io/gamadhani-samp…

Nithya Nadig Shikarpur @NithyaIsMe
@0xhexhex Hi, we used two open source datasets collected by folks at @mtg_upf: Saraga and Hindustani Raga Recognition Dataset. Let me know how the tinkering goes!
Claire 0xCAFE_BABE @0xhexhex
@NithyaIsMe Fantastic! What datasets did you use? And are these available for research? I'd love to tinker with stuff like raga classification, etc. Thanks!
Nithya Nadig Shikarpur @NithyaIsMe
With interactive human-AI generation in mind, we present two possible use cases for our model: (1) primed generation and (2) coarse pitch conditioning. Learn more and play around with these interactions in our demo (huggingface.co/spaces/snnithy…)!
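As a rough illustration of those two interaction modes, here is a toy sketch — all names are hypothetical, and `toy_next_pitch` is a stand-in for the learned model, not the actual GaMaDHaNi API:

```python
import numpy as np

def toy_next_pitch(history):
    # Stand-in for a learned model: predicts the next pitch value
    # as a smooth continuation of the contour so far.
    return 0.9 * history[-1] + 0.1 * history[-2]

def primed_generation(prime, n_steps):
    """(1) Primed generation: seed the model with a user-sung pitch
    contour and sample the continuation autoregressively."""
    contour = list(prime)
    for _ in range(n_steps):
        contour.append(toy_next_pitch(contour))
    return contour

def coarse_pitch_conditioning(coarse, fine_len):
    """(2) Coarse pitch conditioning: upsample a rough user-drawn
    contour to a fine time resolution to use as a conditioning signal."""
    xs = np.linspace(0, len(coarse) - 1, fine_len)
    return np.interp(xs, np.arange(len(coarse)), coarse)
```

In the real system the continuation and the fine contour would come from the generative model; the sketch only shows the shape of the two interactions (continue-from-a-prime vs. fill-in-under-a-coarse-constraint).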
Nithya Nadig Shikarpur retweeted
Arkil Patel @arkil_patel
🚨Understanding In-Context Learning: 1. Pretrained LLMs can implement learning algorithms to learn from data in-context. 2. Transformers can encode multiple algorithms for the same task and use one based on context at inference time. 3. Attention-free models also exhibit ICL.
Satwik Bhattamishra @satwik1729

Recently, Transformers have been shown to implement learning algorithms in-context. Key questions: What are their limits? Can they exploit informative examples to learn more efficiently? How does this relate to pretrained LLMs? Our new preprint explores these questions. 🧵

Nithya Nadig Shikarpur @NithyaIsMe
@the_smg97 @arvshank Yeah that makes sense! So are you saying more discrete changes in speed vs. the current continuous change or just increasing the wavelength of the sine wave to make the change in speed slower and thus less obvious? Or maybe both?
Nithya Nadig Shikarpur @NithyaIsMe
Today @snpranav and I got into an argument over how frustrating it would be for the speed of a song to constantly change as a function of time. To settle it, I decided to use Bespoke to control the speed of 'Spain' by Chick Corea based on a modified sine wave :)
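A minimal sketch of that kind of control signal, assuming a simple sine-based rate curve (the function and parameter names are mine, not Bespoke's):

```python
import math

def playback_rate(t, base=1.0, depth=0.25, period=30.0):
    """Playback-speed multiplier as a function of elapsed time t (seconds):
    oscillates between base - depth and base + depth once per period."""
    return base + depth * math.sin(2 * math.pi * t / period)
```

Stretching `period` (a longer wavelength) or shrinking `depth` makes the speed drift slower and subtler; shrinking `period` makes it aggressively obvious.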