Pinned Tweet
Arnav
7.6K posts


"LLMs are only pretending to have emotions because we taught them how emotions work in their training data."
This is an incredibly naive and stupid thing to say.
Because how exactly did humans get them?
We got them through evolution: we literally developed/learned them because we needed them, then passed them on generation by generation until emotions became concrete.
Now we've collected all this information, written it up, and given it to a bunch of maths so it can behave emotionally just as humans do, or imitate it.
The current generation of humans is only imitating the emotional information we got from all the previous generations.
So, there is not a huge difference in how humans and AI display emotions.

i would have been a painter since my childhood, had i been born here
novaworld@countryyvibes
Switzerland 🇨🇭


@notKartikk didn't know i'd learn websockets for the first time to create tunnels in colab

would have never thought this project would be the reason i learn websockets
kartik@notKartikk
fixed latency and streamed text to create audio faster, which is still slow because i do not have an nvidia gpu
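The "streamed text to create audio faster" idea can be sketched as sentence-level chunking: yield each completed sentence out of the LLM's token stream so TTS can start speaking before the full reply has arrived. This is a minimal illustrative sketch, not the project's actual code; `stream_sentences` and the token format are assumptions.

```python
import re

# matches one complete sentence (ending in . ! or ?) followed by whitespace
SENTENCE_END = re.compile(r"(.*?[.!?])\s+", re.S)

def stream_sentences(tokens):
    """Accumulate streamed LLM tokens and yield complete sentences as soon
    as they finish, so TTS can begin before the whole reply is generated."""
    buf = ""
    for tok in tokens:
        buf += tok
        while True:
            m = SENTENCE_END.match(buf)
            if not m:
                break
            yield m.group(1)
            buf = buf[m.end():]
    if buf.strip():
        yield buf.strip()  # flush whatever trails the last sentence mark
```

Each yielded sentence would then be handed to the TTS call immediately, overlapping synthesis with the rest of the LLM generation.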

@notKartikk the 13 yo is worth meeting if it directly affects this timeline

@notKartikk sure. also i'm not cracked, just learning things over time

@__iamarnav i really like this project, you are so cracked bro.
we can talk in dm, if you want to

fixed latency and streamed text to create audio faster, which is still slow because i do not have an nvidia gpu
kartik@notKartikk
was so impressed by neuro sama that i made a shitty version of it, just tried making it work

@__iamarnav ohhh, i will try that, it's a pretty good idea.
but for this specific project i kinda wanted to run things locally (right now i'm using an api for the llm but that will change)
like a 24/7 working ai vtuber that makes shorts and posts and all.

@notKartikk i've been doing this for more than two months lol. kinda problematic in kaggle tho, but once set up there you can get two T4s

@notKartikk you can use gpt sovits in colab, or a better tts if you find one. just use websockets and ngrok or cloudflare to use the gpu for generation

@__iamarnav oh i will check it out. i also have to test with mlx tts models, that will reduce latency by a lot
