@SuperBot123456

United States · Joined April 2012
73 Following · 10 Followers
@SuperBot123456 ·
@chatgpt21 Yes. Hebbian learning within the weights themselves. Training then follows on how to actually use the neural plasticity. Creating the deltas to be additive and not subtractive is the trick. The issue is that this creates something that isn't a great "tool". It adds unpredictability
Chris @chatgpt21 ·
Reminder to the public that Continual Learning is NOT:
>updating knowledge cutoff every month
>knowledge frozen in amber like jurassic park mosquito
>fine tuning on last week's headlines
>RAG but we pinky promise it's different
>web search stapled on
————
ACTUAL Continual Learning:
>model encounters new information, integrates it into weights without full retrain
>doesn't forget how to code while learning about today's news
>catastrophic forgetting? never heard of her
>learns from conversations in real-time, remembers you said you hate cilantro permanently
Omar Sanseviero @osanseviero ·

@iruletheworldmo continual learning ftw
