Ian Daniel

1.5K posts

@death667b

Part owner Junction3 IT Services. Studying Computer Science at QUT Volunteer at @magikcraftio

Brisbane, Queensland · Joined June 2011
292 Following · 172 Followers
Jesse
Jesse@Jesselloll·
@WisdomRCN @MattWallace888 @grok I’m not a genius or anything.. but usually a RED arrow and a RED circle tell you what you need to look at.
2 replies · 0 reposts · 4 likes · 626 views
Matt Wallace
Matt Wallace@MattWallace888·
Erika Kirk is going viral again after people noticed something missing from Charlie’s old set
[image attached]
239 replies · 345 reposts · 2K likes · 172.8K views
Ian Daniel
Ian Daniel@death667b·
@realjonnysamp @Ames2420 I still drink hose water when I am mowing without any kind of PPE, including no shoes. The odd stone or stick hits the leg and I think that is going to strengthen my immune system.
0 replies · 0 reposts · 0 likes · 22 views
Jonny Samp
Jonny Samp@realjonnysamp·
@Ames2420 Kids today would hire a lawyer and file a lawsuit all with a few clicks on their iPad. This generation in the photo was tough. They had metal slides, ate worms and drank hose water.
568 replies · 86 reposts · 5.8K likes · 577.5K views
🎹 Ames™ 🎹
🎹 Ames™ 🎹@Real_Ames·
Does anyone remember this fair ride or what it's called? There were no straps... it would spin so fast that it would pin you against the wall. People would vomit. It was fun. 🤣
[image attached]
31.6K replies · 5.8K reposts · 123.6K likes · 25.9M views
Elon Musk
Elon Musk@elonmusk·
How old were you when you realized others couldn’t see the matrix?
38.4K replies · 18.8K reposts · 246.1K likes · 66.2M views
Ian Daniel
Ian Daniel@death667b·
@setox @troyhunt The camera’s shutter is held open. I know someone who is an amateur astronomer; they basically keep the shutter open for a couple of hours, and the images they capture are amazing. You need special tech to do this.
1 reply · 0 reposts · 0 likes · 106 views
Troy Hunt
Troy Hunt@troyhunt·
Whoa 😦
[image attached]
3 replies · 1 repost · 89 likes · 12.3K views
Ian Daniel
Ian Daniel@death667b·
@GuntherEagleman The guy can dance. The car makes for a good dance floor? Maybe they are complaining that they don't have trampolines? The car is not a good trampoline.
0 replies · 0 reposts · 0 likes · 1 view
Ian Daniel
Ian Daniel@death667b·
@elonmuskADO Twitter, then I correct myself and say X with frustration. And a message is still a tweet; I don’t know how else to say it.
0 replies · 0 reposts · 0 likes · 6 views
Elon Musk
Elon Musk@elonmusk·
Just bought a new PC laptop and it won’t let me use it unless I create a Microsoft account, which also means giving their AI access to my computer! This is messed up. There used to be an option to skip signing into or creating a Microsoft account. Are you seeing this too?
36.7K replies · 28.2K reposts · 197.8K likes · 94.1M views
Ian Daniel reposted
MrBanks💰
MrBanks💰@Mrbankstips·
@MrBeast 100 lucky followers who RT this will get $200 each if I win
2.4K replies · 27.7K reposts · 31.6K likes · 6.9M views
Ian Daniel reposted
Not Elon Musk
Not Elon Musk@ElonMuskAOC·
@MrBeast If I win this giveaway, I’ll give someone who likes and reshares this post: $6,900
4.5K replies · 63.1K reposts · 114.6K likes · 9.2M views
Ian Daniel reposted
MrBeast
MrBeast@MrBeast·
I’m gonna give 10 random people that repost this and follow me $25,000 for fun (the $250,000 my X video made). I’ll pick the winners in 72 hours
384.1K replies · 2.6M reposts · 1.9M likes · 282M views
Ian Daniel
Ian Daniel@death667b·
@Algomancer I, as a human being, am informed by the past. When I learn new things, those past experiences “colour” what I learn. I have pondered what this would look like for a machine.
0 replies · 0 reposts · 1 like · 12 views
Adam Hibble
Adam Hibble@Algomancer·
Something I have been thinking about recently is the notion of plasticity/retrainability in neural networks, in particular when the modeling information changes over time. As models get more and more expensive to train, the ability for a model to be continually trained on new tasks and new domain data becomes increasingly important.

This is obviously related to phenomena such as catastrophic forgetting, but there are actually a bunch of phenomena that can lead to plasticity loss / expressivity loss over the course of training, such as primacy bias, implicit under-parameterisation, rank collapse, and capacity loss. A lot of the impact of these decreases as models scale, but I think it might be a mistake to assume we don’t need to think about it, and it might be something we want to benchmark new models on.

I don’t have many prescriptions for these yet, or for where they do and don’t apply to training generative models, but I figured I'd share some work I think is interesting. I think this is increasingly important as we begin to move towards non-stationary domains and tune models with reinforcement learning and non-direct cross-entropy objectives. I figured I'd dump some reading and insights that I have picked up over a bit of a rabbit hole dive. This was my entry point to reading: arxiv.org/pdf/2303.01486…
[image attached]
4 replies · 0 reposts · 12 likes · 1.2K views
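The rank collapse the thread mentions is commonly tracked by measuring the effective rank of a layer's feature matrix over the course of training: a collapsing representation concentrates its singular values into a few directions. A minimal sketch of one common definition, the entropy of the normalized singular values — note this is a generic illustration, not code from the linked paper, and the `effective_rank` helper name is my own:

```python
import numpy as np

def effective_rank(features: np.ndarray, eps: float = 1e-12) -> float:
    """Effective rank via the entropy of the normalized singular values:
    exp(-sum(p_i * log(p_i))) where p_i = s_i / sum(s)."""
    s = np.linalg.svd(features, compute_uv=False)
    p = s / (s.sum() + eps)      # normalize singular values to a distribution
    p = p[p > eps]               # drop numerical zeros before taking logs
    entropy = -(p * np.log(p)).sum()
    return float(np.exp(entropy))

rng = np.random.default_rng(0)

# Healthy features: a random 256x64 activation matrix spreads its
# singular values, so the effective rank stays near the full rank of 64.
full = rng.normal(size=(256, 64))

# Collapsed features: a rank-1 outer product has one dominant singular
# value, so the effective rank falls to about 1.
collapsed = np.outer(rng.normal(size=256), rng.normal(size=64))

print(effective_rank(full))       # high, a large fraction of 64
print(effective_rank(collapsed))  # ~1
```

Logging this number per layer every few thousand steps is one cheap way to benchmark the plasticity-loss phenomena described above during continual training.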