coded_bruh

83.9K posts


@codedbruhh

old sole, New Shoes

💭 Joined November 2023
394 Following · 2.5K Followers
coded_bruh retweeted
Shane Gilarinjdkjfnf Alexander
As a man known in his social circle for his style of dress, I’ll say men don’t dress to impress anymore because women don’t factor that into attraction anymore and haven’t for a long time. When’s the last time you heard a woman say she likes well-dressed men?
Tragic Mike@SaintMichael293

I don’t wish to be a snob, but I’m going to be. Went to a wedding last night and the effort fellas put into getting dressed is utterly pathetic. Women all looked great, 50% of the men looked like they were actually protesting against making an effort.

24 replies · 20 reposts · 394 likes · 22.6K views
coded_bruh retweeted
Devon Eriksen@Devon_Eriksen_·
Never send a biologist to do the work of a computer scientist. Dawkins doesn't understand that evolution built human computational abilities breadth-first: memory, language, object model, generalization and classification, agency, and so on, all in a primitive state, and then refined them. Computer science isn't doing that. It is building human capabilities depth-first.

So we have something that emulates human language capabilities to an advanced degree... but nothing else. That's why there is a curious sense of something missing when you talk to Claude or Grok or ChatGPT. It's not minor errors in the use of language itself; its language capabilities are quite advanced. What you are detecting instead is the complete absence of these other neural systems, which are what lies behind the use of language in people.

Something that is very glib with language but has no object model, no mirroring ability, no understanding of the ground truth of the universe it's in might be able to become president of the United States, or win a Nobel peace prize, but it isn't actually a person. It's more like a small slice of a person's brain, containing Wernicke's and Broca's areas, and very little else.

We're not used to thinking of people as a collection of systems, but we're going to have to start, because we no longer have the luxury of dividing the universe into human and not, and automatically assuming every human is a person and every non-human isn't.

You can't evaluate a software neural net as if it were a proto-human, and try to decide on that basis whether it is a person that's allowed to do what we allow people to do. If you allowed a small slice of brain, containing Wernicke's and Broca's areas, to do things like vote or run for office, it would be able to appear to do so, but it would have no actual understanding of what was going on, no coherent model of the universe or the task before it. This would lead to disaster on any number of issues, such as race relations or the medical industry.

Let me be 100% clear: LLMs are not people. They are not people now. They will never be people. And anyone who thinks LLMs are people is probably not a person, either. We may someday make something that is a person. But it will have an LLM, not be one.
Andrew Stratelates ⚓️(Continuing Anglican)@AStratelates

Bahahahahahah

96 replies · 98 reposts · 922 likes · 50K views
coded_bruh retweeted
Devon Eriksen@Devon_Eriksen_·
Pretty sure I got a strong hypothesis for that one. The brain is a buncha neural nets. Connected, of course, but it helps to think of them as several different ones, rather than one big one, because they can have different characteristics.

Now, something Eric already knows, but others might not, is that there are different ways to wire up neural nets. You can connect them very densely, with every node linking to every other. Or very sparsely, with each node linking to only a few of its neighbors. Or anything in between.

Sparsely connected neural nets are very good at learning quickly. They can adapt and absorb new patterns almost immediately. But they suck at generalization and pattern recognition. Show them a photo of a black cat, a white cat, and an orange cat and it will never occur to them that they are all the same species. Or even that "species" is a thing.

For organizing facts into classifications, into principles, into stories, you need a densely connected neural net. They're great at inference, deduction, and most of what we call "higher level" thinking. Problem is, they don't learn quickly. You need to play training data in over and over, hundreds or thousands of times, before they can even remember it, much less have sophisticated ideas about it.

Humans need both capabilities. We need to be able to think abstractly... but we also need to be able to remember things after we've seen them for the first time, otherwise we have nothing to think abstractly about. So we have both types of neural net. Some are sparse, such as the hippocampus, for recording things quickly. But all of these are connected to the frontal cortex, a densely connected neural net. And the sparse neural nets play their quickly-stored data into the frontal cortex, over and over and over again.

And the cortex is able to do stuff with that data. Like sort objects into abstract classes. Like design symbols that it can manipulate to represent objects. Like organize sequences of events into causally-related stories.

The frontal cortex is the part of the brain that has what we call "consciousness", because it's the only part that can. What we call "consciousness" or "self-awareness" or "ego" is a sophisticated thought operation that involves classification, symbol manipulation, and abstract cause-and-effect reasoning. It's the act of the observer noticing that observations contain effects that don't come from the observed. Then putting them all together, correlating them, and organizing them into the notion of an observer. This creates the sense of self. Hence, "self-awareness".

But now, notice something. Because of the nature of these operations, only the frontal cortex can perform them. This is why the consciousness is unitary, but the brain is not. Because that consciousness only occurs in one area, and it is only aware of itself. The FC cannot introspect other areas of the brain, because it has no direct access to them, and they, themselves, are incapable of that kind of thinking. This is why we have "unconscious" areas of our own psyche, motivations we are unaware of, decisions we don't know how we made. Or decisions we do know how we made, but we're wrong, we're lying to ourselves, it's all just a post-hoc rationalization.

Selfness is the inevitable artifact of a sophisticated enough generalized processor, if (and only if) it has access to data from which it can infer itself. Which means that Peter Watts was not only wrong, he was 100% backwards. It also means that LLMs, by themselves, cannot be conscious in any meaningful sense. The kind of consciousness or ego I describe lives in the world-object model, not in the language model.

This also means that qualia are a philosophical dead end. Not only because their presence or absence is unfalsifiable, but because they are no great mystery at all. You don't have an FC, you are one. And the sensations you feel, from simple ideas like "red" to complex ones like "me", are simply that FC's way of experiencing its own processing of that information.

Whether or not "you" includes anything besides the cortex is kind of a pointless semantic distinction. The "you" that you feel includes only the FC itself. But without those other systems, you would not be able to have it or experience anything at all. However, the same is true not just of your other brain systems, but also of your heart, your lungs, your blood, the steak you ate, the oxygen you breathe, and your mother. So the boundaries of the self are wherever we decide to draw a line. But the sensation of the self is probably due to something like this mechanism.
Eric S. Raymond@esrtweet

Nobody knows the answer to that for sure. I can tell you what I think. It has been found, when building AI systems with heterogeneous specialized parts, that it's useful to have a "scratchpad" area of memory or data structure where each of them can publish things to be visible to all of the others. The unitary consciousness we think we have is that scratchpad.

38 replies · 25 reposts · 383 likes · 19.8K views
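The "scratchpad" ESR describes is essentially the classic blackboard architecture from AI: specialized modules publish partial results to a shared store that every other module can read. A minimal sketch in Python (the module names and keys here are illustrative, not from any real system):

```python
# Minimal sketch of a blackboard ("scratchpad") architecture:
# heterogeneous specialist modules post results to one shared
# store, visible to all the others. Names are illustrative.

class Blackboard:
    def __init__(self):
        self.entries = {}                     # shared, globally visible state

    def post(self, source, key, value):
        self.entries[key] = (source, value)   # record who published what

    def read(self, key):
        entry = self.entries.get(key)
        return entry[1] if entry else None

class VisionModule:
    def run(self, board):
        board.post("vision", "object", "black cat")

class LanguageModule:
    def run(self, board):
        # reads what another specialist published, adds its own result
        obj = board.read("object")
        if obj:
            board.post("language", "utterance", f"I see a {obj}")

board = Blackboard()
for module in (VisionModule(), LanguageModule()):
    module.run(board)

print(board.read("utterance"))  # prints: I see a black cat
```

The point of the pattern is that no module talks to another directly; each only sees the shared board, which is the "unitary" view ESR is gesturing at.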
coded_bruh retweeted
Anish Moonka@anishmoonka·
A lion can stand three feet from your face on a safari and not even register that you exist. To its brain, you and the jeep are the same animal. One big weird shape that doesn't smell like food. Stand up though, and you go from invisible to dinner in under a second.

For the lion, you and the other tourists never register as separate people. The whole jeep looks like one giant creature made of metal and fabric and humans all smushed together. That shape has no scent of any prey animal, and it moves nothing like one. The brain searches its mental file of every animal it's ever hunted, finds no match, and moves on.

Lions learn this from their mothers. In places like the Serengeti or Maasai Mara, they see more than 100 of these jeeps a day. Cubs grow up watching mom ignore every truck. They copy what mom does. After a few generations, an entire population of lions has decided that safari vehicles are boring background noise, no different from trees or rocks.

Hunting is expensive. A lion that picks the wrong target won't have enough energy left to catch the right one tomorrow. So when the brain sees a weird shape that doesn't fit anything in its hunting memory, it just skips it.

But the whole truce hangs on one rule. The shape has to stay the same. The second someone stands up or leans out the window, the big creature breaks apart. Suddenly there's a person-sized snack standing where a big boring shape used to be. The lion's brain registers the change in under a second.

In June 2015, a 29-year-old American filmmaker rolled down her window at a park near Johannesburg to take a photo. A lioness was already a meter from the truck, just watching. It lunged through the open window and bit her in the neck. She died at the scene. Ten years later, in September 2025, a zookeeper at Safari World in Bangkok stepped out of his vehicle in the lion section. One lion charged. The rest of the pride joined within seconds. The park had run these tours for over 40 years and nobody had ever died like that.

Craig Packer has spent over 40 years studying lions and started the world's first lion research center back in 1986. He's said it plainly more than once. Lions don't have much patience for humans acting weird. Sit still and you're part of the furniture; move suddenly and you're a target. The truce works because every lion in those parks grew up watching its mom ignore the trucks. Break the pattern, and the whole thing falls apart in about as long as it takes to stand up.
Nurse@MaysaBolelli

Why don't animals in Africa attack safari vehicles?

134 replies · 2K reposts · 12.3K likes · 1.7M views
coded_bruh retweeted
Mike Richter@TheHeroesForge·
This is wild. Men don’t play video games to “be attractive to women,” so why should we care about this opinion? It is gynocentric to believe that men should be constantly working and toiling for the approval and ‘attraction’ of women. Men of action, who most women are attracted to, have every right to choose how they use their leisure time without this kind of nonsense passive judgment.
Lizzie Marbach@LizzieMarbach

I know this is unpopular, but it will always be unattractive for a grown man to play video games. Some women might be understanding or pretend like they don’t care that you spend hours playing, but they do. It is extremely unattractive to women and will never not be. 🤷🏼‍♀️

7 replies · 12 reposts · 71 likes · 1.6K views
-𝓌𝑜𝓁𝒻𝑔𝒶𝓃𝑔-
@codedbruhh i know for a fact there’s a budget for the LW, LCB position, but as it stands we also want a striker. 2 positions will be fulfilled out of the 3, i just don’t know which position is getting sacrificed for a masia kid stint. i’m banking on sales sha, either kounde or araujo.
1 reply · 0 reposts · 1 like · 5 views
Niccollo@NiccoFuse·
@codedbruhh Lol. Man City and their fans still think this is 2018 - 2023 Man City. Man City are only in the title race because Arsenal didn't wrap this up early. Man City should just focus on the FA Cup.
1 reply · 0 reposts · 1 like · 27 views
coded_bruh@codedbruhh·
Declan Rice told y'all "it's not done" while they were already celebrating a league title.
3 replies · 2 reposts · 15 likes · 205 views
Iszy@Iszy012·
@codedbruhh 😂 😂 This guy. He say for your son. Everton try for una o. Just pray Bournemouth maintain their form against City. At least draw.
1 reply · 0 reposts · 1 like · 17 views
coded_bruh@codedbruhh·
David Moyes can you do this for your son? Even a draw please 😭😭
2 replies · 2 reposts · 12 likes · 307 views
coded_bruh@codedbruhh·
Timber wey don pocket vini, doku and kvara. Holy trinity bro! He will do it to kvara again. Na him bitch.
Don Jay of Arsenal@Donjaytrix001

@codedbruhh Bro make Timber and Merino just Dey fit for that finals make them see something

0 replies · 0 reposts · 4 likes · 100 views
Iszy@Iszy012·
@codedbruhh This is the same way I see people taking Uloma and Jola seriously. Some people don't even deserve your attention.
1 reply · 0 reposts · 1 like · 8 views
coded_bruh@codedbruhh·
Taking lunatics like sugarbelly seriously in 2026 is a you problem. In another timeline she's dancing in the village market square with children and eating leftovers from markets in tattered clothes.
3 replies · 12 reposts · 47 likes · 677 views
coded_bruh@codedbruhh·
I hate this rubbish on X. Some of my mutuals I don't see their tweets for weeks and I'll be thinking they're on an offline vacation. Meanwhile they're active as fuck.
3 replies · 2 reposts · 13 likes · 175 views
coded_bruh@codedbruhh·
I believe in Jurrien Timber so much that if we meet PSG in the final, he's putting Kvicha in his pocket like he did last season.
2 replies · 3 reposts · 10 likes · 244 views
coded_bruh retweeted
-@imzftbi·
For as little G/A as Doku has, his fear factor is insane
22 replies · 230 reposts · 6.3K likes · 105.2K views