James Ramsden
@JamRam96
2.5K posts
Born in London, live in Liverpool, Newcastle United fan. Don’t ask how. He/Him
Liverpool, England · Joined August 2010
50 Following · 23 Followers
James Ramsden @JamRam96·
@YosarianTwo Ah, but of the two, only computation is intrinsically symbolic. Electromechanical computation *requires* a symbolic map between:
- the evolution of a concrete system (the computing machine) along a physics, and
- the evolution of an abstract system (an algorithm) along a logic
Yosarian2 @YosarianTwo·
@JamRam96 Descartes was a brilliant man, but we understand the brain better than he did. In any case, consciousness can and must rely on a physical system, and have both physical inputs and outputs, if it is to have any meaning at all.
Yosarian2 @YosarianTwo·
I really want "AI cannot be intelligent by definition" people to explain what they mean by "intelligence". Or better yet, to make concrete predictions about what they think LLMs won't be able to do because they lack "intelligence", and then notice when they do those things.

Quoting onion person @CantEverDie:
my biggest pet peeve around LLMs is when people (usually those invested in its success) call it “intelligent”. it definitionally, how it functions on a base level, is not intelligent. the way LLMs are built, it can never hit real intelligence. it’s just predictive
James Ramsden @JamRam96·
@YosarianTwo So a computational system cannot physically instantiate consciousness; it could only ever simulate doing so. This, for me, implies that a computer can't *understand* things (because understanding is a subset of consciousness), and so ipso facto cannot exhibit human-level intelligence
James Ramsden @JamRam96·
@YosarianTwo In other words, computation depends on some extrinsically constructed relation between a physical system and an abstract system. However, consciousness, as Descartes famously observes, depends on nothing extrinsic to itself whatsoever.
James Ramsden @JamRam96·
@YosarianTwo Just as a clock does not itself contain the idea of "3:00PM", neurons do not "fire" or "not fire"; they simply change continuously. The alphabetisation of those changes is a construct of *ours* that we impose on the behaviour of the neuron
James Ramsden @JamRam96·
@YosarianTwo Nothing about the behaviour of a neuron is intrinsically discrete or alphabetised; it does in fact behave continuously in physical space, according not to an algorithm (which is a symbolic, logical structure), but to a real-world physics
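[The continuous-vs-discrete point above can be made concrete with a toy simulation. This is a minimal leaky integrate-and-fire sketch with made-up parameters, not a model of any real neuron: the membrane potential evolves continuously under a differential equation, while "spike" / "no spike" is a discrete alphabet imposed from outside via a threshold.]

```python
def simulate(input_current=1.5, tau=10.0, threshold=1.0, dt=0.1, steps=500):
    """Euler-integrate dV/dt = (-V + I) / tau and record both the
    continuous voltage trace and the symbolic labels read off it."""
    v = 0.0
    trace, labels = [], []
    for _ in range(steps):
        v += dt * (-v + input_current) / tau  # continuous physics
        if v >= threshold:                    # our imposed alphabet
            labels.append("spike")
            v = 0.0                           # reset after "firing"
        else:
            labels.append("no spike")
        trace.append(v)
    return trace, labels

trace, labels = simulate()
# The trace takes many distinct real values; the labels take only two.
print(len(set(trace)), set(labels))
```

[The trace is a continuum of voltages; the two-symbol labeling exists only in the thresholding rule we chose, which is the construct the tweet describes.]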
James Ramsden @JamRam96·
@YosarianTwo It's likely that neurons are much more complex than logic gates; most physical systems are.
James Ramsden @JamRam96·
@YosarianTwo The neuron contains no symbols; it's just a physical system, behaving according to a continuous physics. You have a cognitive need to narrow its behaviours down to "On" or "Off" in order to help you understand it, but this is a result of your cognition, not of neuronal behaviour
James Ramsden @JamRam96·
@YosarianTwo That's not relevant. What is clear is that consciousness itself is a physical process, like photosynthesis. As such, a computer simulation of whatever the brain is doing can never be conscious, just as a simulation of photosynthesis cannot produce oxygen molecules.
Yosarian2 @YosarianTwo·
@JamRam96 Sure, but the brain is fundamentally running simulations of the world whenever you think. If you want to decide what to eat for dinner, you run a little simulation in your brain of what eating the pasta dish might taste like and how it will feel to eat it.
James Ramsden @JamRam96·
@YosarianTwo And at any rate, I'm not sure that "using your brain to simulate thinking" is possible or even meaningful. However, one can readily appreciate the ontological distinction between "X system" and "computer simulation of X system"; the former is physically instantiated, the latter is not.
James Ramsden @JamRam96·
@YosarianTwo I never made such a claim, because I don't make the assumption you make: that conscious thoughts are causally downstream of some form of computation. The brain cannot just be doing computations, because computation is an intrinsically symbolic (not just physical) process.
James Ramsden @JamRam96·
@ThePremiseOfIt @YosarianTwo Semantic? How? I'm arguing for a particular ontology. Nothing I've argued rests on the definition or usage of words; it rests on drawing an ontological distinction between "X" and "a computer simulation of X". Is there something about that distinction you don't understand?
Sean Cantrell @ThePremiseOfIt·
@JamRam96 @YosarianTwo Oof, that is such a semantic argument. They're literally growing human brain cells in dishes and teaching them how to simulate a neural network. Does switching the medium impact how you feel?
James Ramsden @JamRam96·
@ThePremiseOfIt @YosarianTwo That's not true though. You don't really believe that a mathematical formula describing gravity exerts weight, that a simulation of the weather will one day, given enough complexity, become the weather, or that a GPU simulating photosynthesis will one day produce a glucose molecule
Sean Cantrell @ThePremiseOfIt·
@YosarianTwo Literally the only arguments they have amount to something akin to "you can't model X with math", and it's completely motivated by needing to feel that life, and specifically being human, has something almost mystical or spiritually special about it. Literally everything is math.
James Ramsden @JamRam96·
@ExileSeal @YosarianTwo A mathematical formula describing motion under gravity does not itself exert weight. Even an ever-expanding, real-time simulation of the weather cannot ever become the weather. The map is not the territory!
Zach @ExileSeal·
@YosarianTwo That's not what they are saying. They are saying *LLMs* cannot be intelligent by definition. An LLM is an algebraic model that predicts the next token in a sequence. It cannot be self-aware, or have thoughts, or even have a persistent memory. LLM != AI.
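["Predicts the next token in a sequence" can be illustrated with a toy bigram count model. Real LLMs are large neural networks, not lookup tables of counts; this is only a sketch of the predictive framing itself, with a made-up training corpus.]

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count, for each token, which tokens tend to follow it."""
    table = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        table[cur][nxt] += 1
    return table

def predict_next(table, token):
    """Return the most frequent successor of `token` seen in training."""
    if token not in table:
        return None
    return table[token].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # -> "cat" ("cat" follows "the" twice, "mat" once)
```

[The model never represents what a "cat" is; it only tracks which symbol tends to follow which, which is the sense in which the tweet calls an LLM "just predictive".]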
James Ramsden @JamRam96·
@YosarianTwo @PlsEndMe666 That the discussion on LLMs and their supposed intelligence ought to be centred on what they can/can't do and not on what they *are*
James Ramsden @JamRam96·
@YosarianTwo @CantEverDie If you can't appreciate the ontological difference between "thoughts" and "simulations of thoughts" (or, more fundamentally, the difference between "physical system X" and "simulation of physical system X"), then I'm not surprised that one half of this discussion is not making sense to you
Yosarian2 @YosarianTwo·
@CantEverDie "Consciousness" is a whole other thorny debate, but I don't see any reason LLMs can't reason in ways that are similar to human intelligence. They can already do a lot of that; you can see the chain of reasoning with some models, and how they logically think and reach conclusions
James Ramsden @JamRam96·
@YosarianTwo Your insistence on making predictions is not strictly scientific, because science also imposes a particular ontology (physicalism), and can therefore only accept reasoning within that ontology. The discussion is not about what LLMs can/can't do; it is about what they *are*
James Ramsden @JamRam96·
@YosarianTwo Why not, you may ask? For the same reason that a GPU simulating photosynthesis will never produce a single glucose molecule: the system lacks the necessary physical structure. Computation is symbolic (i.e. non-physical), and symbols themselves are caused by consciousness.