@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I talk now to the authors of the textbooks I learned from; they disagree with one another, and sometimes even with their own past selves. They're mostly in the field of physics. Why can't you entertain the possibility that the textbooks in math aren't the final word?
@eleusinianatlas@TribbleTj@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it Also I'd like to note here that floating-point numbers cannot store 0.1 exactly. When a float claims to be 0.1, it's rounding away the imprecision.
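A quick illustration of the claim above, using Python's standard `decimal` module (the behavior is the same in any language with IEEE-754 doubles):

```python
from decimal import Decimal

# The closest IEEE-754 double to 0.1 is not 0.1; Decimal(0.1)
# shows the exact value the float actually stores:
print(Decimal(0.1))
# -> 0.1000000000000000055511151231257827021181583404541015625

# repr() rounds to the shortest string that round-trips, which is
# why the float still *claims* to be 0.1 when printed:
print(0.1)                 # -> 0.1

# The hidden error surfaces when the rounding errors fail to cancel:
print(0.1 + 0.2 == 0.3)   # -> False
```

So the float both is and isn't "0.1": exactly representable it is not, but its printed form hides that.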
This does not mean 0.1 is not a real number. I'm coming to realize you don't actually know anything about anything. Oh well.
@eleusinianatlas@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I don’t mean you have a flawed understanding of mathematics or infinities. You actually seem to understand that stuff pretty well. I mean you literally got the textbook definition of the … notation wrong. There’s no room for interpretation there.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it How do you know that there are more "...33333..." in 0.33333...? Because you know the operation they're produced by: 1/3. It's not a decimal representation; without knowing 1/3, you wouldn't know there isn't some "1445457" down the line. You just brand an algorithm as a number.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I say it the other way around: stating that 0.999... = 1 is exactly the applied approach, the engineering one. Cutting a stick in half, then cutting the half in half, and so on, will leave you with no stick in your hands, but mathematically 1/2, 1/4, 1/8... never ends. Same with 0.999... .
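A minimal sketch of the halving series from the post above, using exact rationals so no floating-point rounding intrudes:

```python
from fractions import Fraction

# Partial sums of 1/2 + 1/4 + 1/8 + ...: every partial sum is
# strictly below 1, and the remaining gap halves at each step.
total = Fraction(0)
piece = Fraction(1, 2)
for n in range(1, 11):
    total += piece
    piece /= 2
    print(n, total, 1 - total)

# After 10 terms: total = 1023/1024, remaining gap = 1/1024 --
# positive at every finite step, which is exactly what both sides
# of this argument agree on; they disagree about the limit.
```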
@eleusinianatlas@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I think the issue here is just that your fields of interest are too removed from the use cases of this notation. A scientist or applied mathematician probably wouldn’t need to express fractions as periodic decimals. A mathematician would, and so we do.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it That's all we can say: "the algorithm doesn't stop, for sure". We can prove it. But there's no such number as infinity or "..." that can be used as the INPUT of a computer, or of our computation by hand. "Never stops repeating" is a Turing machine's behavior we predict, not a number.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I wouldn't. If you remember the division algorithm you did at school, on a piece of paper with your head and a pen, or coded it directly in machine code, you know that the computation of 1/3 will never stop in either decimal or binary.
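The schoolbook algorithm can be sketched as code (`long_division` is a hypothetical helper name, not a standard function). The repeating remainder is what proves the digits never stop:

```python
def long_division(num, den, digits=10):
    """Schoolbook long division of num/den (0 < num < den), emitting
    one decimal digit per step. Returns the digits produced and the
    final remainder; a remainder of 0 means the algorithm halted."""
    out = []
    rem = num
    for _ in range(digits):
        rem *= 10
        out.append(str(rem // den))
        rem %= den
        if rem == 0:          # terminating decimal: the division stops
            break
    return "".join(out), rem

# 1/3: the remainder is 1 after every step, so the machine's state
# repeats and the digit 3 recurs forever -- it never halts.
print(long_division(1, 3))   # -> ('3333333333', 1)

# 1/4 by contrast hits remainder 0 and stops:
print(long_division(1, 4))   # -> ('25', 0)
```

A repeated remainder forces repeated digits, which is the finite certificate that the output is periodic without running forever.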
@Opaopa13@TribbleTj@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I have no definition; I just reject the definition you like. I think that definition is a manifestation of human nature, of OCD in this case. There are a lot more of these manifestations in physics, but humans are still humans in all fields.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it The thing is that someone wanted to do something with math, make a personal contribution, and defined things the way he liked. It's still about Zeno's paradoxes, and dozens of people have claimed to solve them. The problem is that the solutions differ from author to author.
@eleusinianatlas@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it It’s nothing like 1/0 = inf. That’s a matter of the meaning of dividing by zero; the limit thing is just notation. The reason we would use the … notation is that high schoolers aren’t taught limits until at least precalc, but elementary students can convert 1/3 into a decimal.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it I don't agree with it. It's the same as saying that 1/0 = infinity. It's kinda common sense, but rigorously speaking, that operation doesn't exist. You can define it this way; I just don't see a reason to do that, other than OCD, the urge to make everything tidy.
@TribbleTj@Opaopa13@LurkasaurusRex@Unlearned_Hand@Draren_Thiralas@NLRG_it A better example: a classical field in flat spacetime, electrical or gravitational: F = GmM/r². It gets weaker and weaker as the distance r grows, but what's the point of defining its force as 0 at infinite range? Its decrease never actually ends, so why do that?
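The inverse-square falloff in that formula is easy to tabulate; the sketch below uses an Earth-like M and a 1 kg test mass purely as illustrative values:

```python
G = 6.674e-11          # gravitational constant, N*m^2/kg^2
m, M = 1.0, 5.972e24   # 1 kg test mass, Earth-like mass (illustrative)

def force(r):
    """F = G*m*M / r^2, Newtonian gravity in flat spacetime."""
    return G * m * M / r**2

# The force keeps shrinking as r grows but is positive at every
# finite distance -- F = 0 only as a limit, never as a value:
for r in (1e6, 1e9, 1e12, 1e15):
    print(r, force(r))
```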
@Opaopa13@LurkasaurusRex@TribbleTj@Unlearned_Hand@Draren_Thiralas@NLRG_it I just reject the idea of converging being the same as being equal. To converge means to make the difference between the converging entity and the limit it converges to smaller than any given number. To say otherwise is to say that an asymptote crosses the line it clings to.
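The "smaller than any given number" definition paraphrased above can be checked directly; `gap_after` is a hypothetical helper name, and the arithmetic uses exact rationals:

```python
from fractions import Fraction

def gap_after(n):
    """Difference between 1 and 0.999...9 with n nines."""
    nines = Fraction(10**n - 1, 10**n)   # e.g. n=3 -> 999/1000
    return 1 - nines

# The gap after n nines is exactly 1/10^n: for ANY epsilon > 0
# there is an n beyond which the gap is smaller than epsilon,
# which is precisely the convergence condition being debated.
for n in (1, 5, 10):
    print(n, gap_after(n))   # -> 1/10, 1/100000, 1/10000000000
```

Whether "gets within every epsilon" should be *identified* with equality is the philosophical point in dispute; the computation itself is uncontroversial.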