phyllo wrote: Not really the same issue when you put one or more operation in there. Does 0.999... look like any sort of expansion of 1. The answer seems obviously no.

Having studied math, the answer is obviously yes! It's all a matter of training and mathematical inclination. If you haven't learned basic arithmetic, you think 2 + 2 and 4 look different. After you make it through grade school, you come to recognize without a moment's hesitation that 2 + 2 and 4 refer to the same thing. Likewise one comes to recognize .999... and 1 as two distinct expressions for the same thing, namely the Platonic or intuitive concept of the number 1.

So all you are saying here is that "My mathematical training includes 2 + 2 = 4 but not .999... = 1." That's all your remark amounts to.

phyllo wrote: I think that you are mistaken that most people are comfortable with it.

I'll concede that I'm unfamiliar with the studies about what the man on the street thinks about the Peano axioms. Most people don't think about this at all, and if asked, would think you're weird for asking them.

So let's leave this with you and me. I myself have a perfectly clear intuition in my mind of the endless sequence 0, 1, 2, 3, 4, ... of natural numbers; and even of the "completed" set of them which we call \(\mathbb N \). You do not. We could still be friends. Not everyone hears the music, as I say. Some like Picasso and some like Norman Rockwell. It's all good.

I could ask you, though, to explore the nature of your own ultrafinitism. Do you believe there's a largest number that has no successor? If the process 0, 1, 2, 3, ... ends, where does it end?

phyllo wrote: Some of the posts in this thread deal with the difficulties of "reaching the end" of an infinite sequence. It's a big problem as far as I can tell.

But no, THAT IS ONE OF THE CORE CONFUSIONS of many people, pardon my shouting. I'm glad you mentioned it though.

Let's just consider the limit of a sequence, which DOES give people a lot of conceptual trouble.

Consider the sequence of rational numbers 1/2, 1/4, 1/8, 1/16, etc. We say it has a limit of 0. This causes people who have not studied Real Analysis, which to be fair is a course only math majors take, to think that the sequence "reaches" 0 in some mysterious way.

But NO! The whole point of the formalism of limits is that we DON'T TALK ABOUT REACHING. We talk instead of getting "arbitrarily close." You give me a small positive real number, no matter how tiny, and I'll show you that the sequence gets closer to 0 than that. And we DEFINE that condition as being the limit of the sequence.
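For the record, here is that condition written out in the standard notation, for this particular sequence \(a_n = 1/2^n\):

\[ \lim_{n \to \infty} a_n = 0 \quad\text{means}\quad \forall \varepsilon > 0 \;\; \exists N \;\; \forall n > N : |a_n - 0| < \varepsilon. \]

Given any \(\varepsilon > 0\), choosing any \(N > \log_2(1/\varepsilon)\) does the job. Notice that nothing in the statement mentions the sequence arriving anywhere.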

The entire point of the limit formalism is that we never have to think or talk about "reaching an endpoint at infinity," which is a hopelessly muddled mess. Instead we FINESSE the whole problem by using the "arbitrarily close" idea. That is the brilliance of the modern approach to infinitesimals. We banish them! We don't have to talk about them.
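In fact the "you give me an ε, I give you an N" game can be played mechanically. Here is a minimal sketch in Python (the name `epsilon_challenge` is mine, purely for illustration):

```python
import math

def epsilon_challenge(epsilon):
    """Given any positive epsilon, return an index N such that
    1/2**n < epsilon for every n >= N. No term ever "reaches" 0;
    the sequence merely gets closer to 0 than any challenge."""
    # 1/2**N < epsilon  <=>  N > log2(1/epsilon)
    return math.floor(math.log2(1 / epsilon)) + 1

# However tiny the challenge, a suitable N exists:
for eps in (0.1, 1e-6, 1e-15):
    N = epsilon_challenge(eps)
    assert all(1 / 2**n < eps for n in range(N, N + 50))
```

The point of the sketch is that the function always terminates and always answers the challenge, yet no step of it involves "arriving at infinity."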

(I mention in passing that the hyperreals of nonstandard analysis don't help you, because .999... = 1 is a theorem of nonstandard analysis as well).

So there is no mysterious endpoint, there is no reaching. These are mind-confusing illusions left over from your imprecise intuitions of infinity. And these intuitions are clarified and made logically rigorous in math. That's a fact.

phyllo wrote: A child knows that you can "always add 1" to any counting number.

Try to be a little less condescending.

I apologize. I was not trying to be condescending. I am sincerely under the impression that most adults, if pressed, would agree that there is no end to the sequence 0, 1, 2, 3, 4, 5, 6, ... because you can always add one.

I believe this must be especially true in the computer age, when many people, from programmers to spreadsheet users, have internalized the concept of "always add 1" or "keep adding 1." We live in the age of algorithms, and "given n, output n + 1" is a perfectly intuitive concept to many people.

Everyone who ever started to learn how to program came to understand (often with great difficulty) the concept of looping, or endless repetition. If you can do something once you can do it forever. That is one of the main grokitudes of programming!
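To make the point concrete, here is the entire idea in a few lines of Python (a sketch; `successor` is just my name for it, nothing standard):

```python
def successor(n):
    # "given n, output n + 1" -- the step that never runs out
    return n + 1

# Apply the step as often as you like; nothing stops you.
# Python integers are arbitrary-precision, so not even overflow intervenes.
n = 0
for _ in range(1000):
    n = successor(n)
print(n)  # 1000: a thousand applications of "add one", with no end in sight
```

Nothing in the loop's definition depends on how many times it has already run, and that is precisely the child's "you can always add 1."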

If you genuinely don't agree, and genuinely reject the concept of always adding one, then I'm interested to learn more about what that means. And even if it's nothing more than a convenient fiction (which is exactly what it is!), so what? Suppose it's all bullshit, but Newton used it to calculate the motions of the solar system. Wouldn't you at least grant that the mathematical formalisms are useful and therefore worthy of study?