## Is 1 = 0.999... ? Really?

For discussing anything related to physics, biology, chemistry, mathematics, and their practical applications.

Moderator: Flannel Jesus

## Is it true that 1 = 0.999...? And Exactly Why or Why Not?

- Yes, 1 = 0.999...: 12 votes (40%)
- No, 1 ≠ 0.999...: 15 votes (50%)
- Other: 3 votes (10%)

### Re: Is 1 = 0.999... ? Really?

wtf wrote:Thanks, but I'm afraid I'm getting off the roller coaster. Magnus wrote, "(Albeit, as wtf noted, the latter expression has no recognized meaning within the official language of mathematics, which means nothing with regard to this topic.)", my emphasis. If math isn't what the thread's about, I'm at a loss to contribute.

The question is:
Does $$0.999\dotso = 1$$ hold true within the standard language of mathematics?

The question is not:
Does $$0.999\dotso = 1$$ hold true within some guy's personal language of mathematics?

I have my own language, so you will often see me using standard symbols in a non-standard way (e.g. using $$\infty$$ to mean what JSS means by "infA".) With this in mind, one must approach my posts with care, lest they misunderstand me.

I did write extensively in this thread several years ago explaining why .999... = 1, but clearly to no avail. Long answer short, though, it's because .999... = 1 is a theorem that can be proved (by a computer if one likes) in standard set theory.

There's no other reason. Once you define the symbols as they are defined in standard math, the conclusion follows. If you define them differently, you can say that .999... = 47 or anything else you like. There is no moral or absolute truth to the matter, it's strictly an exercise in defining the notation and then showing that the theorem follows. Just like 1 + 1 = 2. If you give the symbols different meanings, you get a different truth value. With the standard meanings to the symbols, the statement is true.

That's correct.

The point of contention is the meaning of the symbol $$0.\dot9$$.

If $$0.\dot9$$ represents the limit of the infinite sum that is $$0.9 + 0.09 + 0.009 + \cdots$$ then it is true that $$0.\dot9 = 1$$.

However, if $$0.\dot9$$ represents the infinite sum that is $$0.9 + 0.09 + 0.009 + \cdots$$ then it is NOT true that $$0.\dot9 = 1$$.

The limit of the sum is not the sum itself. They are two different things.

The argument put forward is that the mathematical establishment defines $$0.\dot9$$ as an infinite sum and NOT as the limit of an infinite sum.

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

phyllo wrote:Suddenly there are two symbols for one number. And one symbol has the concept of infinity within it. That's unsettling.

$$0$$ and $$0.000\dotso$$ are two different symbols representing one and the same number with the second symbol having the concept of infinity within it.

I have no problem with this. Neither do other people who deny that $$0.\dot9 = 1$$.

wtf wrote:I don't see why this is a problem. Are you equally concerned that 4, 2 + 2, 2 + 1 + 1, and 1 + 1 + 1 + 1 are different symbols for the same number?

How many other different ways can you think of to symbolize the number 4?

You understand that the number 4, the "actual" number 4, is an abstract idea that is "pointed to" by the representations. The representations are not the number.

It's not a problem. It's merely phyllo's best attempt to make sense of why other people disagree with him without being particularly successful at it.

By the way, $$4.000\dotso$$ is yet another way of representing $$4$$ and it has the concept of infinity within it (:

There are infinite sums that evaluate to finite numbers. The argument put forward by JSS and people on his side is that $$0.9 + 0.09 + 0.009 + \cdots$$ is not one of them.

wtf wrote:You seem uncomfortable with the endless sequence of counting numbers 1, 2, 3, 4, 5, ..., that most people seem to be perfectly comfortable with. All infinitary reasoning in mathematics is based on this fundamental intuition of the counting numbers.

Is there something about the idea of the endless sequence of counting numbers that bothers you? If so, you would be equally troubled by decimal notation. I grant you that. But why are you bothered by it in the first place? A child knows that you can "always add 1" to any counting number.

Additionally, we live in the age of computers. Even many endless decimals, like pi, are completely characterized by finite-length descriptions as computer programs. Pi only encodes a finite amount of information. .999... only encodes a finite amount of information: "(1) Print dot; (2) Print '9'; (3) GoTo 2". That's a finite description of the symbol .999...
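That three-step program can be sketched as a short Python generator (an illustrative sketch; the function name is mine). The point it demonstrates: the code is finite, yet it fully characterizes the infinite string, because any finite prefix can be produced on demand.

```python
from itertools import islice

def nines():
    """Finite description of the infinite symbol .999...:
    emit the decimal point, then '9' forever."""
    yield "."
    while True:
        yield "9"

# The program is finite, yet any desired prefix of .999... is available:
prefix = "".join(islice(nines(), 6))  # "." followed by five 9s
```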

No no no, it's not phyllo who's uncomfortable. Phyllo is on your side. It's us. Those who deny the popular belief. We are the ones who are uncomfortable and only allegedly so.

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

wtf wrote:I don't see why this is a problem. Are you equally concerned that 4, 2 + 2, 2 + 1 + 1, and 1 + 1 + 1 + 1 are different symbols for the same number?

Not really the same issue when you put one or more operations in there. Does 0.999... look like any sort of expansion of 1? The answer seems obviously no.

wtf wrote:You seem uncomfortable with the endless sequence of counting numbers 1, 2, 3, 4, 5, ..., that most people seem to be perfectly comfortable with.

I think that you are mistaken that most people are comfortable with it. Some of the posts in this thread deal with the difficulties of "reaching the end" of an infinite sequence. It's a big problem as far as I can tell.

wtf wrote:A child knows that you can "always add 1" to any counting number.

Try to be a little less condescending.
Last edited by phyllo on Wed May 20, 2020 2:11 am, edited 1 time in total.
phyllo
ILP Legend

Posts: 11857
Joined: Thu Dec 16, 2010 1:41 am

### Re: Is 1 = 0.999... ? Really?

wtf wrote:But this point has been made repeatedly by myself and others in this thread to no avail. I confess to not understanding the objections. It's like asking if the knight in chess "really" moves that way. The question is a category error. Chess is a formal game and within the rules of the game, the knight moves that way. There is no real-world referent. Likewise with a notation such as .999... By the rules of the game, it evaluates to 1. It's a geometric series. If you make some other rules, you can get a different result. Efforts to imbue this with some kind of philosophical objections are likewise category errors. Rules of formal games aren't right or wrong. They're only interesting and useful, or not. The rules of math turn out to be interesting and useful so we teach them.

There is no real-world referent. I agree. That's not what's being disputed. What's being disputed is that, within the rules of the game, $$0.\dot9$$ evaluates to $$1$$.

The notation .999... is a shorthand for the infinite series 9/10 + 9/100 + 9/1000 + ... which sums to 1 as shown in freshman calculus. There truly isn't any more to it than that, though if desired one could drill this directly down to the axioms of set theory.

The infinite sum that is $$0.9 + 0.09 + 0.009 + \cdots$$ does not evaluate to $$1$$. It evaluates to a number that is less than $$1$$. This is evident in the fact that the sum NEVER attains $$1$$. In order to say that some number $$x$$ is the result of some infinite sum, there must be a point at which that number $$x$$ is attained and from there on indefinitely maintained. That's not the case here.

Rather than evaluating to $$1$$, the LIMIT of that infinite sum is $$1$$.

In short, the RESULT of the sum is less than $$1$$ whereas the LIMIT of the sum is $$1$$.

When you go to the Wikipedia article you will see that most of the proofs are trying to prove that the RESULT of the sum (and not the limit of the sum) is equal to $$1$$.
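For readers who want to compute along, the partial sums both sides are discussing can be checked with exact rational arithmetic (a neutral sketch; the function name is mine). Every finite partial sum falls short of 1 by exactly 10^-n, and that gap shrinks below any positive bound, which is precisely what the limit statement asserts:

```python
from fractions import Fraction

def partial_sum(n):
    """Exact n-th partial sum of 9/10 + 9/100 + 9/1000 + ...
    Equals 1 - 10**-n for every finite n."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

assert partial_sum(3) == Fraction(999, 1000)       # 0.999
assert 1 - partial_sum(50) == Fraction(1, 10**50)  # the finite gap
```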

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

Magnus Anderson wrote:The infinite sum that is 0.9+0.09+0.009+⋯ does not evaluate to 1. It evaluates to a number that is less than 1. This is evident in the fact that the sum NEVER attains 1.
See.

What did I say?
phyllo
ILP Legend

Posts: 11857
Joined: Thu Dec 16, 2010 1:41 am

### Re: Is 1 = 0.999... ? Really?

What did you say?

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

More than you know.
phyllo
ILP Legend

Posts: 11857
Joined: Thu Dec 16, 2010 1:41 am

### Re: Is 1 = 0.999... ? Really?

wtf wrote:This is what brings me back to this thread. I want to understand the objection that is being made.

So you've exhausted rational proof, and you still encounter opposition.

What then do we conclude? You've correctly identified that the cause is mathematical ignorance, likely more within the context of flawed education standards than individual fault. The expected consequence of this cause is irrational opposition - as is logically consistent with the fact that your opposition is against exhaustive rational proof.

This is why I've identified that the only valid way to approach this thread is psychological.
This is much to the frustration of the irrational opposition, but the alternative is tedious repetition. You keep repeating all the rational ways to prove your position from a point of knowledge and experience, and they keep repeating all the irrational ways that they think prove their position from a point of ignorance.

It's an interesting question to ask - why do people dig their heels in, even when they're wrestling with a topic that they know they're weak at, and dealing with people who genuinely do know what they're talking about?
Obviously the low-hanging fruit for them is to get on their high horse and protest that this is a rational debate and it should only deal with the topic directly - even though I've just rationally shown that this is not the case, and that dealing only with the topic directly is the whole problem.

From what I can tell so far, there's not much more going on than the good old Dunning-Kruger effect, along with plenty of cognitive biases and logical fallacies - notably "confirmation bias" and "moving the goalposts". There's a lot of forgetting rational arguments that already countered irrational positions, denying that they ever existed, or insisting that they're irrelevant - anything to avoid the cognitive dissonance of honest introspection.
It's a psychologically difficult process to admit you're wrong, or not in the position you thought you were in and wanted to see yourself as being in. People want to hear and remember things that support their position and make them feel validated, special and competent - especially if they don't feel that way overall in their normal life.

It becomes a socially detrimental force when people begin to construct their own identity and a sense of purpose around topics in which they lack sufficient expertise, emotionally investing in them and feeling personally attacked when people challenge their cause.

This is especially so when there are others around in the same fragile state of mind, looking for someone with a sense of confidence who is defying a way of thinking that they're weak at - this vicariously soothes their insecurities and only bolsters yet another movement against rationality, leaving experts such as yourself in a state of confusion about how you're supposed to deal with what's going on. It's truly toxic.

So of course they'll attack your mathematical competence as a weakness rather than admit their mathematical incompetence is their weakness.
We see it in politics all the time - e.g. "if someone sides with something you don't agree with, they've been brainwashed by them." It's taken as "the" (necessary) conclusion, when it's just "a" possible (sufficient) conclusion that might hold or might not: abductive reasoning. In some cases it's actually going to be because one person understands something that the other doesn't - yet it's so much easier and lazier to simply trust your own prejudices and assume the other person is the ignorant one. Confirmation bias so often clouds all the evidence against this, and over-emphasises all the memorable moments of victory when it's at least seemed like you were right.

It would appear as though all this "extra baggage" that you're talking about is simply, or at least largely a result of the above. Mathematically challenged people want to believe that it's maths that is at fault rather than them.

To this end they see as far as they want to see and no further.
It's easy to note "0.9 < 1", "0.99 < 1", and to assume that since this pattern holds at every finite step, it must continue indefinitely under all possible circumstances. One simply neglects the key property of infinity that changes this finite pattern, and voila - it can seem as though you were right all along.

Yet the obvious truth that you pointed out is essentially undeniably tautological - math and only math defines all the notations involved in the topic, and in like manner math alone determines how they equal each other.
That's as far as rationality "need" go, even though all the proofs you understand and can expertly articulate can extend this rationality far further - there's only one answer to objections to this: they're necessarily irrational. And the reason why this irrationality exists is as I've just explained.
The only issue left to resolve in this thread is how to resolve irrationality. Demonstrably the answer to this is not simply "rationality" - as the irrational response to rationality is the whole problem to begin with.

Silhouette
Philosopher

Posts: 4136
Joined: Tue May 20, 2003 1:27 am
Location: Existence

### Re: Is 1 = 0.999... ? Really?

Ecmandu wrote:My disproof for convergent series!

If you take any rational, irrational or imaginary number and divide it by half, let’s say, the number 1!

0.5 + 0.5 = 1

Then you divide that in half!

0.25 + 0.25 + 0.25 + 0.25 = 1

When you divide that in half!

You get:

0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 = 1

Etc...

When you push this series to the convergent limit, eventually 0=1

In order for series to converge (and I reverse engineered the problem to prove this)... 1 MUST equal zero!!

Your above argument could maybe be notated as:

$$\sum_{i=1}^n\frac1n=1$$, or just $$\frac1n\times n=1$$
$$\therefore\lim_{n\to\infty}(\frac1n\times{n})=1$$,
$$\lim_{n\to\infty}(\frac1n)=0$$
$$0\times{n}=0$$
$$\therefore\lim_{n\to\infty}(\frac1n\times{n})=0$$

The issue is you're concluding that "1=0", instead of concluding that the only way in which you can arrive there is invalid.

What even is the infinite sum of 1 over infinity? Anything times infinity is infinite, anything times the limit of zero is zero, anything divided by itself is 1 - that's why infinity is undefined and not a number. You can get anything you want from abusing it - that's not the same as "anything you want does actually equal anything you want". Math is consistent, it's built from consistency to remain consistent, which means that when you get to inconsistencies from e.g. someone using infinity as a number, there was a problem with them using infinity as a number. In the same way, there's a problem in the reasoning that gets you to 1=0.
You even get the mathematical constant "$$e$$" from the similar sum $$\lim_{n\to\infty}(1+\frac1n)^n$$, when it might look as though the limit goes to $$(1+0)^\infty$$, which might look like 1 (actually an "indeterminate form").
All it means when you get inconsistent nonsense like "1=0" (which circularly violates the fundamentals of math that allowed you to arrive at the inconsistency in the first place) is that you need to do more work.

You don't just stop at a seeming contradiction, and stick with that as your singular definite conclusion.
You have to find valid ways to get a single answer only. This is how the sum I just mentioned can be known to equal exactly "$$e$$" and nothing else, without contradiction/inconsistency.
This is why it helps to be a mathematician, because you have the experience of being familiar with all sorts of methods to enable you to arrive at a correct answer, when non-mathematicians might be tempted to just settle for the simplest answer that they first arrive at, and conclude that "there's therefore something inconsistent in math", to justify that they never needed to gain expertise in math in the first place.
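The invalid step described above, taking the limit of one factor while holding the other fixed, can be checked with exact arithmetic (a sketch; all names are mine):

```python
from fractions import Fraction

# For every finite n, n copies of 1/n sum to exactly 1:
assert all(Fraction(1, n) * n == 1 for n in range(1, 500))

# The termwise limit of 1/n is 0, and 0 times any fixed n is 0.
# Substituting that termwise limit back into the product ("0 times infinity")
# is the invalid exchange of limits that manufactures "1 = 0".
assert Fraction(0) * 500 == 0
```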
Last edited by Silhouette on Wed May 20, 2020 4:54 pm, edited 1 time in total.

Silhouette
Philosopher

Posts: 4136
Joined: Tue May 20, 2003 1:27 am
Location: Existence

### Re: Is 1 = 0.999... ? Really?

Hi wtf;

It was so much fun reading a well-informed post I hesitate to comment further.

Personally, I have been absolutely committed to a rigorous study of the foundations of mathematics. You might get the gist by looking at my post at:

viewtopic.php?f=4&t=183931&p=2422704&hilit=Mathematica+Principia#p2422704

You can skip personal reflections by scrolling down to The Foundations

For example, are you personally committed to the Real Analysis definition of 1 as an equivalence class of Cauchy sequences? (If so, you still have an ontological error.) Perhaps you might consider {Φ, {Φ}} as the number 1, where Φ is taken to be the null set? This is taken from ZFC. On the other hand, Gödel thinks that Plato's Ideal for 1 is correct - so much for Cauchy sequences.

Since Real Analysis depends on Cauchy sequences, which map the counting numbers to the rational numbers, the counting numbers and rational numbers are defined prior to the real numbers. Do you have any ontological problems with this? I.e., the rational numbers ½, 1, 2, 3, … are not the same as the real numbers ½, 1, 2, 3, … .

Really, I immensely enjoyed your post.

Ed
"Albert! Stop telling God what to do." - Niels Bohr
Ed3
Thinker

Posts: 886
Joined: Sun Oct 31, 2004 2:56 pm
Location: Minneapolis, MN

### Re: Is 1 = 0.999... ? Really?

Ed3 wrote:Really, I immensely enjoyed your post.

Thank you Ed. I am so glad you read it in the right spirit, I didn't mean for it to come out as critical as it did. IMO it's misplaced energy on your part to strenuously claim that functions aren't numbers, since it's not really a very important point in the first place; and it's at least arguable and not as clear cut as you believe. IMO of course.

You ask if I'm ontologically committed to the formal set-theoretic definitions, and of course I'm not. I do take your point that functions and numbers are distinct entities in the Platonic world, even if we can define numbers as functions and probably functions as numbers in various formal symbolic ways. If that's what you're saying, we're in agreement.

So even if I fall back on saying that the real number 1 literally is the sequence .9, .99, etc. or an equivalence class containing it, I would NOT ever be confused by thinking that really "is" the number 1. The very fact that the natural number 1, the integer 1, the rational number 1, and the real number 1 are distinct set-theoretic entities is evidence that NONE of them could be the real 1, a point first made by Benacerraf.

The number 1 is some kind of thing out there in Platonic land along with Captain Ahab and the Baby Jesus. That's the problem with Platonism. If there's a non-physical realm of existence, exactly

* What is it?

* Where is it?

* What else might live there? How do we know what lives there and what doesn't?

Point being that Platonism is easily refuted. So is anti-Platonism. Ontology is hard.

Did I manage to catch the essence of your viewpoint?

tl;dr: If I acknowledge that formalisms don't imply ontology, then a function and a number definitely are two different things. I concede the antecedent and the logical conclusion; but I don't understand your insistence on the point, when (IMO) it is not an especially relevant point in this thread. I think that's what I was going on about.
wtf

Posts: 357
Joined: Sun Dec 06, 2015 5:47 am

### Re: Is 1 = 0.999... ? Really?

wtf wrote:it's misplaced energy on your part to strenuously claim that functions aren't numbers

The input of a function can be a number.

The output of a function can be a number.

Functions can have properties that can be expressed using numbers e.g. the limit of a function is a number.

However, functions themselves aren't numbers. A function is no more than a set of input-output pairs where every input is paired with exactly one output. A set of pairs of numbers is not a number (even though you can use a set of pairs of numbers to represent a number.)

So Ed is right when he says that functions aren't numbers but he's also wrong because he says that $$0.\dot9$$ is a function (which it is not.)

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

phyllo wrote:Not really the same issue when you put one or more operation in there. Does 0.999... look like any sort of expansion of 1. The answer seems obviously no.

To anyone who has studied math, the answer is obviously yes! It's all a matter of training and mathematical inclination. If you haven't learned basic arithmetic, you think 2 + 2 and 4 look different. After you make it through grade school, you come to recognize without a moment's hesitation that 2 + 2 and 4 refer to the same thing. Likewise one comes to recognize .999... and 1 as two distinct expressions for the same thing, namely the Platonic or intuitive concept of the number 1.

So all you are saying here is that "My mathematical training includes 2 + 2 = 4 but not .999... = 1." That's all your remark amounts to.

phyllo wrote:I think that you are mistaken that most people are comfortable with it.

So let's leave this with you and me. I myself have a perfectly clear intuition in my mind of the endless sequence 0, 1, 2, 3, 4, ... of natural numbers; and even of the "completed" set of them which we call $$\mathbb N$$. You do not. We could still be friends. Not everyone hears the music, as I say. Some like Picasso and some like Norman Rockwell. It's all good.

I could ask you, though, to explore the nature of your own ultrafinitism. Do you believe there's a largest number that has no successor? If the process 0, 1, 2, 3, ... ends, where does it end?

phyllo wrote: Some of the posts in this thread deal with the difficulties of "reaching the end" of an infinite sequence. It's a big problem as far as I can tell.

But no, THAT IS ONE OF THE CORE CONFUSIONS of many people, pardon my shouting. I'm glad you mentioned it though.

Let's just consider the limit of a sequence, which DOES give people a lot of conceptual trouble.

Consider the sequence of rational numbers 1/2, 1/4, 1/8, 1/16, etc. We say it has a limit of 0. This causes people who have not studied Real Analysis, which to be fair is a course only math majors take, to think that the sequence "reaches" 0 in some mysterious way.

But NO! The whole point of the formalism of limits is that we DON'T TALK ABOUT REACHING. We talk instead of getting "arbitrarily close." You give me a small positive real number, no matter how tiny, and I'll show you that the sequence gets closer to 0 than that. And we DEFINE that condition as being the limit of the sequence.

The entire point of the limit formalism is that we never have to think or talk about "reaching an endpoint at infinity," which is a hopelessly muddled mess. Instead we FINESSE the whole problem by using the "arbitrary closeness" idea. That is the brilliance of the modern approach to infinitesimals. We banish them! We don't have to talk about them.

(I mention in passing that the hyperreals of nonstandard analysis don't help you, because .999... = 1 is a theorem of nonstandard analysis as well).

So there is no mysterious endpoint, there is no reaching. These are mind-confusing illusions left over from your imprecise intuitions of infinity. And these intuitions are clarified and made logically rigorous in math. That's a fact.
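The "arbitrarily close" definition is constructive: given any positive tolerance, one can exhibit an index past which the sequence 1/2, 1/4, 1/8, ... is closer to 0 than that tolerance. A minimal sketch (the function name is mine):

```python
def witness(eps):
    """Return an index N with 1/2**N < eps, witnessing that the
    sequence 1/2, 1/4, 1/8, ... gets within eps of its limit 0.
    No term ever 'reaches' 0; closeness is all the definition asks for."""
    n = 1
    while 2.0**-n >= eps:
        n += 1
    return n

# However small the challenge, a witness index exists:
assert 2.0**-witness(1e-12) < 1e-12
```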

phyllo wrote:
A child knows that you can "always add 1" to any counting number.
Try to be a little less condescending.

I apologize. I was not trying to be condescending. I am actually under the sincere impression that most adults, if pressed, will agree that there is no end to the sequence 0, 1, 2, 3, 4, 5, 6, ... because you can always add one. I am under the impression that most people do feel that way, if you asked them.

I believe this must be especially true in the computer age, when many people from programmers to spreadsheet users have internalized the concept of "always add 1" or "keep adding 1." We live in the age of algorithms and "given n, output n + 1" is a perfectly intuitive concept to many people.

Everyone who ever started to learn how to program came to understand (often with great difficulty) the concept of looping, or endless repetition. If you can do something once you can do it forever. That is one of the main grokitudes of programming!
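The "always add 1" step is about as small as programs get; a toy sketch (names are mine):

```python
def successor(n):
    # "given n, output n + 1" -- the always-add-one step
    return n + 1

# Run the step any finite number of times; it can always be run once more.
x = 0
for _ in range(5):
    x = successor(x)
assert x == 5
```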

If you genuinely don't agree, and genuinely reject the concept of adding one, then I'm interested to learn more about what that means. And even if it's nothing more than a convenient fiction (which is exactly what it is!), so what? Even if it's all bullshit, Newton used it to calculate the motions of the solar system. Wouldn't you at least grant that the mathematical formalisms are useful and therefore worthy of study?
Last edited by wtf on Thu May 21, 2020 1:25 am, edited 4 times in total.
wtf

Posts: 357
Joined: Sun Dec 06, 2015 5:47 am

### Re: Is 1 = 0.999... ? Really?

Magnus Anderson wrote:The input of a function can be a number.

@Magnus You're getting a little ahead of me but I will try to catch up to all your replies to me. Just working on the other ones first! You have brought up a lot of items that I need to take time to reply to.
wtf

Posts: 357
Joined: Sun Dec 06, 2015 5:47 am

### Re: Is 1 = 0.999... ? Really?

No worries, it's always better for a person to take their time (:

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

Magnus Anderson wrote:
The input of a function can be a number.

The output of a function can be a number.

Evidently @Ed and you both care about the topic of whether functions and numbers are essentially different. I think it's kind of a distraction but just for sake of discussion I'll play.

First, a function goes from any set to any other set. We always denote a function as $$f : X \to Y$$, meaning that $$f$$ is a function that inputs an element of a set $$X$$ and outputs an element of a set $$Y$$.

Since everything in math is a set (in the standard set-theoretic formalism), a function can input anything and output anything.

For example a function can input and output functions. A familiar example is the derivative operator in one real variable. We have a function $$D$$ that inputs a function of one real variable and outputs another. For example $$D\,x^2 = 2x$$, $$D\sin x = \cos x$$, etc.

Or a function could input a set and output a number; for example the function that counts the number of elements of a finite set, and outputs -1 if the set's not finite. That's a perfectly valid function from the proper class of all sets to the natural numbers.

So functions can be completely arbitrary in terms of what they input and output. I don't see how this sheds light.

Magnus Anderson wrote:Functions can have properties that can be expressed using numbers e.g. the limit of a function is a number.

Well numbers have properties too. Everything has properties. So that doesn't distinguish functions from numbers.

Magnus Anderson wrote:However, functions themselves aren't numbers. A function is no more than a set of input-output pairs where every input is paired with exactly one output. A set of pairs of numbers is not a number (even though you can use a set of pairs of numbers to represent a number.)

Conceptually maybe not, but formally functions are often numbers. For example in mathematical logic, we use Gödel numbering to represent a function or a formula by a specific number.

Another example would be to use the fact that there are as many continuous functions from the reals to the reals as there are reals. So in principle there's a mapping that inputs a continuous function and outputs a real number that can be used as a proxy for it. Instead of saying cosine we can just say #45.3. Every function has an associated number. So again, the distinction between numbers and functions is less clear to me than it is to you and @Ed.
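The idea of representing a function by a single number can be illustrated with a toy prime-power encoding in the spirit of Gödel numbering (a hedged sketch, not the logicians' exact scheme; all names are mine):

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def encode(values):
    """Pack the finite function f(i) = values[i] into one integer:
    the i-th prime raised to the power values[i] + 1."""
    n = 1
    for p, v in zip(PRIMES, values):
        n *= p ** (v + 1)
    return n

def decode(n, length):
    """Recover the function's values by counting prime factors."""
    out = []
    for p in PRIMES[:length]:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        out.append(e - 1)
    return out

digits = [1, 4, 1, 5, 9]            # a finite function, as a list of values
assert decode(encode(digits), 5) == digits
```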

Magnus Anderson wrote:So Ed is right when he says that functions aren't numbers

Well ... I question the relevance or point of the observation, since it's not clear to me that it's true, and it's definitely clear to me that it's a red herring in the .999... discussion. I don't get the bit about numbers and functions. Set theory doesn't distinguish between numbers and functions, they're both different types of sets. So I honestly don't know what point is being made here.

Magnus Anderson wrote: but he's also wrong because he says that $$0.\dot9$$ is a function (which it is not.)

Oh but of course it is. Every decimal expression is a function $$d : \mathbb N_+ \to D$$ where $$D$$ is the set of decimal digits $$D = \{0,1,2,3,4,5,6,7,8,9 \}$$. That's what a decimal expression is. You give me the number 47, I give you back the 47-th decimal digit. (Just referring to the digits to the right of the decimal point, we can patch up the idea to account for the leftward digits if needed). You give me the number 545535 and I return that digit. That is exactly what a decimal expression is, a function from the set of positive natural numbers to the set of digits.

I'm using $$\mathbb N_+$$ which is the set 1, 2, 3, ... so that the first place to the right of the decimal point is 1 and not 0 for convenience. I hope that's clear.

You see this, right? $$\pi$$ is a function, $$\sqrt 2$$ is a function. [After we deal with the pesky leftward digits].

What function represents $$\pi - 3$$?

f(1) = 1
f(2) = 4
f(3) = 1
f(4) = 5
f(5) = 9
etc.
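That table can be written directly as a function in code (a sketch; the digit string is a short precomputed prefix of pi - 3, and the name d follows the notation above):

```python
PI_MINUS_3_DIGITS = "1415926535"  # first ten decimal digits of pi - 3

def d(n):
    """A decimal expression as a function d : N+ -> {0,...,9}:
    input a position, output the digit at that position."""
    return int(PI_MINUS_3_DIGITS[n - 1])

assert [d(n) for n in range(1, 6)] == [1, 4, 1, 5, 9]
```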
wtf

Posts: 357
Joined: Sun Dec 06, 2015 5:47 am

### Re: Is 1 = 0.999... ? Really?

Hi Magnus,

What type of entity do you believe .999... to be?

Thanks Ed
"Albert! Stop telling God what to do." - Niels Bohr
Ed3
Thinker

Posts: 886
Joined: Sun Oct 31, 2004 2:56 pm
Location: Minneapolis, MN

### Re: Is 1 = 0.999... ? Really?

The quibble over function or number reminds me of grammar.

Function is to number as verb is to noun. The definition of the word is the same, but is the definition being used differently - as a doing or as a being? Are humans "beings" or "becomings"? Well they're still humans.

Then there's the distinction between the definite and indefinite article, or better yet - the type/token distinction. "A human" is one specific concrete specimen, whilst "human" is abstract humanity in general. Again, the definition of human: what we're dealing with and what specifically is meant, is the same.

The same goes for function and number - the meaning, and what we're dealing with is the same.
What is it that is the same in this case? Quantity.

"A quantity" is what I've been referring to as a concrete representation. "Quantity" is abstract.
You can represent "a quantity" as a function or number, arriving at it algorithmically (the journey) or the final result of doing so (the destination). "The quantity" represented is the same. "Quantity" means the same thing either way.

So yes, this objection of whether something is function or number is superficial at best, and meaningless at worst.

@wtf, what did I say about endless repetition when trying to deal rationally with irrationality?
The irrational are like flies trying to fly through a window, trying the same thing over and over in slightly different ways, and as soon as a new angle is attempted they forget their mistake in trying the old angle, and soon repeat their attempt.
The window is never escaped, sometimes even if you literally open the window for them.

I wonder if the mental block comes from how "1" is being thought of as being arrived at from one side only? In the case of "building" the representation $$0.\dot9$$ from $$0.9$$ through $$0.99$$ etc. it's approached from below only.
If it was representationally approached from above, would it also be "1 plus some infinitesimal" as well as "1 minus some infinitesimal" simultaneously?

The algorithmic functional "doing" to get there is superficial. How the number "looks" is superficial.

A couple of people have mentioned Cauchy sequences. $$0.\dot9$$ is Cauchy because there is no quantity it approaches other than "1".
Representationally, you can get arbitrarily close to "1", but as above, whilst "the representations" can differ, "the quantity" is identical.
You can represent the quantity $$0.\dot9$$ as different from "1" with a Dedekind cut, but again this is only representation. The "quantity" is equal.
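
For what it's worth, the Cauchy condition can be verified directly on the partial sums $$s_n = 0.9 + 0.09 + \cdots + 9 \times 10^{-n}$$: for $$m > n$$,

$$|s_m - s_n| = \sum_{k=n+1}^{m} \frac{9}{10^k} = \frac{1}{10^n} - \frac{1}{10^m} < \frac{1}{10^n}$$

which falls below any given $$\varepsilon > 0$$ once $$n$$ is large enough, so the sequence is Cauchy.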

Silhouette
Philosopher

Posts: 4136
Joined: Tue May 20, 2003 1:27 am
Location: Existence

### Re: Is 1 = 0.999... ? Really?

Hi wtf,

You might be onto something here. I am thinking about Von Neumann and Godel numbering, but I need to give it some more thought.

However, with this specific example aren't you getting into some trouble representing an uncountable object with a countable object? In this specific case I think you need sequences with limits.

Ed
"Albert! Stop telling God what to do." - Niels Bohr
Ed3
Thinker

Posts: 886
Joined: Sun Oct 31, 2004 2:56 pm
Location: Minneapolis, MN

### Re: Is 1 = 0.999... ? Really?

Hi wtf,

I think I have screwed up my comments about countable/uncountable. Though I am still not certain about your representation for Pi - 3.

Ed
"Albert! Stop telling God what to do." - Niels Bohr
Ed3
Thinker

Posts: 886
Joined: Sun Oct 31, 2004 2:56 pm
Location: Minneapolis, MN

### Re: Is 1 = 0.999... ? Really?

Perhaps some confusion can be cleared by pointing to the matter, or its corresponding idea, relating to languages in general, mathematics being a quantifier, leading to this proposition:

'A formula beginning with a quantifier is called a quantified formula. A formal quantifier requires a variable, which is said to be bound by it, and a subformula specifying a property of that variable.

Formal quantifiers have been generalized beginning with the work of Mostowski and Lindström.'

As far as generalization is concerned, it appears to pair with a tendency to integrate sets that functionally demand such, to qualify within reasonable set specifications.

So the function may be set differently,
but it tends to integrate within some mixed set consisting of both specified and more general characteristics.
At least this is what appears to be implied here.

I may be way off with this generalization, but it seems credible.
Meno_
ILP Legend

Posts: 6400
Joined: Tue Dec 08, 2015 2:39 am
Location: Mysterium Tremendum

### Re: Is 1 = 0.999... ? Really?

Ed3 wrote:Hi Magnus,

What type of entity do you believe .999... to be?

Thanks Ed

Hello.

I take $$0.999\dotso$$ to represent the same thing as every other decimal number which is a sum of the following form:

$$\cdots + d_2 \times 10^2 + d_1 \times 10^1 + d_0 \times 10^0 + d_{-1} \times 10^{-1} + d_{-2} \times 10^{-2} + \cdots$$

Every $$d_n$$ represents a decimal digit which is an integer from $$0$$ to $$9$$.
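
That positional sum, truncated to finitely many digits, can be evaluated mechanically. A minimal Python sketch (the function name and input format here are my own invention, not anything standard):

```python
from fractions import Fraction

def positional_value(digits):
    """digits: iterable of (d, n) pairs, with digit d in 0..9 and integer
    exponent n. Returns the exact value of the sum of the terms d * 10^n."""
    return sum(Fraction(d) * Fraction(10) ** n for d, n in digits)

# 4*10^1 + 7*10^0 + 5*10^-1 = 47.5
print(positional_value([(4, 1), (7, 0), (5, -1)]))  # 95/2
```

Of course, the sketch only handles finitely many nonzero digits; the whole dispute is about what the notation means when the digits continue without end.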

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

Silhouette wrote:Function is to number as verb is to noun. The definition of the word is the same, but is the definition being used differently - as a doing or as a being? Are humans "beings" or "becomings"? Well, they're still humans.

Then there's the distinction between the definite and indefinite article, or better yet - the type/token distinction. "A human" is one specific concrete specimen, whilst "human" is abstract humanity in general. Again, the definition of human: what we're dealing with and what specifically is meant, is the same.

The same goes for function and number - the meaning, and what we're dealing with is the same.
What is it that is the same in this case? Quantity.

"Function" and "number" do not normally mean the same thing. You can make them mean the same thing, of course, but then, you must not equivocate.

How much money do you have? I have $$f(x) = x^2$$ money. What does that mean? Normally, it means nothing. But of course, if you have a need to, you can make it mean something.

You can use horses to represent numbers. You can say "This kind of horse represents this kind of number". For example, you can say that pegasuses represent number $$1,000$$, centaurs number $$100$$ and ponies number $$10$$. This allows you to do arithmetic with horses. You can say "A pony multiplied by a centaur equals a pegasus" without being wrong.

You can do the opposite too. You can say "This kind of number represents this kind of horse". You can say number $$1,000$$ represents pegasuses, and because pegasuses can fly, you can conclude, without making a mistake, that $$1,000$$ can fly too.

It's all fun and games until you equivocate.

For example:

1) All numbers are shapeless.
2) All horses are numbers.
3) Therefore, all horses are shapeless.

Horses qua numbers are indeed shapeless, but what is argued here is that horses qua animals are shapeless, which is not true.

In the same way, $$0.999\dotso$$ qua limit is indeed $$1$$ but what is being argued is that $$0.999\dotso$$ qua sum is $$1$$, and that is not true.

Magnus Anderson
Philosopher

Posts: 4253
Joined: Mon Mar 17, 2014 7:26 pm

### Re: Is 1 = 0.999... ? Really?

Ed3 wrote:Hi wtf,

I think I have screwed up my comments about countable/uncountable. Though I am still not certain about your representation for Pi - 3.

Ed

You're not? What do you think a decimal expression like .1415926... means? It's a map from the positive integers to the decimal digits. 1 goes to 1. 2 goes to 4. 3 goes to 1.

The expression is then mapped to a convergent infinite series by summing 1/10 + 4/100 + ...
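
Taking that description at face value, the digit map and its partial sum can be sketched in a few lines of Python (a minimal illustration; the variable names are mine):

```python
from fractions import Fraction

# The expansion .1415926... as a map from positions to digits:
# position 1 goes to 1, position 2 goes to 4, and so on.
digits = {1: 1, 2: 4, 3: 1, 4: 5, 5: 9, 6: 2, 7: 6}

# Interpreting the map as the series 1/10 + 4/100 + ...,
# truncated here at position 7:
partial = sum(Fraction(d, 10**n) for n, d in digits.items())
print(partial)         # 707963/5000000
print(float(partial))  # 0.1415926
```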

I cannot imagine you not knowing this. Please explain where you're coming from. You have me totally confused.
wtf

Posts: 357
Joined: Sun Dec 06, 2015 5:47 am

### Re: Is 1 = 0.999... ? Really?

Magnus Anderson wrote:

Then I'm confused. If the question is, "Is .999... = 1 true in standard math?" the answer is yes without a shred of doubt. I could point you to a hundred books on calculus and real analysis. If we're talking about standard math, how could anyone hold a different opinion?

Magnus Anderson wrote:
The question is:
Does $$0.999\dotso = 1$$ hold true within the standard language of mathematics?

The question is not:
Does $$0.999\dotso = 1$$ hold true within some guy's personal language of mathematics?

But yes! In which case .999... = 1 in standard math and there is no question or dispute, other than to clarify for people what the notation means and why it's true within standard math. It's a theorem in ZF set theory. It's a convergent geometric series in freshman calculus. It's even true in nonstandard analysis, which some people aren't aware of. There's just no question about the matter.

So you really have me puzzled, Magnus. If you agree we're talking about standard math, what is the basis of your disagreement?

Magnus Anderson wrote:
I have my own language, so you will often see me using standard symbols in a non-standard way

Ok. So you admit that you are NOT talking about standard math, but rather about your private nonstandard use of mathematical symbols. In which case you can define .999... = 47 and I would have no objection. If that's one of the rules in your game, I am fine with it; just as I learned to accept that the knight can hop over other pieces in standard chess.

Magnus Anderson wrote:
(e.g. using $$\infty$$ to mean what JSS means by "infA".) With this in mind, one must approach my posts with care, lest they misunderstand me.

You have already said that you are talking about standard math AND that you are talking about your own private nonstandard math. It's not hard to misunderstand you!

Now let me talk delicately about infA. When I came to this forum several years ago, James was already a prolific poster, an ILP Legend to beat all ILP legends. I am reluctant to criticize him since he is not here to defend himself. He has far more mindshare on this forum than I do. I respect his prolific output, if not always its content.

That said, the concept of infA is confused and wrong in the extreme. The idea seems to be some sort of mishmash of the ordinal numbers, in which we do "continue counting" after all the natural numbers are exhausted; and nonstandard analysis, in which there are true infinite and infinitesimal numbers. The infA concept borrows misunderstood elements from each of these ideas and simply makes a mess.

One really valuable thing I got from this thread a few years ago is that James caused me to go deeply into nonstandard analysis, to the point where I understand its technical aspects. For that I appreciate James. But the infA concept is just bullpucky, I don't know what else to say.

Magnus Anderson wrote:
The point of contention is the meaning of the symbol $$0.\dot9$$.

It means exactly what I've described to @Ed3. It's a particular map from the positive integers to the set of decimal digits; a constant map, in fact, in which f(n) = '9' for all inputs n. We then interpret this symbol as a real number as in the theory of geometric series, in which it's proved rigorously that .999... = 1.
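
Under that standard reading, the value is a one-line geometric series computation, using the usual textbook formula $$\sum_{n=1}^{\infty} ar^{n-1} = \frac{a}{1-r}$$ for $$|r| < 1$$, here with $$a = \frac{9}{10}$$ and $$r = \frac{1}{10}$$:

$$\sum_{n=1}^{\infty} \frac{9}{10^n} = \frac{9/10}{1 - 1/10} = \frac{9/10}{9/10} = 1$$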

Again I agree that if you choose to make up a new system in which .999... has some other meaning, you are perfectly within your rights. After all there do happen to be many variants of chess; played on infinite boards, or with a new piece called the Archbishop, etc. If someone enjoys playing alternate versions of standard games it's ok by me.

Magnus Anderson wrote:
If $$0.\dot9$$ represents the limit of the infinite sum that is $$0.9 + 0.09 + 0.009 + \cdots$$ then it is true that $$0.\dot9 = 1$$.

Yes ok. Then we are done!

Magnus Anderson wrote:
However, if $$0.\dot9$$ represents the infinite sum that is $$0.9 + 0.09 + 0.009 + \cdots$$ then it is NOT true that $$0.\dot9 = 1$$.

I fail to follow that. Did you learn geometric series at one point? The definition of a limit? I can't tell where you're coming from.

Magnus Anderson wrote:
The limit of the sum is not the sum itself. They are two different things.

Actually the limit is defined as the sum. That's what a limit is. It's a clever finessing of the idea of "the point at the end" or whatever. You are adding your own faulty intuition.

If you would consult a book on real analysis you would find that the sum of a geometric series is defined as the limit of the sequence of partial sums; and that the limit of a sequence is defined as a number that the sequence gets arbitrarily close to. I explained all this to @Phyllo the other day. That's the textbook definition.

You're just wrong in your impression, either because you had a bad calculus class (as most students do) or none at all. It's not till Real Analysis, a class taken primarily by math majors, that one sees the formal definition and comes to understand that the sum IS defined as the limit of the sequence of partial sums. That cleverly avoids the very confusion you've fallen into.
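
A quick numerical sketch of the definition just described: the partial sums $$s_n$$ get arbitrarily close to 1, with a gap of exactly $$10^{-n}$$ (minimal Python, using exact fractions to avoid floating-point noise):

```python
from fractions import Fraction

def partial_sum(n):
    """s_n = 9/10 + 9/100 + ... + 9/10^n, computed exactly."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 5, 10):
    s = partial_sum(n)
    print(n, s, 1 - s)  # the gap 1 - s_n is exactly 1/10^n
```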

Magnus Anderson wrote:
The argument put forward is that the mathematical establishment defines $$0.\dot9$$ as an infinite sum and NOT as the limit of an infinite sum.

Ah, the evil cabal of mathematicians. I will fully agree with you that most math TEACHERS form an evil cabal. One doesn't get all this stuff sorted out properly till one sees the formal definitions; at which point, one learns that the limit of a sequence is defined by arbitrary closeness. The belief you have is a bad intuition that mathematical training is designed to clarify. It's sad that we don't show this to people unless they're math majors, and you can rest assured that when I am in charge of the public school math curriculum the teaching of the real numbers will be a lot better.

Till then, I apologize on behalf of the math community that you weren't taught better. But limits are very rigorously defined and your idea is just wrong.

Again I hope I'm not coming on too strong, I'm criticizing your ideas and not you. I know you are sincere. Except for the part where you say you're talking about standard math AND that you're not. That point confused me.
wtf

Posts: 357
Joined: Sun Dec 06, 2015 5:47 am

PreviousNext