Is 1 = 0.999... ? Really?

Yet, pretty much every single mathematician says that (i) is a complex NUMBER. But sure, if you’re operating with a different definition of the word “number”, you might be correct. The question is: how is that relevant?

I feel the need to remind everyone that there is no place for statements such as “Feel free to research it” within this thread. This is supposed to be a logical debate.

i is the imaginary unit. It’s a concept. It’s not a number. A complex number has the form a+b*i. The i identifies the imaginary part of a complex number.

You switch to whatever number system you feel like whenever you feel like it.

Complex numbers have nothing to do with infinity or the question of 1=0.9… or 1<>0.9…

Therefore I do not want to discuss it in this thread. It’s off-topic.

You can research for yourself if I’m right or wrong about the ‘imaginary unit’ but I don’t care if you do or you don’t.

They don’t seem to call it the imaginary unit. They call it an imaginary number and they say it’s a complex number.

See here:
en.wikipedia.org/wiki/Imaginary_number

Besides that:
(i = 0 + 1 \times i)

You said that (\infty) is not a number because (\infty \div 2) cannot be expressed more simply.

More generally, your point seems to be that (a) is not a number unless (a \div 2) can be expressed more simply.

I mentioned complex numbers for two reasons:

  1. because complex numbers are widely recognized as numbers (and because (i) is widely recognized as a complex number)

  2. because (i \div 2) cannot be expressed more simply

This refutes your point.
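To make that second point concrete, here is a minimal sketch using Python’s built-in complex type (my own illustration, not anything from the thread): (i \div 2) is a perfectly ordinary complex number, (0 + 0.5i); there is simply no shorter symbol for it.

```python
# Python's built-in complex type; 1j plays the role of the imaginary unit i.
i = 1j

half_i = i / 2
print(half_i)        # prints 0.5j -- the ordinary complex number 0 + 0.5i
print(half_i * 2)    # prints 1j   -- doubling recovers i, so the division is well defined
```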

My point is that if infinity is a number then infinity/2 ought to be fairly easy to calculate.

So why isn’t it? Why do you need to say that it’s irreducible or why do you need to create a new symbol?

Because it’s not a number.

And my point is that your conclusion does not logically follow.

If (a \div 2) cannot be expressed more simply, it does not logically follow that (a) is not a number. It can simply mean that we did not invent a simpler way of representing the number represented by the expression (a \div 2).

This is evident in the case of (i). (i \div 2) cannot be expressed more simply, and yet, no mathematician will take this to mean that (i) is not a number.

Indeed, back when only natural numbers existed, there was no way to express (5 \div 2) more simply.

Examining this example shows some possible interpretations:

Both 5 and 2 are numbers in the natural number system. So what about the result of the division?

Either the result is undefined because 5/2 does not exist in the natural number system.

Or the result of 5/2 is 2 or 3 depending on how division of natural numbers is defined.
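As a minimal illustration of those two readings (a sketch in Python, my own, not part of the original exchange): within the integers, 5 ÷ 2 has no exact result, and “the result is 2 or 3” is just a choice of rounding convention.

```python
# 2 does not divide 5 exactly, so 5 / 2 has no result within the naturals.
print(5 % 2)        # 1   -- nonzero remainder
print(5 // 2)       # 2   -- floor division rounds down
print(-(-5 // 2))   # 3   -- ceiling division rounds up
print(5 / 2)        # 2.5 -- only available once we leave the naturals
```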

Now consider infinity within the real number system.

Assuming that infinity is a number, there appears to be no reason why the result of infinity/2 should not exist in the real number system.

Therefore, my assumption that infinity is a number must be wrong. I see no other explanation.

Let’s stick to the standard definition of division – the one that we apply to integers and reals the same way.

According to such a concept of division, the result of (5 \div 2) is neither (2) nor (3). In fact, the result is not a natural number.

Let us now suppose that our mathematical language is restricted to natural numbers and basic arithmetic operators (+, -, * and /).

This means that (5 \div 2) cannot be expressed more simply. There are other ways to express it, sure, but there are no simpler ways (i.e. there are no equivalent expressions that use fewer tokens.)

Does that mean the result is “undefined”?
Depends on what you mean by “undefined”.

Does that mean the result is not a number?
Not really. It simply means we have no way to express the resulting number more succinctly.

Does that mean that (5) is not a number?
Most definitely not!

Remember that your argument is that (a) is not a number if (a \div 2) cannot be expressed more simply.

Notice that you don’t address my point about infinity at all.

Why wouldn’t infinity/2 be a real number?

Because infinity isn’t a real number.

The symbols (\pm \infty ) are defined rigorously as extended real numbers. They’re used as convenient shorthands in real analysis and measure theory.

But even as extended real numbers, division of those symbols by other real numbers is not defined. Which just means that it’s not defined. It has no standard definition and there’s no sensible way of creating one.

The specific rules for the defined arithmetic properties of (\pm \infty ) are here:

en.wikipedia.org/wiki/Extended_real_number_line
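For a rough feel of those rules, here is a small sketch (my own, in Python) using IEEE-754 floats, which mimic, but are not identical to, the extended-real conventions described in that article.

```python
inf = float('inf')

print(inf + 1)    # inf  -- adding a finite real leaves infinity unchanged
print(2 * inf)    # inf  -- multiplying by a positive real likewise
print(1 / inf)    # 0.0  -- a finite real divided by infinity gives 0
print(inf - inf)  # nan  -- left undefined, as in the extended reals
print(inf / inf)  # nan  -- likewise undefined
```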

I ignored it because you’re repeating a point I addressed earlier.

(\infty) is not a real number. Neither is (\infty \div 2). Nonetheless, they are both numbers.
(Albeit, as wtf noted, the latter expression has no recognized meaning within the official language of mathematics, which means nothing with regard to this topic.)

You appear to be saying one of the following:

  1. if (a) is not a real number then (a) is not a number

  2. if (a \div 2) is not a real number then (a) is not a number

Neither of those is a logically valid argument.

I said it earlier: the set of real numbers is not the set of all numbers. There are numbers that are not real numbers e.g. complex numbers and hyperreal numbers.

I will let WTF have a turn on the roller-coaster ride.

Thanks, but I’m afraid I’m getting off the roller coaster. Magnus wrote, “(Albeit, as wtf noted, the latter expression has no recognized meaning within the official language of mathematics, which means nothing with regard to this topic.)”, my emphasis. If math isn’t what the thread’s about, I’m at a loss to contribute.

I did write extensively in this thread several years ago explaining why .999… = 1, but clearly to no avail. Long answer short, though, it’s because .999… = 1 is a theorem that can be proved (by a computer if one likes) in standard set theory.

There’s no other reason. Once you define the symbols as they are defined in standard math, the conclusion follows. If you define them differently, you can say that .999… = 47 or anything else you like. There is no moral or absolute truth to the matter; it’s strictly an exercise in defining the notation and then showing that the theorem follows. Just like 1 + 1 = 2. If you give the symbols different meanings, you get a different truth value. With the standard meanings of the symbols, the statement is true.

But none of this is of much interest, I gather. It’s “not what the thread’s about.” Perhaps I never understood what this thread is about.

If you have been following this thread, then you know that he has rejected the operations on infinity which are described in that wiki article:
[attached image: extreal.JPG, a screenshot of the arithmetic operations defined in that article]

I was trying to show the contradictions within his own ideas about infinity.

Only sporadically, and (I admit) with dismay.

None of this bears on the original topic. The infinity of the extended reals has nothing, repeat nothing, to do with the infinite cardinals and ordinals of set theory, or the meaning of positional notation in decimals. It’s a red herring and a distraction to the question of .999… = 1.

The notation .999… is a shorthand for the infinite series 9/10 + 9/100 + 9/1000 + … which sums to 1 as shown in freshman calculus. There truly isn’t any more to it than that, though if desired one could drill this directly down to the axioms of set theory.
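As a quick numerical check (my own sketch in Python, not part of the post): each partial sum of that series equals (1 - 10^{-n}), so the partial sums get arbitrarily close to 1.

```python
from fractions import Fraction

# Partial sums of 9/10 + 9/100 + 9/1000 + ...; each equals 1 - 10**(-n),
# so the sums approach 1.
for n in (1, 2, 5, 10):
    partial = sum(Fraction(9, 10 ** k) for k in range(1, n + 1))
    assert partial == 1 - Fraction(1, 10 ** n)
    print(n, partial, float(partial))
```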

But this point has been made repeatedly by myself and others in this thread to no avail. I confess to not understanding the objections. It’s like asking if the knight in chess “really” moves that way. The question is a category error. Chess is a formal game, and within the rules of the game, the knight moves that way. There is no real-world referent. Likewise with a notation such as .999… By the rules of the game, it evaluates to 1. It’s a geometric series. If you make some other rules, you can get a different result. Efforts to raise philosophical objections to this are likewise category errors. Rules of formal games aren’t right or wrong. They’re only interesting and useful, or not. The rules of math turn out to be interesting and useful, so we teach them.

Oh I agree completely.

I wasn’t finished typing! But if you agree with me on anything, you’d be the first person to do so in the history of this thread! :slight_smile:

I agree with you on this topic too.

I think every competent mathematician who has contributed to this thread has been saying things that agree with what you’ve been saying. The entire mathematical community in fact (philosophers or not) says what you’ve been saying.

It’s only been the ones who’ve admitted “I’m not a mathematician, but…”, or words to that effect, who have been trying to say things that do not agree with what you’ve been saying.

There’s only really one of these types left fighting his corner, simply accusing any content that disagrees with him of being irrelevant, or never having existed in the first place. It’s a chronic case of confirmation bias, other cognitive biases, and logical fallacies - that’s really all there is to this thread.

Dude! You guys! For some bizarre reason I have been relegated to the corner with a dunce cap!

I have two arguments! 1.) There are no greater infinities! 2.) Infinite series don’t converge!

I have no earthly clue why you disagree with me!

1.) no greater infinities: the disproof I use for this is what I call “the cheat”. What is “the cheat”? Very simple!

1.) rational number
2.) irrational number
3.) imaginary number
4.) different rational number
5.) different irrational number
6.) different imaginary number

You can list every number on this single list!

Etc…

My disproof for convergent series!

If you take any rational, irrational or imaginary number and divide it in half, let’s say the number 1!

0.5 + 0.5 = 1

Then you divide that in half!

0.25 + 0.25 + 0.25 + 0.25 = 1

When you divide that in half!

You get:

0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 = 1

Etc…

When you push this series to the convergent limit, eventually 0=1

Contradiction.

In order for series to converge (and I reverse engineered the problem to prove this)… 1 MUST equal zero!!

I can’t believe this thread is still going on! I made actual proofs for this!
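For what it’s worth, here is a minimal sketch (my own, in Python) of the finite stages of that halving construction: at stage n there are 2^n terms, each equal to 1/2^n, and their sum at every stage is exactly 1.

```python
# Stage n of the halving construction: 2**n copies of 1/2**n.
for n in range(1, 6):
    terms = [1 / 2 ** n] * 2 ** n
    print(n, len(terms), sum(terms))  # the terms sum to exactly 1.0 at every stage
```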

This is what brings me back to this thread. I want to understand the objection that is being made.

If I can sum up the way this feels to me, the pattern goes like this:

  • I give any one of half a dozen perfectly valid reasons why .999… = 1. (None of them are those semi-fallacious “multiply .999… by ten” or “multiply 1/3 by 3”, for the record. I regard those proofs as essentially circular and agree that they can be fairly criticized.)

  • I am immediately accused of being a literal-minded mathematician, brainwashed by the orthodoxy and only able to spout what I’ve been taught, unable to think for myself or comprehend the larger point that they seem to be making. As a brainwashed mathematician I am incapable of seeing why my mathematical response to a mathematical question is totally inadequate.

  • But that’s the thing. I can never pin them down on exactly what extra baggage they’re imputing to the symbols. They’re mathematical symbols, hence their meaning comes only from their definition, and they do not necessarily refer to anything in the real world.

So this is my mystery. I want to understand what extra ontological power they assign to the harmless symbology .999… = 1. Of course it’s not true in the physical world; I’m sure I must have pointed that out early on.

So what then? What philosophical point are people trying to make? The truth is that there is no natural meaning to the symbols. They are assigned a meaning in mathematics. And by the meaning they are assigned, we can demonstrate that .999… = 1 in many different ways.

The only reason, in my opinion, that anyone would think decimal notations have any other meaning or ontology is faulty teaching along the way, not any philosophical insight. That would be my take on the situation.

I asked that question long ago and never got a satisfactory response, but I’ll just mention it here again. What is the secret sauce that leads anyone to think that .999… = 1 has any meaning whatever besides what mathematicians give it?

tl;dr: The thread subject asks a mathematical question and is fully answered by a mathematical response.

The notion that it’s anything more than a mathematical question is a false belief grounded in mathematical ignorance; which I generally do not hold against the individual, but rather the mathematics education establishment.

I agree. Math ‘lets’ it. Because it is agreeable. But the order of the notation has been lost. Abandoned for a “new” theoretical math. Counting from zero to infinity and beyond, within infinite sets, some grand and others infinitesimal. Doesn’t get any more philosophical than that. Slicing and dicing the infinite.

That’ll take you places.