Objectivism Online Forum

why does 1/3=.333333333



The Wrath


The thread about .99999999 prompted me to make this thread. I've thought about it before though...why is it that 1/3=.333333333? I mean, when you take .33333333333 and multiply it by 3, you do not get one. I'm assuming the explanation is similar to the ones that some people gave in the other thread


The thread about .99999999 prompted me to make this thread.  I've thought about it before though...why is it that 1/3=.333333333?  I mean, when you take .33333333333 and multiply it by 3, you do not get one.  I'm assuming the explanation is similar to the ones that some people gave in the other thread

1/3 doesn't really = .3333 repeating. 1/3 is an irrational number, meaning that it can't be accurately represented as a decimal. Anytime you represent 1/3 as a decimal it is only an approximation, no matter how many 3s you put on the end of it

Edit: I made a mistake, 1/3 is not an irrational number. An irrational number is a number that cannot be represented as an exact ratio of two integers. 1/3 is just a ratio that cannot be represented as a terminating decimal.

Edited by Bryan

1/3 doesn't equal .333333333; it equals .33333~, where the ~ signifies that the sequence of 3s repeats to infinity (some people write three dots, as in 0.33333..., instead). All real numbers can be represented by an infinite decimal expansion in this way.

When you multiply .33333~ by 3 you get .9999~, an infinitely recurring sequence of 9s, which is equal to 1.

The argument given in the other thread to show that .999~ = 1 is informal and not particularly rigorous, but you can prove it properly using elementary calculus.
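For what it's worth, both claims can be checked with exact rational arithmetic rather than calculus. A quick sketch in Python (using the standard-library fractions module; variable names are mine):

```python
from fractions import Fraction

# Exact arithmetic: one third times three is exactly one.
third = Fraction(1, 3)
assert third * 3 == 1

# Any finite decimal truncation 0.3, 0.33, 0.333, ... falls short of 1/3,
# but the shortfall shrinks by a factor of 10 per extra digit.
for digits in (1, 5, 10):
    truncated = Fraction(10**digits // 3, 10**digits)   # 0.3, 0.33333, ...
    print(digits, truncated, third - truncated)         # gap = 1/(3 * 10^digits)
```

The point being that no finite truncation equals 1/3; only the complete, infinitely repeating expansion does.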

1/3 doesn't really = .3333 repeating. 1/3 is an irrational number, meaning that it can't be accurately represented as a decimal. Anytime you represent 1/3 as a decimal it is only an approximation, no matter how many 3s you put on the end of it

1/3 is a rational number. Any number of the form p/q, where p and q are integers (q nonzero), is rational. The difference between rational and irrational numbers is that the decimal expansion of a rational number is eventually periodic, i.e. the infinite sequence will repeat a certain pattern over and over again.

For instance, 2/7 = 0.285714285714285714~, which is just '285714' repeating to infinity, and 1/6 = 0.16666~ where the 6 repeats to infinity. However, an irrational number such as root(2) or pi will have a non-periodic decimal expansion, i.e. there is no sequence that repeats over and over again (for instance, the number represented by the decimal expansion 0.101001000100001000001etc, where the number of zeros in between the 1s gets larger and larger, will be irrational since there is nothing that repeats).
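The periodicity falls straight out of ordinary long division: there are only q possible remainders when dividing by q, so a remainder must eventually recur, and from that point the digits repeat forever. A sketch in Python (the function name is my own):

```python
def decimal_expansion(p, q, max_digits=30):
    """Long division of p/q (0 < p < q). Returns the digit string produced
    before a remainder repeats, plus the index where the repetition starts,
    or None if no repeat was found within max_digits."""
    seen = {}                    # remainder -> position where it first appeared
    digits = []
    r = p
    for i in range(max_digits):
        if r in seen:            # this remainder occurred before:
            return "".join(digits), seen[r]   # everything from there repeats
        seen[r] = i
        r *= 10
        digits.append(str(r // q))
        r %= q
    return "".join(digits), None

print(decimal_expansion(2, 7))   # ('285714', 0): the block 285714 repeats
print(decimal_expansion(1, 6))   # ('16', 1): 0.1666..., the 6 repeats
```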

Edited by Hal

The argument given in the other thread to show that .999~ = 1 is informal and not particularly rigorous, but you can prove it properly using elementary calculus.

The argument in the other thread is based on an algebraic error, not calculus. You can never say that .999999~ = 1, it approaches 1 but never quite equals it.


The argument in the other thread is based on an algebraic error, not calculus.  You can never say that .999999~ = 1, it approaches 1 but never quite equals it.

Yes you can, at least in the standard construction of the real numbers, where you don't have infinitesimals. The real numbers are dense, which means that in between any 2 real numbers is a third. If 0.999~ is not equal to 1, then which number is between them?

Edited by Hal

The proof using calculus.

0.999~ is the infinite series 0.9 + 0.09 + 0.009 + ..., which is a geometric progression with initial term 0.9 and ratio 1/10. From calculus, the sum of an infinite geometric series whose ratio lies strictly between -1 and 1 is (initial term)/(1 - ratio). Substituting in here we get (0.9)/(1 - 0.1) = 0.9/0.9 = 1.
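The partial sums can also be computed exactly, which makes it easy to watch the gap to 1 shrink. A sketch in Python using exact fractions:

```python
from fractions import Fraction

a, r = Fraction(9, 10), Fraction(1, 10)   # initial term and ratio

# Partial sums of 0.9 + 0.09 + 0.009 + ...: after n terms the
# gap to 1 is exactly 1/10^n.
s = Fraction(0)
for n in range(1, 8):
    s += a * r**(n - 1)
    print(n, s, 1 - s)

# Closed form for the whole series: a / (1 - r)
assert a / (1 - r) == 1
```

Every partial sum is less than 1, but the limit, which is what the notation 0.999~ denotes, is exactly 1.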


Yes you can, at least in the standard construction of the real numbers where you don't have infinitesimals. The real numbers are dense, which means that in between any 2 real numbers is a third. If 0.999~ is not equal to 1, then which number is between them?

There is no number between them, .999~ is the largest number less than 1.

0.999~ doesn't actually exist in reality.


There is no number between them, .999~ is the largest number less than 1.  0.999~ doesn't actually exist in reality.

There is no largest number less than 1; as I said, the real numbers are infinitely dense. I'm not sure what you mean by 0.999~ doesn't exist in reality.

It makes more sense (to me at least) if you think of 0.999~ as being an infinite series (0.9 + 0.09 + 0.009 + ...), which is how it's defined mathematically, rather than as a big long list of 9s.
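The density point is easy to illustrate: for any x strictly below 1, the midpoint (x + 1)/2 is strictly between x and 1, so no x below 1 can be "the largest number less than 1". A minimal sketch in Python (exact rationals; the helper name is mine):

```python
from fractions import Fraction

def between(x, y):
    """A number strictly between any two distinct rationals: their midpoint."""
    return (x + y) / 2

# However close to 1 a candidate 'largest number below 1' is,
# its midpoint with 1 is still closer -- so no such number exists.
x = Fraction(999_999_999, 1_000_000_000)
m = between(x, 1)
assert x < m < 1
```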

Edited by Hal

Bryan,

The argument in the other thread is based on an algebraic error, not calculus. You can never say that .999999~ = 1, it approaches 1 but never quite equals it.

...

There is no number between them, .999~ is the largest number less than 1.

0.999~ doesn't actually exist in reality.

I'm confused by these statements.

Is .999~ a number, or not? If so, how do you define this number, what is it?

What do you think it means for a number to exist in reality?

What does it mean for a number to 'approach' another number?

You know that there is no greatest number less than one, right? If you think you have such a number x, then (1 + x)/2 is still less than one, but greater than x.

If 0.999~ doesn't exist in reality, what do you think about any other irrational number like pi or the square root of two? Since we can only give finite decimal approximations of these numbers, do they also not exist in reality?


Bryan,

I'm confused by these statements. 

Is .999~ a number, or not?  If so, how do you define this number, what is it?

I'm in the minority here, but I still hold that .999~ is not 1. Abstractly speaking, it is the largest number that is less than 1. The difference between .999~ and one is infinitely small, but there is a difference.

What do you think it means for a number to exist in reality?
For a number to exist in reality, it needs to be represented as a measurement of something. Since infinity does not actually exist, infinite decimal numbers don't exist. In the example of 1 and .999~, let's pretend you have something with a mass of 1 unit. You remove 1 single electron from it. Now the mass is 1 unit minus the mass of one electron. It's a measurable amount and a finite number. You can't take an infinitely small amount away, because infinity doesn't exist.

What does it mean for a number to 'approach' another number?

A number can't really approach a number. When I said that .999~ approaches 1, I meant that the infinite sum 0.9 + 0.09 + 0.009 + ... approaches 1. It never actually equals one because the sum never ends, but every time you carry out another iteration of it, you get a number that is closer to 1.

You know that there is no greatest number less than one, right?  If you think you have such a number x, then the (1 + x)/2 is still less than one, but greater than x.
This is why I said that .999~ doesn't actually exist in reality. The function you gave above is a perfect example of what .999~ is. Let's say that you performed (1+x)/2 over and over, starting with x = .99, and then replaced x with the result and performed the operation again. If you did this an infinite number of times, your "final" answer would be .999~.

If 0.999~ doesn't exist in reality, what do you think about any other irrational number like pi or the square root of two?  Since we can only give finite decimal approximations of these numbers, do they also not exist in reality?

Obviously pi and sqrt(2) exist; they just can't be 100% accurately represented using decimal numbers. That has nothing to do with their nature, it has to do with the nature of the things they describe.


Bryan,

0.999~ doesn't refer to anything in reality? You'd better be careful about statements like that. Does -1 refer to anything in reality? How about i = sqrt(-1)? If these numbers are concepts of method, then why isn't 0.999~? After all, just as negative and complex numbers have many applications, so do summing series and limits (in fact, they make all of calculus possible, which is the only thing that makes modern physics possible).

Also, I wasn't able to get a straight answer out of you about the status of 0.999~. Is it a real number? An 'abstract' number (abstract in what sense)? One minus an infinitesimal (what is an infinitesimal)? The greatest number less than one (which can't happen, btw)? A sequence of rational numbers? You seem to be using all of these definitions interchangeably.

I contend that the only way to make any sense whatsoever of the expression 0.999~ is for it to represent the limit of the sequence (0.9, 0.99, 0.999, ...). Since one can show that this sequence converges to 1, it follows that 0.999~ = 1. I agree that the sequence never does reach one--that's fine. But the limit of the sequence is what 0.999~ denotes, and that is equal to 1.


Bryan,

0.999~ doesn't refer to anything in reality?  You'd better be careful about statements like that.  Does -1 refer to anything in reality?  How about i = sqrt(-1)?  If these numbers are concepts of method, then why isn't 0.999~?  After all, just as negative and complex numbers have many applications, so do summing series and limits (in fact, they make all of calculus possible, which is the only thing that makes modern physics possible).

No, I don't see the particular number .999~ directly referring to anything in reality. Infinity only exists as an abstract concept; therefore infinite series only exist as concepts. Let's be clear about mathematics. In ITOE, Ayn Rand aptly defines mathematics as "the science of measurement". Calculus isn't made possible by infinite series and limits; it is made possible by reality. It was developed as a manner of describing and measuring things in reality. Infinite sums and limits are used to describe calculus concepts, not the other way around.

Also, I wasn't able to get a straight answer out of you about the status of 0.999~.  Is it a real number?  An 'abstract' number (abstract in what sense)?  One minus an infinitesimal (what is an infinitesimal)?  The greatest number less than one (which can't happen, btw)?  A sequence of rational numbers?  You seem to be using all of these definitions interchangably.
Mathematically speaking, a real number is a non-complex number, so yes, it's a real number. It is an abstract number in the sense that it only exists as a concept. I never said "infinitesimal", so I can't help you there. I demonstrated how .999~ is the largest number less than 1 with the function you provided [f(x) = (1+x)/2]; again, this is just an abstract concept, perhaps I wasn't clear. And yes, it is a sequence of rational numbers, specifically the sequence 0.9 + 0.09 + 0.009 + ...

I'm not using the definitions interchangeably; it's just that .999~ fits all these definitions.

I contend that the only way to make any sense whatsoever of the expression 0.999~ is for it to represent the limit of the sequence (0.9, 0.99, 0.999, ...).  Since one can show that this sequence converges to 1, it follows that 0.999~ = 1.  I agree that the sequence never does reach one--that's fine.  But the limit of the sequence is what 0.999~ denotes, and that is equal to 1.

The limit of the infinite sum 0.9 + 0.09 + 0.009 + ... does indeed equal 1. But this does not mean that the actual number .999~ is equal to 1. Just because the limit of something equals 1 does not mean that the something is 1.

0.999~ can be approximated by 1 because it's pretty damn close, but it's still not exactly 1. Nothing is exactly 1 but 1.


Bryan,

I appreciate your candor in trying to defend the idea that there is no metaphysical infinity. But don't misunderstand what it means-- in mathematics it's perfectly valid to talk about infinite sets, infinite series, etc., because mathematics is a discipline involving concepts of method.

No, I don't see the particular number .999~, directly referring to anything in reality.  Infinity only exists as an abstract concept; therefore infinite series only exist as concepts.
All numbers also 'only' exist as abstract concepts. In what way is the infinite series 0.999~ less real or less valid than the concept of 1?

Let's be clear about mathematics.  In ITOE, Ayn Rand aptly defines mathematics as "the science of measurement".

She also mentions that mathematics is a concept of method, which includes the valid concept of infinity to help aid the ultimate goal of mathematics to measure things.

Calculus isn't made possible by infinite series and limits, it is made possible by reality.  It was developed as a manner of describing and measuring things in reality.  Infinite sums and limits are used to describe calculus concepts, not the other way around.
First, both reality and limits are necessary to make a theory of calculus. Second, infinite sums and limits are calculus concepts! One uses the concepts of calculus to solve problems in reality, yes.

Mathematically speaking, a real number is a non-complex number, so yes, it’s a real number.

...? This isn't how real numbers are defined. It merely steals the concept of 'complex numbers', which are defined in terms of the real numbers.

It is an abstract number in the sense that it only exists as a concept. I never said "infinitesimal" so I can't help you there.
First of all, all numbers 'only' exist as concepts. You did say "infinitely small amount away", which is what an infinitesimal is. Care to elaborate on what an "infinitely small amount" is?

I demonstrated how .999~ is the largest number less than 1 with the function you provided [f(x) = (1+x)/2]; again, this is just an abstract concept, perhaps I wasn't clear.

The reason I brought up that expression was because it helped to prove that there is no greatest number less than one. The argument goes as follows: Suppose there is a largest value less than one. Call it 'x'. Then (x + 1)/2 is strictly greater than x, but strictly less than one. Therefore, by the definition of x, it follows that (1 + x)/2 <= x. But then x < (x + 1)/2 <= x. This means that x is something other than itself. But A is A, so contradictions do not exist; one of our premises is wrong. The wrong premise is the beginning one, that x is the largest number strictly less than one-- no such number can exist.

And yes it is a sequence of rational numbers, specifically the sequence 0.9 + 0.09 + 0.009 + ...
Let's get something straight right now. A sequence is a countable list of numbers. A series is the limit of a sequence of partial sums of numbers. The above isn't a sequence, it's a series. And anyway, if you really accept this, you've already admitted that .999~ = 1.

I'm not using the definitions interchangeably; it's just that .999~ fits all these definitions.

If you open ITOE to the section "Definitions", Rand defines a definition to be "a statement that identifies the nature of the units subsumed under a concept." Moreover, it is the statement which condenses and explains all the other knowledge of the concept in question under the context of one's current knowledge. If you try to do this for the various 'definitions' for 0.999~ you present above, you'll find little more than an anti-concept, since most of those definitions don't even describe the same objects and are contradictory.

The limit of the infinite sum 0.9 + 0.09 + 0.009 + ... does indeed equal 1.  But this does not mean that the actual number .999~ is equal to 1.  Just because the limit of something equals 1 does not mean that the something is 1. 

What else is the 'actual number' .999~ besides the limit of the sequence (0.9, 0.99, 0.999, ...)? I agree that just because the limit of a sequence is one does not mean the sequence is one. Sequences aren't numbers, and I never claimed that they were.

0.999~ can be approximated by 1 because it's pretty damn close, but it's still not exactly 1. Nothing is exactly 1 but 1.

How close is it? Blank out.


Consider three sequences:

A is 0.9, 0.99, 0.999, 0.9999, etc.

B is 1.1, 1.01, 1.001, etc.

C is 0.9, 1.1, 0.99, 1.01, 0.999, 1.001, etc.

We define limA, limB, limC to be the limits of the sequences.

If limA differs from 1, then so does limB (and limA does not equal limB). If this is true, then you have to say that limC is undefined, since it consists of two infinite subsequences converging to two different values.

If however you say that limC = 1, then you have to say that the limit of any infinite subsequence of C is 1 and, in particular, that limA = limB = 1.

Do you really want to go so far as to say limC is undefined?
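The three sequences above are easy to generate and compare. A sketch in Python (floats are fine at this scale; the function names are mine):

```python
def A(n):                        # 0.9, 0.99, 0.999, ...
    return [1 - 10**-k for k in range(1, n + 1)]

def B(n):                        # 1.1, 1.01, 1.001, ...
    return [1 + 10**-k for k in range(1, n + 1)]

def C(n):                        # A and B interleaved
    out = []
    for a, b in zip(A(n), B(n)):
        out.extend([a, b])
    return out

# All three sequences get arbitrarily close to 1 -- from below, from
# above, and from both sides at once -- so the natural value for each
# limit is 1.
for seq in (A(6), B(6), C(6)):
    print(max(abs(t - 1) for t in seq[-2:]))   # tail terms hug 1
```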

Edited by punk

In ITOE, Ayn Rand aptly defines mathematics as "the science of measurement".  Calculus isn't made possible by infinite series and limits, it is made possible by reality.  It was developed as a manner of describing and measuring things in reality.  Infinite sums and limits are used to describe calculus concepts, not the other way around.
I think that this is a bad definition. Mathematics can certainly be used to measure things, but saying that it is 'just' about measuring things (or even mainly about measuring things) is incorrect. What measurements are involved in complex analysis? Or number theory? Are these not valid areas of mathematics?

What do you mean by calculus is "made possible by reality"? I suppose this is trivially true in the sense that if nothing existed then no one would have been able to come up with calculus, but calculus can be (and usually is) defined in a purely formal manner that makes no mention whatsoever of physical or geometric objects, and it still works perfectly well. Calculus is useful because of certain aspects of reality, yes.

Edited by Hal

Bryan,

I appreciate your candor in trying to defend the idea that there is no metaphysical infinity. But don't misunderstand what it means-- in mathematics it's perfectly valid to talk about infinite sets, infinite series, etc., because mathematics is a discipline involving concepts of method.

I understand that it is valid to discuss infinity; my point in this whole thread is that you can't use the concept of infinity to set a number equal to a number that it is not equal to.

All numbers also 'only' exist as abstract concepts. In what way is the infinite series 0.999~ less real or less valid than the concept of 1?

They are both valid concepts, the difference being that 1 can be directly related to things in reality.

...? This isn't how real numbers are defined. It merely steals the concept of 'complex numbers', which are defined in terms of the real numbers.

Do you care to give a definition? Mine was OK, just kind of sloppy. Here are some others from Google:

"any number that is not imaginary. Example: 1.23156..., 5, 8/6, e, square root(3)"

"A number with an integer and a fractional part. The primitive types double and float are used to represent real numbers."

"A real number is one-dimensional and can be placed somewhere on the number line. The set of real numbers includes all rational and all irrational numbers."

"Any finite or infinite decimal. Any rational or irrational number."

First of all, all numbers 'only' exist as concepts. You did say "infinitely small amount away", which is what an infinitesimal is. Care to elaborate on what an "infinitely small amount" is?

If you accept the concept of infinity, something can be infinitely large or infinitely small. The only limit on the smallness is that it can never be zero; it always has to be something.

The reason I brought up that expression was because it helped to prove that there is no greatest number less than one. The argument goes as follows: Suppose there is a largest value less than one. Call it 'x'. Then (x + 1)/2 is strictly greater than x, but strictly less than one. Therefore, by the definition of x, it follows that (1 + x)/2 <= x. But then x < (x + 1)/2 <= x. This means that x is something other than itself. But A is A, so contradictions do not exist; one of our premises is wrong. The wrong premise is the beginning one, that x is the largest number strictly less than one-- no such number can exist.

Let's do this with numbers:

y = (x+1)/2, where x is .99: y = .995

Now substitute .995 in for x:

y = (.995+1)/2: y = .9975

Keep repeating over and over, replacing x each time with the result for y the time before. What is the "final" answer after an infinite number of iterations? .999~

This is why .999~ is the largest number that is less than 1.

What else is the 'actual number' .999~ besides the limit of the sequence (0.9, 0.99, 0.999, ...)?

It is the result of infinite iterations of the recursive function x = (x+1)/2.

It is the result of the infinite sum of the sequence (.9, .99, .999, .9999, ...).

How close is it? Blank out.

It is an infinitely small amount less.

Edit: Sorry about the text wrapping of this post; it turns out that going from the message board to Gmail to Word and back to the message board messes it up :thumbsup:

Edited by Bryan

I think that this is a bad definition. Mathematics can certainly be used to measure things, but saying that it is 'just' about measuring things (or even mainly about measuring things) is incorrect. What measurements are involved in complex analysis? Or number theory? Are these not valid areas of mathematics?

Declaring Miss Rand's definition to be bad depends on your definition of measurement. If you define measurement as the identification of relationships, mathematics certainly qualifies as the science of measurement.

As far as complex analysis goes, it can be directly used to measure the magnitude and phases of voltages and currents in electrical systems.

I have no experience in number theory but I googled it and it said "A branch of mathematics that investigates the relationships and properties of numbers." That being said, you could describe number theory as the measurement of numbers themselves.

What do you mean by calculus is "made possible by reality"? I suppose this is trivially true in the sense that if nothing existed then noone would have been able to come up with calculus, but calculus can be (and usually is) defined in a purely formal manner that makes no mention whatsoever of physical or geometric objects, and it still works perfectly well. Calculus is useful because of certain aspects of reality, yes.

I was trying to make a distinction between calculus being used as a mathematical tool to describe reality as opposed to the mathematical tools that are used to describe calculus (i.e. limits and infinite sums).


This is an absurd thread. Why does 1+1=2? The Law of Identity says it does. A thing is what it is. A is A. .333333333333333333333333333333~ = 1/3 and .99999999999999999999999999999999999999999999999999999999999999~ = 1 for the exact same reason. They're just two different ways of writing the same numbers.

For f(x) = (x+1)/2, as x gets arbitrarily close to 1, the limit is 1. The key word is arbitrarily, which is usually taken to mean infinitesimally close. But the infinitesimal or the arbitrary doesn't exist in reality. And that's why .99999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999~ = 1. A is A.


Bryan,

I understand that it is valid to discuss infinity; my point of this

whole thread is that you can't use the concept of infinity to set a

number equal to a number that it is not equal to.

You haven't even told me what 0.999~ is yet, let alone that it's a number not equal to one. In fact, if you assume that 0.999~ < 1, you can get a pretty big contradiction.

Suppose that 0.999~ < 1. Then e = 1 - 0.999~ is a positive number. Hence, there exists an integer N such that 1/10^N < e. Now it should be clear that 0.99...9 (N 9's) < 0.999~. But this implies that e = 1 - 0.999~ < 1 - 0.99...9 (N 9's) = 1/10^N < e. Once again, we get that e is not itself, which is a contradiction.
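That argument can be played out numerically: whatever positive gap e you propose for 1 - 0.999~, a finite truncation 0.99...9 already lands within e of 1. A sketch with exact fractions (the names are mine):

```python
from fractions import Fraction

def nines(n):
    """0.99...9 with n nines, exactly: 1 - 10^-n."""
    return 1 - Fraction(1, 10)**n

# Propose any positive gap e = 1 - 0.999~; a finite truncation
# already gets closer to 1 than e, contradicting e's definition.
for e in (Fraction(1, 1000), Fraction(1, 10**12)):
    N = 1
    while Fraction(1, 10)**N >= e:   # Archimedean step: find 10^-N < e
        N += 1
    assert 1 - nines(N) < e          # 0.99...9 (N nines) beats the 'gap'
```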

They are both valid concepts, the difference being that 1 can be directly

related to things in reality.

Okay, that's a good step, and you're right-- the concept 1 is far less abstract than 0.999~.

Do you care to give a definition?
If you insist...

The way the real numbers are constructed from the rationals is as equivalence classes of Cauchy sequences of rational numbers, equipped with termwise addition and multiplication. It's a result in analysis that this structure is an algebraic field and has the familiar order properties of the real line, as well as being complete.

Essentially, this means that every real number can always be identified with a sequence of rational numbers (a, b, c, ...). In our case, each of these sequences is equivalent to the real number 1:

(1, 1, 1, ...)

(0.9, 0.99, 0.999, ...)

(1.1, 1.01, 1.001, ...)

(0.9, 1.1, 0.99, 1.01, ...)

If you accept the concept of infinity, something can be infinitely large or infinitely small.

You're right, to this extent: if you accept axiomatically the existence of an infinitely large number, you can build a system in which infinitesimal numbers exist, and in such a system the series 0.9 + 0.09 + ... will converge to a number less than one by an infinitesimal. You can even say which infinitesimal.

But if you'd like to stay in the real number system, where there are no 'infinitely large numbers', just sequences of numbers increasing without bound, you cannot have a smallest positive number. If you think you have such a number, then half of it is always both positive and strictly smaller, which is a contradiction.

The only limit on the smallness is that it can never be zero; it always has to be something.
But what is the 'smallness' exactly? Presumably it's some number-- you appear to be postulating this 'something' as existence without identity here. If a number is larger than zero it has to be larger than zero by some amount. What amount is it?

Let’s do this with numbers:

y = (x+1)/2, where x is .99:  y = .995

Now substitute .995 in for x:

y = (.995+1)/2:  y = .9975

Keep repeating over and over, replacing x each time with the result for y the time before. What is the "final" answer after an infinite number of iterations? .999~

Okay, well let's work with this. Let's define f(x) = (1 + x)/2, as you've done above, and consider the sequence of numbers

f(x)

f(f(x))

f(f(f(x)))

...

You claim that the "final answer" a = f(f(f(...))) is equal to 0.999~, right? Okay, well clearly by the definition of a, we have f(a) = a. Therefore, (1 + a)/2 = a, whence 1 + a = 2a, so a = 1. QED.
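The fixed-point step is worth seeing with numbers: iterating f halves the distance to 1 every time, and the only value f leaves unchanged is 1. A sketch with exact fractions:

```python
from fractions import Fraction

def f(x):
    return (1 + x) / 2

x = Fraction(99, 100)            # start below 1, as in the example above
for i in range(10):
    x = f(x)                     # each step halves the gap to 1
    print(i, 1 - x)              # gap is exactly (1/100) / 2^(i+1)

# The only fixed point: f(a) = a forces (1 + a)/2 = a, i.e. a = 1.
assert f(Fraction(1)) == 1
```

No finite number of iterations reaches 1, but the gap is driven below any positive bound, which is exactly what the limit being 1 means.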

This is why .999~ is the largest number that is less than 1.

Such a number doesn't exist. I gave you the proof of that already. If you can find an incorrect step in that proof, please show me.


Bryan,

You haven't even told me what 0.999~ is yet, let alone that it's a number not equal to one.  In fact, if you assume that 0.999~ < 1, you can get a pretty big contradiction.

Suppose that 0.999~ < 1. Then e = 1 - 0.999~ is a positive number. Hence, there exists an integer N such that 1/10^N < e. Now it should be clear that 0.99...9 (N 9's) < 0.999~. But this implies that e = 1 - 0.999~ < 1 - 0.99...9 (N 9's) = 1/10^N < e. Once again, we get that e is not itself, which is a contradiction.

Not if N is infinity.

If you insist...

The way the real numbers are constructed from the rational are as equivalence classes of Cauchy sequences of rational numbers, equipped with termwise addition and multiplication.  It's a result in Analysis to show that this structure is an algebraic field and has the familiar order properties of the real line, as well as being complete.

....

This is the same definition I gave, using more complex terminology.

You're right, to this extent: if you accept axiomatically the existence of an infinitely large number, you can build a system in which infinitesimal numbers exist, and in such a system the series 0.9 + 0.09 + ... will converge to a number less than one by an infinitesimal. You can even say which infinitesimal.

This is why .999~ < 1. If you put a cap on the largest number, then whatever the sum .9 + .09 + .009 + .0009 + ... comes to after the "largest number" of iterations, the final result will be less than 1.

But what is the 'smallness' exactly?  Presumably it's some number-- you appear to be postulating this 'something' as existence without identity here.  If a number is larger than zero it has to be larger than zero by some amount.  What amount is it?

The "something" is whatever units you are working with. The actual unit you use is arbitrary.

Okay, well let's work with this.  Let's define f(x) = (1 + x)/2, as you've done above, and consider the sequence of numbers

f(x)

f(f(x))

f(f(f(x)))

...

You claim that the "final answer" a = f(f(f(...))) is equal to 0.999~, right?  Okay, well clearly by the definition of a, we have f(a) = a.  Therefore, (1 + a)/2 = a, whence 1 + a = 2a, so a = 1. QED.

No!!!! The final answer is dependent on the cap you put on the number of iterations. No matter what the cap is, the final answer is less than 1!

Such a number doesn't exist. I gave you the proof of that already. If you can find an incorrect step in that proof, please show me.

The incorrect step is that there is not an infinite number of steps. I understand that it can be demonstrated that there is always a number between any two numbers; in fact, there are an infinite number of numbers between any two numbers. This is beginning to fall into the dead-horse-beating category.

Nothing you have demonstrated shows that .999~ is equal to 1. Everything I have demonstrated shows that it is less than 1.


Bryan,

Not if N is infinity.
If you want to admit a literal infinity into the natural numbers, that's your business. But you'd better be ready to supply an infinite collection of objects on demand to prove that your concept belongs in the same genus as the natural numbers 1, 2, 3, ...

This the same definition I gave using more complex terminology.

No, it isn't. It makes no reference to "infinitely small amounts" nor "least positive numbers." But since you agree to this definition, this discussion is over: 0.999~ = 1. I refer you to any Real Analysis text for the proof, if you're interested.

This is why .999~ < 1. If you put a cap on the largest number, then whatever the sum .9 + .09 + .009 + .0009 + ... comes to after the "largest number" of iterations, the final result will be less than 1.

This is an arbitrary assertion. How do you know that the limit of a sequence of terms which are less than one is less than one? Do you have a proof of this?

You have to define 0.9 + 0.09 + 0.009 + ... to be the limit of the sequence (0.9, 0.99, 0.999, ...). There is no other consistent way to describe what you're talking about.

The "something" is whatever units you are working with. The actual unit you use is arbitrary.
You misunderstand what I'm asking for. I accept the unit we are working with is one. You say that an "infinitely small amount" is a number that is really small, but still not zero. I asked you how small that was. You evaded the question. I ask again: how small is 'infinitely' small? And you'd better not answer "smaller than any positive number", since no such quantity exists.

No!!!! The final answer is dependent on the cap you put on the number of iterations.  No matter what the cap is, the final answer is less than 1!

Every iteration you get is less than one, that's right. But none of those is the final answer, since you can always stick that back into f again. Hence, your observation that all of the finite iterations are less than one is irrelevant. You must describe this 'final' answer in terms of a limit, and the limit here is also 1.

The incorrect step is assuming that there is not an infinite number of steps.  I understand that it can be demonstrated that there is always a number between any two numbers; in fact, there are an infinite number of numbers between any two numbers.  This is beginning to fall into the dead-horse-beating category.
To which proof does this refer? I was referring to my proof that there is no largest number less than one. There are no "infinite number of steps" in that proof. In fact, none of the proofs that I've presented so far contain an infinite number of steps. I'm beginning to think that you think that the 'x' in that proof isn't a constant, but a variable.

Nothing you have demonstrated shows that .999~ is equal to 1.  Everything I have demonstrated shows that it is less than 1.

You have demonstrated nothing but the fact that you don't understand what 0.999~ means.

In any case, we are beating a dead horse here. If you do accept the definition of real numbers that I proposed, the ball game is over, and 0.999~ = 1. If you want to use infinitesimals and whatever god-awful number system you can dream up to justify your assertion that 0.999~ and 1 are not identical, be my guest; I'll have nothing to do with it.

Link to comment
Share on other sites

This is an absurd thread. Why does 1+1=2? The Law of Identity says it does. A thing is what it is. A is A. .333333333333333333333333333333~ = 1/3 and .99999999999999999999999999999999999999999999999999999999999999~ = 1 for the exact same reason. They are two different ways to write the same number.

For f(x) = (x+1)/2, as x gets arbitrarily close to 1 the limit = 1. The key word is arbitrarily, which is usually taken to mean infinitesimally close. But the infinitesimal or the arbitrary doesn't exist in reality. And that's why .999999999999999999999999999999999999999999999999999999999999999~ = 1.  A is A
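Iterating the map f(x) = (x+1)/2 mentioned above makes the behavior concrete: starting from 0, the n-th iterate is exactly 1 - 2^-n, always below 1 but halving its distance to 1 each step. A quick sketch with exact rational arithmetic (the step count 10 is just illustrative):

```python
from fractions import Fraction

def f(x):
    return (x + 1) / 2  # the map from the post: f(x) = (x+1)/2

x = Fraction(0)
for n in range(10):
    x = f(x)

# After n steps, x = 1 - 2**-n: always < 1, but the gap halves each step.
print(x)  # 1023/1024
assert x == 1 - Fraction(1, 2**10)
assert x < 1
```

Every finite iterate falls short of 1; the limit of the sequence of iterates is 1, which is the sense in which the two sides of this thread keep talking past each other.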

Actually it isn't as absurd as all that. The calculus as formulated by Newton and Leibniz made use of the idea of an infinitesimal, so for A = 0.99999...

1 - A = d

where d is an infinitesimal and d does not equal zero.

Modern analysis following Weierstrass eliminates any appeals to infinitesimals.

It's all in how you formulate your system.

Link to comment
Share on other sites

If you want to use infinitesimals and whatever god-awful number system you can dream up to justify your assertion that 0.999~ and 1 are not identical, be my guest; I'll have nothing to do with it.
I'm curious; why? A lot of people, including myself, find infinitesimals to make more intuitive sense than limits - indeed, calculus was originally formulated using infinitesimals and wasn't grounded upon a notion of limits until over a century later. The only reason infinitesimals were abandoned was that it was thought impossible to formalise them in a consistent manner, but since it has now been demonstrated that this can be done, I personally don't understand the need to continue teaching the epsilon-delta system in calculus classes. Several teachers who have experimented with teaching calculus using non-standard analysis have claimed that their students ended up with a better understanding than the classes they taught using traditional methods (see this for instance).

I personally think that the limit approach is highly unnatural, and encourages the 'it gets closer and closer but never actually reaches it, because you can't do things infinite times' feeling that a lot of people get. I know that I struggled with it conceptually for quite some time when first introduced.

Edited by Hal
Link to comment
Share on other sites

Bryan,

If you want to admit a literal infinity into the natural numbers, that's your business.  But you'd better be ready to supply an infinite collection of objects on demand to prove that your concept belongs in the same genus as the natural numbers 1, 2, 3, ...

What is the largest number in the sequence of natural numbers?

No, it isn't.  It makes no reference to "infinitely small amounts" nor "least positive numbers."  But since you agree to this definition, this discussion is over: 0.999~ = 1.  I refer you to any Real Analysis text for the proof, if you're interested.
I was talking about the definition of a real number.

This is an arbitrary assertion. How do you know that the limit of a sequence of terms which are less than one is less than one? Do you have a proof of this?

I'm saying that the sum from i = 1 to i = [insert the largest number you can think of here] of 9/(10^i) is less than one.
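Both sides actually agree on this much: every *finite* partial sum falls short of 1, by exactly 10^-N. What the limit definition adds is that this shortfall can be made smaller than any positive number. A quick check in Python, using exact rational arithmetic (the sample values of N are arbitrary):

```python
from fractions import Fraction

def partial_sum(N):
    """Sum of 9/10**i for i = 1..N, computed exactly."""
    return sum(Fraction(9, 10**i) for i in range(1, N + 1))

for N in (1, 5, 50):
    s = partial_sum(N)
    assert s == 1 - Fraction(1, 10**N)  # always exactly 10**-N short of 1
    assert s < 1
```

So "less than one for every finite cap" and "the limit equals one" are both true; the dispute is over which of the two 0.999~ names.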

You have to define 0.9 + 0.09 + 0.009 + ... to be the limit of the sequence (0.9, 0.99, 0.999, ...).  There is no other consistent way to describe what you're talking about.
This is where we have a disagreement. I'm talking about a literal sum; you are talking about a limit.

You misunderstand what I'm asking for.  I accept the unit we are working with is one.  You say that an "infinitely small amount" is a number that is really small, but still not zero.  I asked you how small that was.  You evaded the question.  I ask again: how small is 'infinitely' small?  And you'd better not answer "smaller than any positive number", since no such quantity exists.

1/infinity is infinitely small.

Every iteration you get is less than one, that's right.  But none of those is the final answer, since you can always stick that back into f again.  Hence, your observation that all of the finite iterations are less than one is irrelevant.  You must describe this 'final' answer in terms of a limit, and the limit here is also 1.

The final answer is always an actual number, and that actual number is never 1. You can't actually obtain an actual number by doing an infinite number of iterations; you can only describe the final answer after an infinite number of iterations as 0.999~. When, in the process of calculating these iterations, would the answer ever jump up to 1?

To which proof does this refer?  I was referring to my proof that there is no largest number less than one.  There are no "infinite number of steps" in that proof.  In fact, none of the proofs that I've presented so far contain an infinite number of steps.  I'm beginning to think that you think that the 'x' in that proof isn't a constant, but a variable.
Your proof was valid in showing that there is always a number between two numbers. Using the basis of that proof, making x a variable, and repeating it recursively an infinite number of times, the final x you get is the largest number less than 1. You can't actually get a final x unless you put an artificial cap on infinity.

You have demonstrated nothing but the fact that you don't understand what 0.999~ means.

Tell me one more time what it means, and I'll tell you if I agree.

BTW, what is your mathematical background?

Link to comment
Share on other sites
