Objectivism Online Forum

Nate T.


Posts posted by Nate T.

  1. All right, Mindy, but your development of "quantity" explicitly leaves out the problem of measurement of the continuum, which is the whole point here. It's true that we can only measure physical objects using rational numbers, but 0.999... = 1 is meant in the sense of real numbers, and so is a concept of method.

    Anyway, you seem to be bothered by the fact that whatever quantity is denoted by the symbol "0.999..." cannot be represented as a terminating decimal of the form 0.?????. That's true, but has no bearing on the truth of 0.999... = 1.

    If you don't want to accept the existence of numerical expressions beyond terminating decimals I suppose that's your business, but don't go around claiming that it can't be done. I handle such objects all the time in a way that ties right back to reality, in the way that has already been presented in this thread.

    Anyway, I think we've both made our points, so I'm gonna bow out-- thanks for the discussion.

  2. Non-terminating decimals are not quantities. Quantities are determinate.

    This may be a good question to ask you and everyone else participating: what is a "quantity?" It seems that different sides of this discussion are answering the question in wildly different ways.

    As for real numbers not necessarily being quantities, what do you think about the square root of two? Is it a number, a quantity, or what? Or are the only numbers rational numbers since they're the only ones that can be physically constructed?

  3. Even though it's a treatise on induction, I found it to be among the better surveys of the history of science that I've seen so far. It helped me integrate a lot of grab-bag facts that I learned in science courses but didn't know the significance of until I understood the context (which is, of course, the point!).

    Edit: Oh, I should also add that I thought it made good points about induction, in particular its use of Peikoff's concept of the arbitrary to quell skeptic objections, and also the importance of recognizing that inductive proofs rely on the totality of one's knowledge (and modern philosophers not liking that).

  4. "Would" does not mean "is." I was tempted to stop there, but I'll add that non-terminating decimals are generally understood, by the layman, to be quotients. Few of us have memorized a list of the non-terminating decimals which can be produced by a ratio, and those which cannot. If that is the whole point, then this entire debate is about a trick question.

    Would be, but for what?

    And whether laymen accept it or not, there are no natural numbers p and q for which (p/q)^2 = 2. And don't even think about throwing me overboard for having mentioned it. :ninja:

    Incidentally, there is a criterion for determining whether a non-terminating decimal arises from a quotient. Such a decimal number is rational precisely if it repeats after a certain point. There's even a nice way, given a periodic decimal expansion, to recover the quotient it's equal to. That method also recovers 0.999... = 1.
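The recovery method just mentioned is the usual shift-and-subtract trick: multiply by a power of 10 so the repeating tails line up, then subtract so they cancel. A minimal Python sketch (the helper name `repeating_to_fraction` is my own, not a standard library function):

```python
from fractions import Fraction

def repeating_to_fraction(non_repeating: str, repeating: str) -> Fraction:
    """Convert 0.<non_repeating><repeating, forever> to an exact fraction
    via the shift-and-subtract trick."""
    n = len(non_repeating)
    k = len(repeating)
    # 10**(n+k) * x minus 10**n * x cancels the repeating tail exactly.
    numerator = int(non_repeating + repeating) - (int(non_repeating) if non_repeating else 0)
    denominator = 10 ** (n + k) - 10 ** n
    return Fraction(numerator, denominator)

print(repeating_to_fraction("", "3"))   # 0.333... -> 1/3
print(repeating_to_fraction("1", "6"))  # 0.1666... -> 1/6
print(repeating_to_fraction("", "9"))   # 0.999... -> 1
```

Applied to "0.999...", the numerator is 9 and the denominator is 10 - 1 = 9, so the method returns exactly 1, as claimed.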

  5. It adds up to 1. Just like .999 + .000999... adds up to .999...

    The mathematical incommensurability of the diameter and circumference of a circle depends on the fact that the non-terminating fraction relating those two measurements is non-terminating.

    Help me with a confusion, if you will. We have a sequence, (.9, .99, .999, ... ) and the "biggest value" of that sequence, the "limiting value" is .999... . Here's my confusion. Is this limiting value a member of that sequence? If not, why not? If it is, then it is not itself the limit of the sequence. If it is not, isn't it the limit of the sequence?

    I seem to be reading that there is a sequence, it has a limit, and that limit has a limit. A sequence with a limiting value and a limit, ok, because they are both defined by the sequence. I don't know how a limit can be said to have a limit...

    -- Mindy

    The reason I brought up binary notation (base 2) was that in that notation, the fact 1 = 1/2 + 1/4 + 1/8 + ... is written as 1 = 0.111..., by definition. So it'd be odd if you accepted a sum of 1 in one notation but not in another. I'm not sure why you bring up Archimedes' constant.

    Anyway, as to your questions, there is no largest term of the sequence. The notation "0.999..." is just how everyone is (or at least how I am) writing "limit of the sequence (0.9, 0.99, 0.999, ...)" in shorthand. This limit has been shown to be the value 1, hence "0.999... = 1". If "0.999..." doesn't mean this, I don't know what else it would mean.
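To make the shorthand concrete, here is a small Python check (using exact rational arithmetic, so no floating-point rounding is involved) that the decimal partial sums 0.9, 0.99, ... and their binary counterparts 0.1, 0.11, ... both close in on 1, with the gap after n digits being exactly 1/base**n:

```python
from fractions import Fraction

def partial_sum(base: int, n: int) -> Fraction:
    """Sum (base-1)/base + (base-1)/base**2 + ... + (base-1)/base**n,
    i.e. 0.99...9 (n nines) in base 10, or 0.11...1 (n ones) in base 2."""
    return sum(Fraction(base - 1, base ** k) for k in range(1, n + 1))

for n in (1, 3, 5):
    print(partial_sum(10, n), partial_sum(2, n))

# The gap to 1 is exactly 1/base**n in both notations, so both
# sequences have the same limit: 1.
print(1 - partial_sum(10, 4), 1 - partial_sum(2, 4))
```

Since the gap 1/base**n can be made smaller than any given tolerance, the limit of either sequence is 1, which is all "0.999... = 1" asserts.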

  6. Not all infinite series sum to a finite number. That was my point. By creating the series in a simple way, I could show easily that such a sequence could add up as required. That it was created by division was merely a matter of convenience. If we took .999... and divided it infinitely by 10, then added it all back up, we'd get .999... That doesn't get us anywhere in regards to .999... = 1.

    -- Mindy

    So you were just arguing that the infinite series you alluded to with the halving process converges to a finite sum, not that that sum ought to be equal to 1. Out of curiosity, what is your opinion as to the sum of 1/2 + 1/4 + 1/8 + ...?

  7. Two responses to this: One, I have no problem with a convention that substitutes a sequence's limit with its "numerical value," since it is merely a matter of convenience. I just note that that limit was not calculated to be equal, it was calculated to be the limit. The mathematical part leaves them unequal, the conventional part sets them equal.

    My other point is the thought that if you resort to treating the problem of multiplying .999... by 2 in such an elaborate way as equivalence classes of Cauchy sequences, aren't you implying that .999... is not, in the direct mathematical sense, equal to 1?

    That's right-- strictly speaking I should have said the result of multiplying (2, 2, ...) and (0.9, 0.99, ...) is *equivalent to* (2, 2, ...) again. This is why a real number is an *equivalence class* of such sequences, not just one of them. I can see how that part would make people awfully jumpy, since there are necessarily infinitely many such sequences for any real number. So while you're certainly right that the two *representatives* of the real number, commonly written (1, 1, ...) and (0.9, 0.99, ...), are distinct sequences, they represent the same number. Regarding equivalent sequences as the same real number avoids introducing quantities smaller than any positive rational number, which would be useless, since as we all know here A is A and a unit for measuring length has a definite length and hence a definite error.

    I must be missing something--how are they the exact same problem?

    I was referring to this line:

    "There are infinite series that do, in fact, add to a finite number. Cut a quantity in half, repeatedly into infinity, then add it all back up, and the infinite series equals the original quantity, of course. But non-terminating decimals are not in this category."

    Why not instead say:

    "There are infinite series that do, in fact, add to a finite number. Cut a quantity in ten pieces, take one of them, cut it into ten pieces, take one of *those*, repeatedly into infinity, then add it all back up, and the infinite series equals the original quantity, of course. But non-terminating decimals are not in this category."

    Because that's exactly what 0.999... is getting at. I don't see why you're allowing yourself to halve something indefinitely, but not to cut it into tenths.

  8. See Aleph_0's post #10. I meant "arbitrarily small/close."

    Oh good-- otherwise I'd have to make fun of you for having the last name Newton and arguing for infinitesimal quantities. Ghosts of departed quantities, I say!

    Right, it doesn't have a right-most digit, so how do you multiply it? When you multiplied .999... by 2, what did you get?

    I regard real numbers as (equivalence classes of) Cauchy sequences. To show that you get 2 as the answer, take the representative (2, 2, 2, ...) for 2 and (0.9, 0.99, 0.999, ...) for .999... . Multiplication is termwise, so the product is the sequence (1.8, 1.98, 1.998, ...), the terms of which (as you note) become as close to 2 as you like, provided you're willing to go far enough out. So each individual term *does* have an 8 at the end, if you like, but the limit is still 2.
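As an illustration of the termwise arithmetic just described, here is a short Python sketch (the generator names are my own) that multiplies representatives of 2 and 0.999... term by term and watches the gap to 2 shrink:

```python
from fractions import Fraction
from itertools import islice

def twos():
    """Representative Cauchy sequence for 2: (2, 2, 2, ...)."""
    while True:
        yield Fraction(2)

def nines():
    """Representative Cauchy sequence for 0.999...: (0.9, 0.99, 0.999, ...)."""
    n = 1
    while True:
        yield 1 - Fraction(1, 10 ** n)
        n += 1

# Termwise product: 1.8, 1.98, 1.998, ... Every term "ends in 8",
# yet the gap to 2 is exactly 2/10**n, which drops below any tolerance.
product = (a * b for a, b in zip(twos(), nines()))
for term in islice(product, 4):
    print(float(term), "gap to 2:", float(2 - term))
```

The point is that equality of real numbers is judged by the limit of the gap, not by the digits of any single term, so the product sequence is equivalent to (2, 2, 2, ...).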


    Do you mean infinitely dividing a number by 10? Because that is not the same as dividing once and getting a non-terminating quotient.

    What it sounded like you were talking about was a "Zeno's paradox" process of allowing 1 = 1/2 + 1/4 + 1/8 + ... . If so, this is the exact same problem as .999... = 1-- in fact, in binary this fact is rendered as 1 = 0.111... . If that's not what you meant, sorry for misunderstanding.

  9. Not to rehash what had been done in the other thread, but Mindy,

    What do "infinitely close" and "infinitesimally small" mean? Also, how can any reasonable concept associated with something labeled 0.999... have a right-most digit? While we're at it, why doesn't your argument about the "acceptable" infinite series formed by repeatedly halving things "to infinity" and re-summing work for 0.999...? After all, you're just repeatedly taking a tenth of a unit into infinity and adding it back together.

    I can perform the operations you say can't be done to 0.999... quite well, so I don't see why you say it isn't a number.

  10. Hopefully everyone's convinced that the real number formalism is okay at this point. A reason to accept the usual construction of the real numbers is that it formalizes solving problems by use of sequences and iterations, that is, it's the simplest number system containing the rationals that's complete.

    To see how the need for such a definition might arise in practice, consider the really easy linear equation

    x = 0.1x + 0.9

    This clearly has solution x = 1. You can think of this problem as asking "What number, when you divide by 10 and add 0.9, gives itself back?" Now other more complicated problems for which there are no explicit solutions often make use of an iterative procedure: plug a guess into the right hand side, use the result and plug it in again, etc, and hope that the answer gets closer and closer to something. You can actually approximate square roots this way pretty effectively, for example.
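For a concrete instance of such an iterative procedure, here is a sketch of the classic Babylonian (Heron's) square-root iteration in Python (the function name is mine); it repeatedly plugs the current guess back into x = (x + a/x)/2, whose fixed point is the square root of a:

```python
from fractions import Fraction

def heron_step(a, x):
    """One step of the fixed-point iteration x = (x + a/x) / 2 for sqrt(a)."""
    return (x + Fraction(a) / x) / 2

x = Fraction(1)          # initial guess for sqrt(2)
for _ in range(5):
    x = heron_step(2, x)

print(float(x))          # approximately 1.41421356...
print(float(x * x - 2))  # the residual shrinks very rapidly
```

Five steps from the guess 1 already pin down the square root of 2 to well beyond double precision, which is why this style of iteration is so effective in practice.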

    If you try this for the above toy problem with the initial guess x = 0, you get the "defining sequence" of 0.999..., namely if f(x) = 0.1x + 0.9, then

    f(0) = 0.9

    f(f(0)) = 0.99

    f(f(f(0))) = 0.999


    You can actually prove from the equation that this sequence is "Cauchy", and so we can talk about the "limit" of this sequence, which really ought to "be" the solution to the original equation (that is, an x for which x = f(x)), in the sense that going far out enough into the sequence gives you as good of an approximation as you want to the solution (that's what "Cauchy sequence" means). But the limit is also what is typically referred to as 0.999..., so we should regard them as being the same thing in the sense of real numbers. Now if you started with another guess initially you'd get another sequence that still converges to 1, which is why they define real numbers as equivalence classes of Cauchy sequences.
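The iteration described above is easy to replay in Python with exact rationals (the names `f` and `iterate` are my own shorthand):

```python
from fractions import Fraction

def f(x):
    """The map from the equation x = 0.1x + 0.9."""
    return Fraction(1, 10) * x + Fraction(9, 10)

def iterate(x0, steps):
    """Return the sequence x0, f(x0), f(f(x0)), ... of the given length."""
    x = Fraction(x0)
    seq = [x]
    for _ in range(steps):
        x = f(x)
        seq.append(x)
    return seq

print([float(v) for v in iterate(0, 4)])  # 0.0, 0.9, 0.99, 0.999, 0.9999
print([float(v) for v in iterate(5, 4)])  # a different guess also heads to 1
```

Starting from 0 produces exactly the "defining sequence" of 0.999..., while starting from 5 produces a different but equivalent sequence converging to the same fixed point, x = 1, which is just what the equivalence-class definition captures.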

  11. I've heard that even in the best-case scenario, in which Brown is elected, the vote will be close, and so the Democrats will contest it in order to delay seating Brown in time to block any health care votes. Nonetheless it's been remarkable to see how people from other states have given him so much money for being the "anti health care bill" candidate, and if he is elected (or even narrowly defeated) it should make Democratic party leaders even more nervous about their prospects in November.

  12. Well, it looked like we were going to have Alex Epstein in a month or so, but unfortunately that fell through. At this point, I don't have any idea when the next one will be. Dr. Brook was talking about possibly having a debate. Any ideas about who a good opponent would be?

    The way the comments section in the article is going, we could probably just ask that French guy. <_< Seriously, I wouldn't know-- but I would be very interested in attending something like that, and it seems like Dr. Brook would enjoy it, given how well he was handling the Q&A.

  13. I always find it interesting how people object to the very notion of knowing something objectively. It's as though it's a personal affront to them to state something as fact, and as though, unless everything is couched in terms of personal opinion, you're using force in some strange way.

    It took everything I had not to come back at "It's not an objective fact, it's just an opinion!" with "Well, that's just your opinion."

  14. I would be curious as to why as well. Would a Communist nation not be as likely to create a virus that turns us all into zombies?

    Indeed. And even if they didn't do it intentionally, seems to me you'd get the best, quickest solution to the problem under a system where freedom is maximized, not where everyone is shackled by the state.

  15. So you are saying that we humans are evil genius enough to create this genetic nightmare but we could not hope to be smart enough to fix the problem?

    Not really, no-- it stands to reason that we would be able to figure out a way to reverse such damage if we have the technical ability to cause it in the first place. I was just curious about why the OP thinks that such a thing would only be a problem under laissez-faire as opposed to some other system.

  16. I'm not suggesting that the government control the economy and scientific research. I just want to know how this would be addressed in an LFC society, because it is a potentially very large problem.

    How could such a problem be addressed in any society? If it's really an unavoidable problem that no one can do anything about until after the fact, no one can act to stop it, by hypothesis.

  17. If it's just a matter of unavoidable accidents due to limitations of our knowledge, I don't see why having government control of the economy and scientific research would do anything to alleviate the problem. The government wouldn't have any extra knowledge pertinent to the situation, so they would probably just ban any research or technology they consider might be dangerous.

    Of course in the best case scenario they would decide what constitutes "dangerous research" based on choosing scientists (by whim, seniority, or common political ideology) to advise them, by listening to the screeching of the loudest activists on the front steps of congress, or because they feel it would make them look better in the eyes of their constituents who believe in an imaginary deity. It hardly inspires confidence.

  18. I think Lockhart correctly identifies the problem in today's math curriculum, which is that students are taught to shuffle symbols around by rote so they can pass the test without learning what any of it really means. However, the solution is to motivate the subject material, not to throw out systematic instruction and have the students reinvent the wheel as they go along. In particular, this requires math teachers that themselves actually know what it means and why the methods taught are efficient, not just teachers with education degrees reading out of the textbook as we have now.

    Also, his argument seems to rest on an intrinsic/subjective dichotomy. He discards the intrinsic approach of "apply these formulas in the gray boxes to these story problems because we said so" only to replace it with the subjective "just explore the ideas at whim, and if we never get to long division it couldn't have been that useful or interesting anyway." His whole "ladder myth" is a rejection of hierarchical knowledge and further suggests subjectivism is involved in his reasoning.
