Objectivism Online Forum



About isaac

  • Rank
    Junior Member
  • Birthday 07/01/79


Profile Information

  • Location
    San Diego, CA

Previous Fields

  • Country
    United States
  • Real Name
    Isaac Z. Schlueter
  • Copyright
    Must Attribute
  • Occupation
    Computer Stuff
  1. Dondigitalia, you make a lot of good points. Personally, I believe that the Copenhagen interpretation is a good example of how bad science can happen when good scientists fall victim to bad philosophy. The Heisenberg uncertainty principle only states what we can know - it does not state what is. (Heisenberg himself, in his analysis of the principle, blurs the distinction badly.) When dealing with questions of certainty and knowledge, it makes perfect sense to say that the uncertainty in the location times the uncertainty in the momentum cannot fall below some fixed value. And, when dealing with questions of knowledge and certainty, this kind of discovery is useful and valid. However, if an electron is capable of having the properties of "location" and "momentum", then it must possess both in some fixed amount at a given time. The fact that I don't know whether the cat is alive or dead until I look doesn't mean that the cat is "half-dead" until we open the box - it's either dead or it's not. The law of identity does not play dice. But that's getting off-topic...

    Regarding shapeshifters, I think there were a few good points made.

    1. "Shapeshifters are a violation of the law of identity."

    False (or at least, not necessarily true). While we know of no way that a being could alter its form at will to any great degree (like Odo in Star Trek), we can say with absolute certainty that, if such a being were to exist, it would be able to alter its form in some manner, within some limits, according to its identity as a shapeshifter. In order to suspend the audience's disbelief, a writer must portray fantastical things in such a way that they are consistent and do not violate this principle. Odo has limits. His powers are great, but they're not infinite. He changes shape according to a method, and according to his identity. A similar principle applies to all fiction, whether we're talking about wizards or vampires or jedi or talking lizards or static-electricity-powered engines. The exact powers may actually be impossible or may require a leap of imagination on the part of the audience, but such a leap is impossible unless it is portrayed in a consistent and at least semi-plausible manner. If a writer doesn't follow this rule, he ends up creating the "flame-apple-cup" thing that was discussed earlier. I can play along with a wizard who casts spells - but if the nature of his spellcasting changes from one page to the next, or if his powers have no limits, then I'm lost. That's one reason why the rules in a table-top RPG are what make it fun, even though the imagination and role-playing are what you focus on in the game. Without the rules, there are no "roles" to play. If a player has his every wish granted, it quickly becomes boring, and the sense of realism and immersion is lost. A shapeshifter also dramatizes the question of whether a thing really is what it appears to be - a question that lends itself to a wide array of compelling plot twists, and becomes a metaphor for deception in general. When I receive an email from "[email protected]" asking for my password, is it actually eBay, or just a scam? This is a fundamental human experience, and so it should not be surprising or unusual to see it portrayed in a dressed-up form in fiction.

    2. "Art needs to portray 'real' things."

    This was already covered to some extent, but I'd like to point out that this is just not true. Metaphysical value judgements can be expressed in a wide variety of media, some of which may be very off the wall. Furthermore, one of the purposes of fiction is to take us away from our "normal" life and express ideas through metaphor. When you set a story in a fantasy setting, it sometimes allows the archetypal aspects of man and society to be more clearly exposed, since nearly all of the nonessentials are different. (For example, consider the many relationships in LOTR or Orson Scott Card's Ender saga. Buggers might not exist, but "other-ness" is certainly a fundamental human experience. Elves may be make-believe, but heroism, loyalty, and grace are real qualities. Evil wizards and orcs are fantasy, but tyrants and their armies are real.)

    Does the concept of Middle-Earth exist, or portray a real thing? Of course the concept exists - the fact that "Middle-Earth" refers to a fictional, non-existent thing doesn't change the fact that it does refer to something (namely, a fictional setting). Middle-Earth has enough in common with our world that we can conceive of it easily. Trees there are roughly the same as trees here, and so on. I believe there was a discussion, in either the afterword of ItOE or somewhere in OPAR, of whether or not "unicorn" is an invalid concept. Well, if you see a unicorn in a movie, do you know what to call it? There's your answer. Invalid concepts are ideas that cannot be described or reasonably talked about - concepts that ask us to suspend the axiom of identity. "God" is an invalid concept, because it is, by definition, something that is undefinable (i.e., something without any limits).

    "...no real Middle Earth exists to serve as a standard of truth."

    Sure it does. The "real" Middle-Earth is defined by the writings of J.R.R. Tolkien. If I wrote a story about a lake of fire in Chicago in the year 2045, where "elves" have 20 legs and breathe purple smoke, and said, "Oh, this story is set in Middle Earth," then that would be a false statement. To unpack it, I'd be saying, "The setting of this fictional story is the same fictional setting that J.R.R. Tolkien described in the Lord of the Rings trilogy, The Hobbit, and The Silmarillion." Clearly, that would be an incorrect statement, since my fictional setting bears little or no resemblance to his.

    Actually, I disagree somewhat with the idea that a proper noun cannot be a concept, or that it does not unite two or more existents. The key is thinking of temporal/spatial location as a property of an object. For example, "Isaac Z. Schlueter" refers to a particular person, and is a proper noun. However, the concept omits the measurement of time, as well as (most of) the changes that occur over time, and (usually) refers to the parts of this person that are continuous through time. Consider these statements: "Yesterday, Isaac ate at McDonalds." "Tomorrow, Isaac is going to the park." Between yesterday and tomorrow, I changed, if ever so slightly. The measurements of those changes are omitted when using the term "Isaac". As the changes become more significant, it sometimes (but not always) becomes necessary to sub-categorize the concept further. For example: "When I was 5, I went to the beach every day. Today, I can't stand getting wet." There are two related sub-concepts here: "Isaac at 5 years old" and "Isaac as an adult." The same principle can be applied to any proper noun. When we say "Middle Earth," we are omitting the measurements of any differences in the places, people, and events described in each of the different books. We are integrating the settings from all of the scenes that J.R.R. Tolkien wrote in his fiction. A similar principle can be applied to the "identical" tires. Even if two tires are exactly the same, down to the last quark (which is actually quite impossible, but whatever), they're still not in the exact same place at the same time. Temporal/spatial location is a measurement, just as much as length or width.

    3. The Devil's Pitchfork

    It's an optical illusion. The term "devil's pitchfork" is a valid concept, but it refers to a drawing, not a square/round peg-thing. The drawing is not a valid representation of a "real" thing, but so what? That's the point of it. It plays on our interpretation of 2-dimensional line patterns as 3-dimensional objects, and purposely creates an invalid conflict to be visually jarring. The effect is kinda cool, but nothing to get all philosophically worked up about, IMO.
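For reference, the uncertainty relation discussed in point 1 above has a standard symbolic form (this is the textbook statement, using σ for the standard deviations of position and momentum):

```latex
% Heisenberg uncertainty principle: the product of the standard
% deviations of position (x) and momentum (p) has a fixed lower bound.
\sigma_x \, \sigma_p \ge \frac{\hbar}{2}
% Equivalently, in Delta-notation: \Delta x \, \Delta p \ge \hbar / 2.
% The bound constrains simultaneous measurement precision; the post
% argues this is an epistemological statement, not a claim that the
% particle lacks a definite position and momentum.
```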
  2. What is the nature of deja vu?

    Chances are, you didn't actually see, dream, or experience the experience that you're going through during deja vu. If you really think about it, deja vu feels completely different from actually half-recognizing something that triggers a memory. Think of it this way. A coworker comes up to you as you walk into work, when you were up too late the night before. You haven't finished your morning coffee or checked your email yet. Your stomach hasn't finished turning your breakfast into blood sugar. You're just getting rolling, and your brain is in a particularly shaky state.

    Situation 1: You feel an eerie sense of having heard it before, but you can't think of where. You know what he's saying, and what he's talking about, but you have a certain sense that the tone of his voice and the specific words he's choosing are reminiscent of something else. It feels like recalling a dream. You start to wonder if you dream the future.

    Situation 2: He's talking about a project that the two of you are working on. You'll have to get to your desk and check your notes before you can give him an answer, and you tell him this.

    I don't think anyone would have a problem picking out which one is deja vu and which one is genuine half-recollection. It's the same issue that I take with Descartes' "Am I dreaming now?" line of reasoning. It's immediately apparent that deja vu is not genuine recollection, but it's similar enough that we might be intrigued by it. (I typically find myself trying to remember the "rest of the dream," and then realizing that there probably was no dream.) You might not be getting enough sleep. Or you might have temporal lobe epilepsy.
  3. ARI vs. TOC

    For clarity, it should be noted that "objectivism" was not a proper noun prior to Ayn Rand. It meant, in short, "not subjectivism," and still does in many contexts. After Ayn Rand capitalized the word and made it popular, it came to mean "Ayn Rand's philosophy" - including, of course, egoism, capitalism, atheism, her view of concept formation, etc. - none of which is directly implied by stating that one is not a subjectivist. The meaning of a word is all about context. Kelley means atheist, capitalist, etc., when he calls himself an Objectivist. That's not right. He ought to call himself a Randian, because that's what he is (in the sense that Rand was an Aristotelian - she borrowed much of her philosophy from Aristotle, but didn't agree 100% on all points). If he only meant by the o-word that he is not a subjectivist, then it would be appropriate. Keep in mind, of course, that many senses of the word "objectivism" are utterly incompatible with Objectivism. For myself, I shy away from the word most of the time. More trouble than it's worth, IMOO. (In my objectiv(e/ist) opinion, that is.)

    For the record, while I may disagree with the ARI on some things, I think that TOC is worse than useless. The analysis on their site is shallow, their explanations are weak, and it just stinks of a trite "you can be an Objectivist, too! oh, you're religious? well, don't worry, we're tolerant" kind of attitude. I find the same problems with the Atlasphere.
  4. Gmail

    Well, logging into Blogger and changing my profile doesn't do it. Maybe I'd have to post something. Or maybe I just missed the boat and will have to wait for the 0.1 release. Meh.

    GC: As you may have heard, MT is not going to be free any longer. However, b2evolution has pretty much all the same doodads as MT, and then some; it's very much free, and it has a much simpler setup than MT. (All PHP/MySQL - no Perl.) It's a cousin of WordPress (both are descended from the original b2/CafeLog), but it's got much more going for it in the way of features. In fact, if you wanted to set up a metablog, you could even use it to host multiple different blogs all on the same site.

    EDIT: Oh, yeah, forgot to mention... I've been using b2evolution for some time now, and am a member of the dev team. I'd be happy to help set it up if you wanted to go that route and ran into any snags.
  5. Gmail

    So, let's say that someone has a blogger account. How do you sign up for gmail? Is it a "don't call us, we'll call you" sorta thing? (Since switching to b2evolution, I haven't used the blogger much. She is not so rich in features, after all, and php/MySQL is more fun than recreating the page each time.)
  6. Can a new language lead to better thinking?

    Speaking of English, I heard once, years ago, that "As American as apple pie" is actually a phonetic anglicization of a French phrase that was popular before the Revolution. Something about this thread got me thinking about that, and now I can't seem to dig up the reference or remember what the French phrase was. Any ideas?
  7. Can a new language lead to better thinking?

    Capforever: Yes, that is amusing. I have been in love with English for as far back as I can remember. In high school, when I studied Latin and learned more about the history of what happened on The Islands to make the language what it is, it became even more appealing. English truly is a "bastard tongue" in many ways - it doesn't really fit into a category nicely. It was once almost purely Germanic, but two infusions of Latin made it closer in many ways to a Romance language than a Germanic one. (The first was in the first century AD, when Rome conquered about two-thirds of the island of Britain; the second was about a millennium later, when the Normans conquered the Saxons - and of course all the trade and war with France throughout much of the Middle Ages led to a proliferation of Anglicized French terms.) Then, even more recently, English came to America, where it collided with the native tongues, and where wave after wave of immigrants came to this nation, bringing their own influence. And, of course, since America is an economic mega-giant, in many situations people must either learn English or go bankrupt! English is also useful in corporate settings where none of the participants speak a common native language. (What language should be used when the German, Italian, Dutch, and French VPs of a multinational company all meet? Quite often, it's English.) I was going to cite a few books, but y'all beat me to it. So here's a website that I find interesting: http://www.yourdictionary.com/library/index.html
  8. Re: Of/about... Knowing of something doesn't necessarily imply that you know much of anything about it, except that it exists. I.e., if someone says, "I have to go to the store to pick up a pack of gahoozits," then all you really know about gahoozits is: (1) they are things that exist; (2) they can be acquired at stores; (3) they come in packs. That isn't much. Learning that cars exist doesn't make you an automotive engineer, or even tell you what they are. It does tell you that they have some properties - i.e., that there is an "about" to learn - but it does not tell you what those properties are. Do you really disagree with me on this? Come on. I mean, the distinction between existence and identity is practically the introduction to Galt's speech.

    That's an ad hominem fallacy, and you know it. If you have a personal problem with me, you can PM or email me, and I'll be happy to ignore you. Please don't clutter up the board with this tripe.
  9. The ridiculous title is a joke, sort of. The essay uses several reductions to absurdity to show that most of the discussion of this topic is misguided.
  10. You and Sloman have similar opinions of the debates by academics surrounding Nagel's article. He simply provides much more analysis of the situation, and doesn't set up a strawman to criticize Nagel. There's a big difference between knowing OF something and knowing ABOUT it. I know OF performing open-heart surgery. But I know quite a bit less ABOUT it than a doctor. We can speak of things that we know OF without knowing much of anything ABOUT them.
  11. Bob, check out Judith Jarvis Thomson's violinist example. Even if unborn humans were composing sonatas, abortion would not be unjust killing in the vast majority of cases. And you're right, this belongs in the abortion section. Though I am by no means an expert in the field of prenatal neurology, I've often heard it said that a fetus starts having some sort of brain activity around 28 weeks after conception, and it is at that point that we can say that it has some sort of human consciousness. (It's likely closer to a lower animal than to an adult human at that early stage of development, but it's something, and it is human.) Since "rational capacity" is just that - a capacity - it is the potential that matters. Before objecting, think of it this way: even willfully stupid people have rights, because, if they choose to, they can think. (It may require years of training, but they have the tools, and that's what counts.)

    Marotta, dolphins are damn smart - as animals go. Like chimps, and rats, and humans, they are pack predators, and they have developed intricate and powerful means of communicating. They work together, and they're ruthless. (Like every other mammalian pack predator.) Their survival depends upon being able to solve a wide variety of problems in a changing environment. They're magnificent creatures. But they don't rival humans. Not by a long shot. We can do calculus. We have countless different, distinct languages. We use tools to make tools to make tools that are only used for making millions of hammers at a time. We plan for our retirement when we're 25 (if we're smart!). As smart as dolphins are, they aren't anywhere near humans. Animals don't have rights. But I'd be suspicious of, and repulsed by, someone who beats a dog for fun. All the really smart animals are vicious as hell. They may make allies - in fact, they probably will - but the smarter they get, the meaner they are to everyone else. Humans provide many prime examples of this tendency. Observe an average 2-year-old, and consider his level of self-restraint and his utter raw determination to have his every whim satisfied, and you'll be glad that he's so small. If humans didn't find a way to get out of that stage before adulthood, we'd have killed ourselves off long ago. That is the evolutionary advantage of civilized society.
  12. Yep, the news has broken, alright. If you could email me the pdf, or a link to it if it's really big, I'd absolutely love it. Technical ones are good. Thank you. isaacschlueter is the beginning part, and then an @ of course, and then hotmail.com (fighting spambots).
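The spelled-out address in item 12 is a simple anti-harvesting trick: the parts are listed in prose so that bots scanning pages for user@host patterns miss them, while a human (or a trivial script) can reassemble the address. A minimal sketch of the reassembly, using exactly the parts given in the post:

```python
# Reassemble an email address that was spelled out in prose to evade
# naive spambots that regex pages for "user@host" patterns.
# The three parts below are exactly the ones given in the post.
parts = ["isaacschlueter", "@", "hotmail.com"]
address = "".join(parts)
print(address)  # isaacschlueter@hotmail.com
```

This defeats only the simplest harvesters; a bot that strips prose and joins tokens would still find the address, which is why the technique works best against bulk scrapers rather than targeted ones.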
  13. Bowser, are you using the term "intentionality" in the philosophical sense? Or do you mean... "Making a genetic fallacy about DNA." That's classic. Is your wit intentional? (Or is it just a series of states with no correlation to anything outside itself? OK, I'll stop with the puns, I promise.) And how!

    Stephen, that's fascinating! I've read about projects that do things that could have conceivably yielded stuff like this, theoretically, blah blah - but someone's actually doing it? Wow! Do you know specifically when it will be published? Are there any web references you could provide us with? Nature should thank you. You just sold me a copy of their magazine.
  14. Bowser and Ash, you should both read Nagel's "What Is It Like to Be a Bat?" and Sloman's "What Is It Like to Be a Rock?" Both articles lend a lot of good analysis to this discussion. Sloman's is one of the best discussions of this problem that I've seen. Also, both of them say a lot of the same things that you guys are saying. It's just that rationalists of all varieties like to seize on the "what is it like to be" question with glee, in ways that are clearly not what Nagel was getting at. That's actually what the Sloman article is all about - rationalists thinking that words like "consciousness" and "phenomenology" can still have meaning after they've been stripped of their definitions and their placement in the hierarchy of concepts.
  15. Sir Karl Popper

    His point was that whether you get your ideas from reading tea leaves or from studying the evidence - whether they come to you in the laboratory or in the bathroom - is not relevant. What is relevant is whether your theory holds water and makes risky predictions. I should have known better. Forget I asked.