Objectivism Online Forum

Everything posted by isaac

  1. Dondigitalia, you make a lot of good points. Personally, I believe that the Copenhagen interpretation is a good example of how bad science can happen when good scientists fall victim to bad philosophy. The Heisenberg uncertainty principle only states what we can know - it does not state what is. (Heisenberg himself, in his analysis of the principle, blurs the distinction badly.) When dealing with questions of certainty and knowledge, it makes perfect sense to say that the "exactness" of the location times the "exactness" of the momentum cannot exceed some fixed value (see the formula at the end of this post). And, when dealing with questions of knowledge and certainty, this kind of discovery is useful and valid. However, if an electron is capable of having the properties of "location" and "momentum," then it must possess both in some definite amount at any given time. The fact that I don't know whether the cat is alive or dead until I look doesn't mean that the cat is "half-dead" until we open the box - it's either dead or it's not. The law of identity does not play dice. But that's getting off-topic...

Regarding shapeshifters, I think that there were a few good points made.

1. "Shapeshifters are a violation of the law of identity."

False (or at least, not necessarily true). While we know of no way that a being could alter its form at will to any great degree (like Odo in Star Trek), we can say with absolute certainty that, if such a being were to exist, it would be able to alter its form in some manner, with some limits, according to its identity as a shapeshifter. In order to suspend the audience's disbelief, a writer must portray fantastical things in such a way that they are consistent and do not violate this principle. Odo has limits. His powers are great, but they're not infinite. He changes shape according to a method, and according to his identity. A similar principle applies to all fiction, whether we're talking about wizards or vampires or jedi or talking lizards or static-electricity-powered engines. The exact powers may actually be impossible or require a leap of imagination on the part of the audience, but such a leap is impossible unless they are portrayed in a consistent and at least semi-plausible manner. If a writer doesn't follow this rule, then he ends up creating the "flame-apple-cup" thing that was discussed earlier. I can play along with a wizard who casts spells - but if the nature of his spellcasting changes from one page to the next, or if his powers do not have some limits, then I'm lost. That's one reason why the rules in a table-top RPG are what make it fun, even though the imagination and role-playing are what you focus on in the game. Without the rules, there are no "roles" to play. If a player has his every wish granted, it quickly becomes boring and the sense of realism and immersion is lost. Is a shapeshifter really what it appears to be? That's a question that lends itself to a wide array of compelling plot twists, and becomes a metaphor for deception in general. When I receive an email from "[email protected]" asking for my password, is it actually eBay, or just a scam? This is a fundamental human experience, and so it should not be surprising or unusual to see it portrayed in a dressed-up form in fiction.

2. "Art needs to portray 'real' things."

This was already covered to some extent, but I'd like to point out that this is just not true. Metaphysical value judgements can be expressed in a wide variety of media, some of which may be very off the wall.
Furthermore, one of the purposes of fiction is to take us away from our "normal" life, and express ideas through metaphor. When you set a story in a fantasy setting, it sometimes allows the archetypal aspects of man and society to be more clearly exposed, since nearly all of the nonessentials are different. (For example, consider the many relationships in LOTR or Orson Scott Card's Ender saga. Buggers might not exist, but "other-ness" is certainly a fundamental human experience. Elves may be make-believe, but heroism, loyalty, and grace are real qualities. Evil wizards and orcs are fantasy, but tyrants and their armies are real.)

Does the concept of "Middle-Earth" exist, or portray a real thing? Of course the concept exists - the fact that "Middle-Earth" refers to a fictional, non-existent thing doesn't change the fact that it does refer to something (namely, a fictional setting). Middle-Earth has enough in common with our world that we can conceive of it easily. Trees there are roughly the same as trees here, and so on. I believe that there was a discussion in either the afterword of ITOE or somewhere in OPAR about whether or not "unicorn" is an invalid concept. Well, if you see a unicorn in a movie, do you know what to call it? There's your answer. Invalid concepts are ideas that cannot be described or reasonably talked about - concepts that ask us to suspend the axiom of identity. "God" is an invalid concept, because it is, by definition, something that is undefinable (i.e., something without any limits).

"...no real Middle Earth exists to serve as a standard of truth."

Sure it does. The "real" Middle-Earth is defined by the writings of J.R.R. Tolkien. If I wrote a story about a lake of fire in Chicago in the year 2045, where "elves" have 20 legs and breathe purple smoke, and said, "Oh, this story is set in Middle Earth," then that would be a false statement. To unpack it, I'd be saying, "The setting of this fictional story is the same fictional setting that J.R.R. Tolkien described in The Lord of the Rings trilogy, The Hobbit, and The Silmarillion." Clearly, that would be an incorrect statement, since my fictional setting bears little or no resemblance to his.

Actually, I disagree somewhat with the idea that a proper noun cannot be a concept, or that it does not unite two or more existents. The key is thinking of temporal/spatial location as a property of an object. For example, "Isaac Z. Schlueter" refers to a particular person, and is a proper noun. However, the concept omits the measurement of time, as well as (most of) the changes that occur over time, and (usually) refers to the parts of this person that are continuous throughout time. Consider these statements: "Yesterday, Isaac ate at McDonalds." "Tomorrow, Isaac is going to the park." Between yesterday and tomorrow, I changed, if ever so slightly. The measurements of those changes are omitted when using the term "Isaac". As the changes become more significant, it sometimes (but not always) becomes necessary to sub-categorize the concept further. For example: "When I was 5, I went to the beach every day. Today, I can't stand getting wet." There are two related sub-concepts here: "Isaac at 5 years old" and "Isaac as an adult." The same principle can be applied to any proper noun. When we say "Middle Earth," we are omitting the measurements of any differences in the places/people/events described in each of the different books. We are integrating the settings from all of the scenes that J.R.R. Tolkien wrote in his fiction.
A similar principle can be applied to the "identical" tires. Even if two tires are exactly the same, down to the last quark (which is actually quite impossible, but whatever), they're still not in the exact same place at the same time. Temporal/spatial location is a measurement, just as much as length or width.

3. "The Devil's Pitchfork"

It's an optical illusion. The term "devil's pitchfork" is a valid concept, but it refers to a drawing, not a square/round peg-thing. The drawing is not a valid representation of a "real" thing, but so what? That's the point of it. It plays on our interpretation of 2-dimensional line patterns as 3-dimensional objects, and purposely creates an invalid conflict to be visually jarring. The effect is kinda cool, but nothing to get all philosophically worked up about, IMO.
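(For reference, the formula mentioned above - this is the standard textbook statement of the uncertainty principle, not something from this thread:

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]

where \(\Delta x\) and \(\Delta p\) are the uncertainties in position and momentum, and \(\hbar\) is the reduced Planck constant. Equivalently, taking "exactness" as the reciprocal of uncertainty, the product of the two exactnesses cannot exceed the fixed value \(2/\hbar\), which is the sense used above.)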
  2. Chances are, you didn't actually see, dream, or experience the experience that you're going through during deja vu. If you really think about it, deja vu feels completely different from actually half-recognizing something that triggers a memory. Think of it this way. A coworker comes up to you as you walk into work, after you were up too late the night before. You haven't finished your morning coffee or checked your email yet. Your stomach hasn't finished turning your breakfast into blood sugar. You're just getting rolling, and your brain is in a particularly shaky state.

Situation 1: You feel an eerie sense of having heard it before, but you can't think of where. You know what he's saying, and what he's talking about, but you have a certain sense that the tone of his voice and the specific words he's choosing are reminiscent of something else. It feels like recalling a dream. You start to wonder if you dream the future.

Situation 2: He's talking about a project that the two of you are working on. You'll have to get to your desk and check your notes before you can give him an answer, and you tell him this.

I don't think that anyone would have a problem picking out which one is deja vu, and which one is genuine half-recollection. It's the same issue that I take with Descartes' "Am I dreaming now?" line of reasoning. It's immediately apparent that deja vu is not genuine recollection, but it's similar enough that we might be intrigued by it. (I typically find myself trying to remember the "rest of the dream", and then realizing that there probably was no dream.) You might not be getting enough sleep. Or you might have temporal lobe epilepsy.
  3. For clarity, it should be noted that "objectivism" was not a proper noun prior to Ayn Rand. It meant, in short, "not subjectivism," and still does in many contexts. After Ayn Rand capitalized the word and made it popular, it came to mean "Ayn Rand's philosophy," including, of course, egoism, capitalism, atheism, her view of concept formation, etc. - none of which is directly implied by stating that one is not a subjectivist. The meaning of a word is all about context. Kelley means atheist, capitalist, etc., when he calls himself an Objectivist. That's not right. He ought to call himself a Randian, because that's what he is (in the sense that Rand was an Aristotelian - she borrowed much of her philosophy from Aristotle, but didn't agree with him 100% on all points). If he only meant by the o-word that he is not a subjectivist, then it would be appropriate. Keep in mind, of course, that many senses of the word "objectivism" are utterly incompatible with Objectivism. For myself, I shy away from the word most of the time. More trouble than it's worth, IMOO. (In my objectiv(e/ist) opinion, that is.) For the record, while I may disagree with the ARI on some things, I think that TOC is worse than useless. The analysis on their site is shallow, their explanations are weak, and it just stinks of a trite "you can be an Objectivist, too! oh, you're religious, well, don't worry, we're tolerant" kind of attitude. I find the same problems with Atlasphere.
  4. Well, logging into Blogger and changing my profile doesn't do it. Maybe I'd have to post something. Or maybe I just missed the boat and will have to wait for the 0.1 release. Meh. GC: As you may have heard, MT is not going to be free any longer. However, b2evolution has pretty much all the same doodads as MT, and then some, and is very much free, and has a much simpler setup than MT. (All PHP/MySQL - no Perl.) It's a cousin of WordPress (both are descended from the original b2/CafeLog), but it's got much more going for it in the way of features. In fact, if you wanted to set up a metablog, you can even use it to host multiple different blogs all in the same site. EDIT: Oh, yeah, forgot to mention... I've been using b2evolution for some time now, and am a member of the dev team. I'd be happy to help set it up if you wanted to go that route and ran into any snags.
  5. So, let's say that someone has a Blogger account. How do you sign up for Gmail? Is it a "don't call us, we'll call you" sorta thing? (Since switching to b2evolution, I haven't used Blogger much. She is not so rich in features, after all, and PHP/MySQL is more fun than recreating the page each time.)
  6. Speaking of English, I heard once, years ago, that "as American as apple pie" is actually a phonetic anglicization of a French phrase that was popular before the Revolution. Something about this thread got me thinking about that, and now I can't seem to dig up the reference, or remember what the French phrase was. Any ideas?
  7. Capforever, Yes, that is amusing. I have been in love with English as far back as I can remember. In high school, when I studied Latin, and learned more about the history of what happened on The Islands to make the language what it is, it became even more appealing. English truly is a "bastard tongue" in many ways - it doesn't really fit into a category nicely. It was once almost purely Germanic, but two infusions of Latin made it closer to a Romance language in many ways than a Germanic one. (The first was in the first century AD, when Rome conquered about two-thirds of the island of Britain; the second was about a millennium later, when the Normans conquered the Saxons - and of course there was all the trade and war with France throughout much of the Middle Ages, all of which led to a proliferation of Anglicized French terms.) Then, of course, even more recently, English came to America, where it collided with the native tongues, and where wave after wave of immigrants came to this nation, bringing their own influence. And, of course, since America is an economic mega-giant, in many situations people must either learn English or go bankrupt! English is also useful in corporate settings where none of the participants speak a common language. (What language should be used when the German, Italian, Dutch, and French VPs of a multi-national company all meet? Quite often, it's English.) I was going to cite a few books, but y'all beat me to it. So here's a website that I find interesting: http://www.yourdictionary.com/library/index.html
  8. Re: Of/about... Knowing of something doesn't necessarily imply that you know much of anything about it, except that it exists. I.e., if someone says, "I have to go to the store to pick up a pack of gahoozits," then all you really know about gahoozits is:

1. They are things that exist.
2. They can be acquired at stores.
3. They come in packs.

That isn't much. Learning that cars exist doesn't make you an automotive engineer, or even tell you what they are. It does tell you that they have some properties - i.e., that there is an "about" to learn - but it does not tell you what those properties are. Do you really disagree with me on this? Come on. I mean, the distinction between existence and identity is practically the introduction to Galt's speech. That's an ad hominem fallacy, and you know it. If you have a personal problem with me, you can PM or email me, and I'll be happy to ignore you. Please don't clutter up the board with this tripe.
  9. The ridiculous title is a joke, sort of. The essay uses several reductions to absurdity to show that most of the discussion of this topic is misguided.
  10. You and Sloman have similar opinions of the debates by academics surrounding Nagel's article. He simply provides much more analysis of the situation, and doesn't set up a straw man to criticize Nagel. There's a big difference between knowing OF something and knowing ABOUT it. I know OF performing open-heart surgery. But I know quite a bit less ABOUT it than a doctor. We can speak of things that we know OF without knowing much of anything ABOUT them.
  11. Bob, Check out Judith Jarvis Thomson's violinist example. Even if unborn humans were composing sonatas, abortion would not be unjust killing in the vast majority of cases. And you're right, this belongs in the abortion section. Though I am by no means an expert in the field of prenatal neurology, I've often heard it said that a fetus starts having some sort of brain activity around 28 weeks after conception, and it is at that point that we can say that it has some sort of human consciousness. (It's likely closer to a lower animal than to an adult human at that early stage of development, but it's something, and it is human.) Since "rational capacity" is just that - a capacity - it is the potential that matters. Before objecting, think of it this way: even willfully stupid people have rights. Because, if they choose to, they can think. (It may require years of training, but they have the tools, and that's what counts.)

Marotta, Dolphins are damn smart - as animals go. Like chimps, and rats, and humans, they are pack predators, and they have developed intricate and powerful means of communicating. They work together, and they're ruthless. (Like every other mammalian pack predator.) Their survival depends upon being able to solve a wide variety of problems in a changing environment. They're magnificent creatures. But they don't rival humans. Not by a long shot. We can do calculus. We have countless different, distinct languages. We use tools to make tools to make tools that are only used for making millions of hammers at a time. We plan for our retirement when we're 25 (if we're smart!). As smart as dolphins are, they aren't anywhere near humans. Animals don't have rights. But I'd be suspicious of and repulsed by someone who beats a dog for fun. All the really smart animals are vicious as hell. They may make allies - in fact, they probably will - but the smarter they get, the meaner they are to everyone else. Humans provide many prime examples of this tendency. Observe an average 2-year-old, and consider his level of self-restraint and his utter raw determination to have his every whim satisfied, and you'll be glad that he's so small. If humans didn't find a way to get out of that stage before adulthood, we'd've killed ourselves off long ago. That is the evolutionary advantage of civilized society.
  12. Yep, the news has broken, alright. If you could email me the PDF, or a link to it if it's really big, I'd absolutely love it. Technical ones are good. Thank you. isaacschlueter is the beginning part, and then an @ of course, and then hotmail.com (Fighting spambots.)
  13. Bowser, are you using the term "intentionality" in the philosophical sense? Or do you mean... Making a genetic fallacy about DNA - that's classic. Is your wit intentional? (Or is it just a series of states with no correlation to anything outside itself? Ok, I'll stop with the puns, I promise.) And how! Stephen, that's fascinating! I've read about projects that do things that could have conceivably yielded stuff like this, theoretically, blah blah, but someone's actually doing it? Wow! Do you know specifically when it will be published? Are there any web references you could provide us with? Nature should thank you. You just sold me a copy of their magazine.
  14. Bowser and Ash, you should both read Nagel's "What is it like to be a bat?" and Sloman's "What is it like to be a rock?" Both articles lend a lot of good analysis to this discussion. Sloman's is one of the best discussions of this problem that I've seen. Also, both of them say a lot of the same things that you guys are saying. It's just that rationalists of all varieties like to seize on the "what is it like to be" question with glee, in ways that are clearly not what Nagel was getting at. That's actually what the Sloman article is all about - rationalists thinking that words like "consciousness" and "phenomenology" can still have meaning after they've been stripped of definitions and of their placement in the hierarchy of concepts.
  15. His point was that, whether you get your ideas from reading tea leaves or studying the evidence - if they come to you in the laboratory or in the bathroom - that's not relevant. What is relevant is that your theory holds water and makes risky predictions. I should have known better. Forget I asked.
  16. I'll dig up my philosophy texts when I get home tonite, and point out some articles. If you read his stuff on abduction, I don't think that anyone would be able to say that he wasn't putting reality first. Falsification is just the last step in a long process, and he makes that pretty clear. According to Popper, if the process ends in falsification, then it's science, and if not, then it's not. Personally, I'm not sure that's true, as fact-finding (while less exciting) is still valid and useful science.
  17. I've read Lakatos and Feyerabend. I agree with your assessment of them. I didn't realize that they were Popper's students. (I've always found the theories much more interesting than the soap operas.) Of course, a good teacher can have bad students, and vice versa. It doesn't justify crediting Popper with unleashing Kantianism into the philosophy of science. Lastly, of course Kuhn's work was heavily influenced by Popper's. I think it'd be hard to find a single philosopher of science who wasn't influenced by Sir Karl in some way. I mean, the guy made a lot of really good points. (I think that his only major failing, if there is one, is that he focused too heavily on the "testing new theories" aspect of science, and sort of ignored the "filling in the blanks" part. The Human Genome Project, for example, made no risky predictions, but it was certainly science. Granted, filling in the blanks is much less sexy.) Isaac http://isaac.beigetower.org
  18. Ah, good point, Stephen. Nevertheless, that seems really weak as an objection to Popper. If I say "A", and that "inspires" other philosophers to say "not A", then you can't really say that I'm responsible for the "not A" attitudes, can you? That'd be like crediting Marx with the rise of Capitalism. Isaac
  19. Can someone please define the word "consciousness" for the purpose of this discussion? It seems to me that consciousness is a sort of information processing. The question is: is it logically possible that the same sort of information processing can be done by something other than a biological organism? Why or why not? First, though, we must define what we're talking about, or we're wasting electrons on this.
  20. I just got this in my spambucket: I'm a little disturbed by this. I've read quite a bit of Popper's writing, and I've never known him to claim "that scientific facts are products of the social interaction among scientists." (That is an idea usually attributed to Kuhn, and even in Kuhn's case, it's really a distortion by those who followed after him. What he said was that paradigm shifts are akin to political upheavals. I've never read where Sir Karl said any such thing.) Claiming that he says that "induction is a myth" is taking a word out of context, I believe. Popper's account of what he called "abduction" is identical to the back-and-forth empirical/logical brand of thinking that Peikoff and Rand label "induction." When he said "induction," he was responding to anti-science, anti-realist rationalists who viewed deduction as the only valid form of reasoning. Looking at it etymologically, "abduction" is a better term.

Deduction: applying the properties of the set to individual cases.
Induction: applying the properties of individual cases to the set.
Abduction: finding which possible properties of the set would make the individual cases seem reasonable.

He made a good case that humans actually do abduction most of the time - deduction is really only for special cases. "Induction" is an ambiguous term in many contexts. He claimed that the "deductivists" were beating up a straw man. He let them have the word, and created a new one to express what he really meant. With the concept of abduction tying theory to reality, Popper points out that scientific theories are specifically not arbitrary.

The point about the "absence of falsification" also seems like a mischaracterization. Popper said that, in order for a theory to be considered "science," it must be, in principle, falsifiable. For example, Marxism started out making risky predictions, and at that time, could have been called science. When the predictions didn't come true, and Marxists invented a new angle in the theory to account for every possible circumstance, it stopped being science and started being astrology. He never attacked positive evidence - he merely stated that positive evidence for something which cannot logically be disproven is not meaningful. For example, take this theory:

According to Marxism/Leninism, the workers will overthrow the bourgeoisie, unless there is an Imperialist force.
The workers did not overthrow the bourgeoisie.
Therefore, there is an Imperialist force.
Therefore, Marxism/Leninism is correct.

Clearly, that doesn't hold water (see the sketch at the end of this post). No matter WHAT happened, the Marxist would say, "AHA! This PROVES it!" Karl Popper, as far as I've seen, was an adamant realist, pro-reason, and one of the best philosophers of science ever.

I'd like to know: Has anyone seen/heard this lecture? Have you read any of Popper's works? Have you read any of Kuhn's works? What is your opinion of Popper? Am I in error about him? What is your opinion of Kuhn? What is your opinion of the lecture? Worth $60?

I realize that I may be reading too much into the advert. After all, the best way to sell stuff to Objectivists is to claim "So-and-so made society break, and we'll show you why," and the ARB is in the business of selling lectures. But I'm really wondering if it's worth shelling out $60-$70 just to find out that Dragsdahl has mischaracterized a philosopher I respect, and heaped upon him abuse and criticism that should be directed at others in his field. Isaac http://isaac.beigetower.org
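(The sketch mentioned above - one way to formalize why that pattern is unfalsifiable. The symbols are my own shorthand, not from the lecture: let \(W\) = "the workers overthrow the bourgeoisie" and \(I\) = "there is an Imperialist force."

\[
\begin{aligned}
&\text{Theory: } W \lor I \text{, where } I \text{ is itself untestable.} \\
&\text{If } W \text{ is observed: the theory is ``confirmed'' directly.} \\
&\text{If } \lnot W \text{ is observed: } (W \lor I) \land \lnot W \;\vdash\; I \text{, so the theory is ``confirmed'' again.}
\end{aligned}
\]

Every possible observation "confirms" the theory, so it forbids nothing, and no test can refute it.)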
  21. English is the greatest language ever. No need to change it. (Of course, I'm speaking almost entirely out of my personal bias, since it's the language I am most fluent in.) Languages evolve over time, generally in response to changes in the ideas that people need to communicate. I really don't think that English is very inefficient. And, I think that it's already getting more and more efficient each day, as changes occur in the language. Common phrases get shortened into single words, long words that are used often get shortened, etc. Isaac http://isaac.beigetower.org
  22. Also, I would like to point out that some of the research in cognitive science and artificial intelligence has already produced useful technology. That doesn't prove that the theories are valid - but it does strengthen them considerably.
  23. Bowser, It sounds like this is your argument, in a nutshell. Correct me if I'm missing something.

1. Brains are biological.
2. It is impossible to do what a brain does without being biological.
∴ A (non-biological) artificial brain is impossible.

You say that you believe that there is evidence available right now to confirm this. I believe that your second proposition is specious and untrue. If you could provide some of the "available evidence," or at least a strong argument for this based on mutually held propositions, I'm sure we'd all appreciate it. It's easy to say that there's a problem. It's much harder to point out exactly what that problem is, and even harder to come up with a solution. Personally, I believe that you're right, to an extent: a lot of work in cognitive science and artificial intelligence has been short-circuited by bad philosophy. (Please excuse the pun!) However, that does not necessarily mean that all the research done in these fields is bogus or useless. I'll grant that a lot of time and energy has been wasted by people who really don't understand what a mind is, which could have been put to more productive use. But you're going to have to show what those errors are, and that they are a fundamental problem with the topic as such, if you're going to make a case for writing off the entire field. Here's a similar example, where the error is more apparent: Altruism may be a philosophical error, and it may be the case that many or most doctors are altruists. However, it's not clear that altruism will throw off the conclusions that one draws in the field of medicine. Furthermore, even if the error in question does affect the conclusions, it's not clear that ALL doctors are altruists - therefore, we've only shown that one must investigate claims made by doctors carefully, with our philosophical radar on active ping. Isaac Schlueter http://isaac.beigetower.org
  24. W00t! That website says I read in (or near) the top 1%. Yay! (993 wpm, 82%)

Prometheus, I find just the opposite. When I read novels, I really take my time with each sentence, but when it's something technical, I usually breeze through it pretty fast. Then again, I usually spend 9 hours a day reading technical logs and documentation, and since it's my job, and I have to find and deliver solutions to our customers fast, I think I've grown accustomed to skimming effectively. It's definitely something that can be learned and comes with practice. Also, missing details in that context could be disastrous. The odd thing is that I didn't really try to learn to speed read. It's kinda like learning to speak French by moving to Paris. It just sort of happens.

My skimming habits actually get in the way of reading non-technical info. When I'm skimming, it feels to me that I take in a whole section sequentially, but if I slow down and analyze what I'm really doing, I usually jump to the end of the second paragraph, read the last sentence, read the rest of the second paragraph, skim the beginning, glance over the third paragraph, etc. It's like I'm all over the place, painting back and forth. It doesn't throw me off with technical stuff. Actually, it helps me understand it more quickly. (New concepts or really complicated stuff often requires that I slow down, of course.) But when I read a story, I find that I sometimes have to actively focus to not do that, or else the scenes make no sense, and I have to reread paragraphs frequently. Part of it is probably that most technical info is kinda dry, and it's all about getting the facts and sorting them quickly, whereas I read fiction specifically for the smooth flow of the story, and the discovery of information as the author intended. Also, I'm often reading call logs, where the last line is always the most interesting and relevant part, since that's how the call ended. A lot of times, once I read that, I don't need the rest, or it's easier to internalize the rest because I see where it's headed.

As much reading as I do at work, I find that I'm significantly more tired out at 17:00 if I don't read a little fiction each day. It's odd that being tired from reading so much would be helped by more reading, but I figure it must straighten something out upstairs, like it balances the mental meal or something.

If you want to increase your reading ability, it's just like any other learned skill. Read a lot, read the sort of material that you want to be able to get quickly, and try to push yourself to comprehend quickly. The skeptical website is right about that: it's all about comprehension. I don't know about the intuitive point regarding always moving forward, since my own experience seems to back up the "jump around fast" method. But I think that a non-linear approach to the problem, if appropriate, will just become natural if you're working on fast comprehension. I doubt that a single method would work best for everyone. The best advantage of speed reading is being able to watch subbed movies without missing the action. Isaac
  25. The Philosophic Thought of Ayn Rand is an interesting read, if only because it's not by an Objectivist. Some of the articles are pretty bad, but some of them are really good. (I thought the one comparing Rand's and Aristotle's ethics was great.) Used copies start at $8.95. Not a book for beginners. I'd recommend studying OPAR thoroughly first. A big part of reading TPOAR is spotting errors. Isaac http://isaac.beigetower.org