Objectivism Online Forum


June's Achievements

  1. Okay, Stephen, I'm leaving now. But I just want to say, it's not about the attention, because if it were, I could also have been a good Objectivist contributor to this forum, thus increasing my acceptability and the attention I attract. It's more like: I don't get a kick in the pants too often, and it is healthy every once in a while.
  2. I wouldn’t say that I’ve given up on myself or other people. As one becomes smarter, it feels increasingly better. I’m sure most rational people experience this phenomenon and therefore desire continual self-enhancement. However, there’s a limit to the level of intelligence human beings can attain, and there’s also much variance in those limits among people, more so when considering all sentient creatures as a whole. It just so happens that the emergence of transhuman optimization systems is a plausible event, one likely to occur before effective-yet-arbitrary value designers are able to proselytize scary arms bearers. If so, its study and deliberate pursuit facilitate recognition of our physical and cognitive shortcomings, which are discovered, not invented. In trying to foresee beyond this event, our model of reality simply breaks down; it’s impossible to apprehend what it’s like to be smarter, better, and kinder than human. In the meantime, I believe it’s in our best interests to retain adequate levels of economic freedom to catalyze a break from our primitive chemical make-up. But it simply isn’t necessary to rigidly delimit the corresponding range within which integrity applies.

     "When you call yourself an Indian or a Muslim or a Christian or a European, or anything else, you are being violent. Do you see why it is violent? Because you are separating yourself from the rest of mankind. When you separate yourself by belief, by nationality, by tradition, it breeds violence. So a man who is seeking to understand violence does not belong to any country, to any religion, to any political party or partial system; he is concerned with the total understanding of mankind." ~ J. Krishnamurti, "Freedom from the Known"
  3. When I was an Objectivist, I had similar frustrations, i.e., everyone seemed to be an enemy unless I turned to Objectivist fora for consolation and reinforcement. Fortunately, I somehow reasoned my way toward learning evolutionary psychology, among other invaluable lessons, and came to realize that many of my fellow humans aren’t built to deny their innate collectivist tendencies, even in light of contrary intellectual conclusions and devastating antinomies. There is strong selection pressure against vociferously exhibiting self-interested behavior. Not holding everyone fully responsible for their conduct leads to several refreshing conclusions. One that immediately comes to mind is simply conceding that humans aren’t optimal, and certainly not heroic. What may follow is a quieter and less anxiety-prone climb toward developing optimization processes (e.g., transhuman intelligence). Instead of enduring a lifetime of bewilderment as to why so many people seem to “choose” being stupid, let’s create excess wealth and use it to minimize the opportunity costs of getting at the root of the problem.
  4. Specific numbers don’t say very much. Ranges are a little more telling, since tests vary in their modes of analysis. From personal experience only, I’ve observed that if one averages between 130 and 160 across various IQ tests, then he is likely to have the ability, or the potential, to develop his physical and mental world in ways he can be comfortable with in the long run. Most good businesspeople and very creative introverts would fall in this range. Consistent scores above this range likely indicate the potential for modifying or revising major paradigms; groundbreaking scientists, theorists, and philosophers would fall above 160. Those with mental handicaps, bad attitudes toward life, or those who are culturally incompatible or complacent are likely, with little exception, to fall below 130.
  5. Yes, ethics is more fundamental than science, but it can’t be so detached that anthropocentric conceits are held onto for dear life in the face of super-high rationality and Bayesian reasoning. What follows is a philosophy and actual practice demonstrating a supreme (on the subscale of human minds) syncretist, memetic structure. Is it agreeable with the status quo? Can we afford to abandon absolute, self-interested behavior by shifting efforts, resources, and smug mind-sets toward accelerating the emergence of recursive self-improvers, which fortunately happen to be driven by both self-interest and compassion toward sentient-kind? For some, probably not: some simply can’t afford to abandon the perceived pragmatic power of a core, objectivist value system. It’s refreshing and comforting to view the human mind as having boundless potential. But it doesn’t. Nor do “man qua man” heuristic propellers have any bearing on humankind’s diligent push toward smarter-than-human, kinder-than-human intelligence and its bootstrapping via nanotechnology and seed AI. However, the tense sentiments in other responses are respectfully understood, just unexpected in ways that could instead have shown curiosity and engagement, especially since the transhumanist epistemology represents a forward step, and an honesty with ourselves, rather than a regression.
  6. No, transhumanism is not a joke. There are many rational and scientific people who take transhumanism seriously. You're right, it is a false assumption if one definition was inferred instead of the other. In this case, however, libertarianism means the belief in free will. Undoubtedly there is a lot of controversy surrounding free will's actual meaning. But one can extrapolate how objectivists think about free will, and from that the assumption is appropriate.
  7. How do objectivists perceive transhumanism? The two are mostly compatible, especially in the degree to which realism is assumed, except in their ethics and in the magnitude of respect the human mind deserves. Where objectivists generally believe that humankind’s basic virtue is irreconcilable with altruism, transhumanists tend to believe this is a false dichotomy. According to transhumanists, anything possessing sentient properties deserves to be optimized with adaptive qualities primed for nature’s default, without the prerequisite that any form of suffering is necessary. Since emphasis is placed on the development of recursively self-enhancing transhuman intelligence, presuming the subsequent emergence of superintelligence and posthumanity, there is no reason to think that ultra-advanced technology, or smarter-than-human, kinder-than-human intelligence, can’t richly accommodate all behaviors, even the behaviors of those who opt not to become augmented. Hence, transhumanism subsumes altruism while transiently holding human rationality in higher regard, just not, along the way, at the expense of sentient-kind and qualified skepticism. Also, while objectivists seem totally confident that human minds will remain optimal containers of volition indefinitely, transhumanists perceive human minds as primordial points on a minds-in-general continuum, and human bodies as vulnerable, mortal entities requiring extensive augmentation, assuming that life is preferred over death and, by extension, extreme longevity, if not physical immortality, over mortality. Some transhumanists, as they learn more, are even beginning to abandon libertarianism, since that doctrine assumes a high degree of free will, which simply doesn’t exist at the level it is fatuously presumed to; i.e., far-reaching originations are consciousness-independent, and volition a negligible, subservient manifestation. In regard to the original question: would objectivists be at odds with transhumanists? Is the fabric of objectivism being challenged by a worldview that appears to be much more sophisticated?

     Related reading: The Transhumanist FAQ; Staring into the Singularity