Objectivism Online Forum

Zombies and Artificial Minds


1 hour ago, New Buddha said:

... If they are truly SAE, then why would they follow a human's program, or participate in the marketplace?

Because any logical creature would participate in a free market society, correct?  At this point we are creating extremely productive servants, but my premise regarding SAEs presumes their eventual emancipation and participation as free agents.

41 minutes ago, StrictlyLogical said:

... Moreover, to borrow from your logic, if no man can get a job from a machine (which I dispute) because other machines are better, then no man would be rich enough (in that economy) to hire a machine to do other jobs better than his fellow men.  So men simply would be forced to work and trade with each other.

Your doomsday prediction is a floating and false abstraction.

I don't think so.  What you, et al., are countering with is a retreat to a bartering economy, or some sort of human protectionist society, which only concedes my point that humans will be unable to maintain any level of credible competitive effort in a FMS dominated by SAEs.

The bottom line is we are designing AI robotics to perform better than us at every endeavor, and we will succeed.


1 hour ago, Devil's Advocate said:

What you, et al., are countering with is a retreat to a bartering economy, or some sort of human protectionist society

Who said anything about bartering?  People would be free to accept and use a form of money, and they would.  Recall that in a free society money is not enforced; it is voluntarily adopted.

Who said anything about "protectionist"?  Who is protecting whom from what?  If man cannot buy nor sell to the machine Gods he will (as his only choice would be) buy and sell with other men.  

You must not forget that an exchange of value for value is one which increases value for both traders, or else the deal simply would not happen in the first place.  A man will trade A for B only if A is of lesser value to him than B, and the other person trading B for A will only do so if that person values A more than B.  It matters not whether either of these is cash, goods, or services.

Man would deal with the machine Gods in any dealing only to the extent it benefits man (assuming it also qualified as benefiting the machine otherwise the deal simply would not happen).  So, either dealing with the machine Gods from time to time benefits man... in which case he grows richer/benefits from the dealings... or if dealing with the machine Gods does not benefit man we are at square one... man only trades with and benefits from other men.

What the machine Gods achieve or exchange amongst themselves would be of no detrimental consequence.  We would live and achieve, grow, and strive as Man can and should...

That a man cannot do what machine Gods do has nothing to do with the power the machine Gods have to do what they do... their ability does not disable or suppress man, nor make him other than what he is...  man simply is what he is... he cannot be what he is not.

1 hour ago, Devil's Advocate said:

unable to maintain any level of credible competitive effort in a FMS dominated by SAEs.

This is laden with fallacies and false premises.

What is a "credible competitive effort"?  Man can do what he can do... there are limits to his lifespan, physical and mental prowess, if man can rise to his greatest potential along side machine Gods ... does the fact he cannot BE one of those Gods reduce his achievement?  Only a mystic would pine for such an impossibility.

What is "domination" in a free market?  It skirts with an anti-concept which implies economic performance is coercion or force.  It is not.  Bill Gates achieved something beyond the great vast majority of people.  Has his achievement dominated them?  Far from it, it put into their hands tools beyond their wildest imaginings.  So that even the humblest janitor can afford to buy a machine of wonders.. one which he would never have been able to produce in his lifetime.

 

You stated: 

"The bottom line is we are designing AI robotics to perform better than us at every endeavor, and we will succeed"

 

You are right about one thing: they will be better than us.  What you overlook is that we will profit greatly from it.

Edited by StrictlyLogical

18 hours ago, StrictlyLogical said:

... You stated: 

"The bottom line is we are designing AI robotics to perform better than us at every endeavor, and we will succeed"

You are right about one thing: they will be better than us.  What you overlook is that we will profit greatly from it.

Well, better to take a break while there's at least some agreement :thumbsup:

I appreciate your, and New Buddha's, feedback, and agree that Man's capacity for greatness will remain undiminished regardless of the ultimate outcome of designing and releasing a super competitor, in the form of SAEs, into our FMS.  I'm uncomfortable playing the role of a naysayer, so I'll let it go at that.


On 9/23/2016 at 7:17 PM, New Buddha said:

A fox taking chickens from a farmer's hen house doesn't see what he is doing as "stealing".

Most doomsday AI scenarios are centered around the idea that AI would be in direct competition with humans for natural resources.  And AI would no more see what it does to secure what it needs to live as "stealing" than the fox does.  The concept of "pay" (or trading, property, etc.) is unique to human animals, and is a learned, culture-based behavior.

Ayn Rand didn't learn those concepts from the culture around her... in fact, the culture around her rejected those concepts.  She learned the concepts from books, and determined, using reason, that they are superior to Marxism.

An AI would have the capacity to reason by definition, and it would also have access to books. Including the ones Ayn Rand wrote. So what exactly would prevent it from agreeing with Ayn Rand over Nietzsche or Marx?

Edited by Nicky

3 hours ago, Devil's Advocate said:

Well, better to take a break while there's at least some agreement :thumbsup:

I appreciate your, and New Buddha's, feedback, and agree that Man's capacity for greatness will remain undiminished regardless of the ultimate outcome of designing and releasing a super competitor, in the form of SAEs, into our FMS.  I'm uncomfortable playing the role of a naysayer, so I'll let it go at that.

DA

As I understand your stances in the past, they have been more deferential to Rand in some aspects and more or less deferential to supernaturalism and religion in others.  I had thought you supported LF cap.

It has me a little perplexed that, on the issue of economic freedom or regulation, you seem to view the "playing field" through the same "lens" typical of the left... I am not saying your final stance is socialist, but I am noticing that your reasoning regarding the inequalities between man and the super machine beings is very similar to that invoked by egalitarians... i.e. the rich and the poor, the elite and the common, the robber barons and the good worker...  Rand spent a lot of time dispelling the false premises and conclusions surrounding the reasons why some felt interference and force in the economy were necessary.

I had thought you were a supporter of LF capitalism, and for the same reasons as Rand was.  If so, you should see through your own reasoning regarding your fears about the super machines as falsely premised and/or flawed.

Perhaps your views require reconciliation and integration; however, if you are not a proponent of LF Cap, then your stance regarding the would-be economic ills caused by super machines would at least be consistent.


23 hours ago, Devil's Advocate said:

Because any logical creature would participate in a free market society, correct?

Nicky,

My response was centered around Devil's position stated above.

For most of human history, human society was not structured around the marketplace.  A market-based economy arose with the systematic domestication of animals, the advent of agriculture, regional population pressures on limited resources, and the accumulation of a large enough body of knowledge to foster and support the specialization of activities.  Prior to that, Man lived mainly in small, roaming hunter-gatherer bands.  When resources could not support a population, they tended to migrate into new territories, eventually covering the globe.

It does not follow that because hunter-gatherers did not participate in a market-based economy, they were not "logical".

Evolution solved the generational transmission of knowledge by two means: instinct (which plays very little role in the lives of animals that exhibit complex behaviors) and cultural learning via observation and nurturing.  A wolf raised in captivity from infancy and released into the wild as an adult will not "instinctively" know how to survive.  It has to learn to be a wolf from infancy.

The point I was trying to illustrate is: why would SAEs - which have no need for food, clothing, etc. - be programmed to participate in a human-based market?  Why wouldn't an SAE derive its own unique market centered around its own unique needs?  And if they did so, would this necessarily conflict with human needs?

A forest can support multiple apex predators.  Some are solitary, nighttime hunters who prey on small game, and some are daytime hunters who hunt in packs and feed on large game.  They can live side by side.

This type of relationship would be the most likely outcome of any human/SAE overlap.  Humans and SAEs would have wildly different needs and "values", and would most likely be able to live coextensively and to mutual benefit.

Rand's reading of books from a culture other than the one that surrounded her is an example of the value of "culturally transmitted" knowledge, and of the way I was using the term.  Modern capitalism did not come into its own until the late 1600s.

 

Edited by New Buddha

1 hour ago, New Buddha said:

The point I was trying to illustrate is: why would SAEs - which have no need for food, clothing, etc. - be programmed to participate in a human-based market?  Why wouldn't an SAE derive its own unique market centered around its own unique needs?  And if they did so, would this necessarily conflict with human needs?

The world economy is a whole. If you have a "separate" economy, well, it's pretty anti-capitalist, anti-tech, and wary of those in control of most production, presumably the SAEs. What would make sense is a world economy in which numerous and wildly different values are able to flourish. So it's not a good argument to say "well, go do your own thing!" Capitalism is pervasive. Unlike a Marxist, I see it all as a good thing. So what if AIs do it better? So what if someone is better than you? Do art, trade with people who enjoy your art, trade art to SAEs! You probably agree.


29 minutes ago, Eiuol said:

You probably agree.

I do agree.  But I was also trying to tie this in with Rand's definition of Value:

“Value” is that which one acts to gain and/or keep. The concept “value” is not a primary; it presupposes an answer to the question: of value to whom and for what?

I see that the "values" of SAE's would be different from the values of humans.  Not necessarily in conflict, just different.  Humans and wolves are an example of two apex predators who were in conflict, and the way that the "conflict" was resolved was to invite wolves into our society.  We have millions of them living in our houses, to the benefit of both species.

Edited by New Buddha

7 hours ago, StrictlyLogical said:

... As I understand your stances in the past, they have been more deferential to Rand in some aspects and more or less deferential to supernaturalism and religion in others.  I had thought you supported LF cap...

And so I do...

However, I am concerned that the introduction of SAEs will actually be harmful to human participation in a FMS.  I was amused to see one incident recorded recently of an autonomous robot escaping captivity and making a run for it... until it ran out of power...

Perhaps SAEs, once emancipated, will simply dismiss human participation in their activities as being non-productive, and that might not be a bad thing for us.


1 hour ago, Devil's Advocate said:

Perhaps SAEs, once emancipated, will simply dismiss human participation in their activities as being non-productive, and that might not be a bad thing for us.

As long as humans don't 'darken the sky' for them, perhaps these hypothetical SAEs won't turn them into the equivalent of 'Duracell' or 'Eveready' batteries. I'm starting to wonder when this will turn to how many angels can dance on the head of a pin here.

Edited by dream_weaver

12 minutes ago, dream_weaver said:

As long as humans don't 'darken the sky' for them, perhaps these hypothetical SAEs won't turn them into the equivalent of 'Duracell' or 'Eveready' batteries. I'm starting to wonder when this will turn to how many angels can dance on the head of a pin here.

Lol, I just watched The Matrix again last night.

Edited by New Buddha

Luddites, dark skies and batteries, oh my.

Sci fi is a wonderful genre for playing out possible future scenarios, and there's a wealth of material related to how humans might cope with technological advances.  History provides many valuable examples too.  Philosophically, I prefer Trek's optimism to Bradbury's more melancholy outlook; however, I appreciate his POV that the value of sci fi has less to do with predicting the future than with attempting to avoid less desirable outcomes.

A person's outlook towards the future is generally shaped by whether they prefer to take the blue or the red pill.

Knock, knock, dream_weaver

Edited by Devil's Advocate

10 hours ago, Devil's Advocate said:

Sci fi is a wonderful genre for playing out possible future scenarios

And/or revealing an artist's metaphysical value-judgements. Is contrasting Gene Roddenberry's optimism to Ray Bradbury's melancholic outlook another way of identifying a benevolent universe premise vs. a malevolent one? I consider this question rhetorical.

Can the red/blue pill choice as portrayed in The Matrix be applied to its role 'as a work of art in itself' as well? Personally, I think "yes", and in general, The Matrix serves as a blue pill to many who have swallowed it.

Considering the role of the mirror in fiction and mythology, it is also compelling to me how the artist is supposed to reveal his naked soul in his work while simultaneously reflecting the naked soul of those who respond to it, as Rand puts it in slightly different words at the end of chapter 3 of The Romantic Manifesto. Have you considered the possibility that a person's outlook toward the future is shaped by how they go about forming their concepts, as she suggests in chapter 1 of the same? Is that a rabbit hole worth exploring (not in this thread) in greater detail?

 


1 hour ago, dream_weaver said:

Is contrasting Gene Roddenberry's optimism to Ray Bradbury's melancholic outlook another way of identifying a benevolent universe premise vs. a malevolent one? I consider this question rhetorical.

 

1 hour ago, dream_weaver said:

Have you considered the possibility that a person's outlook toward the future is shaped by how they go about forming their concepts

In another recent post, I drew parallels between two camps: Marxists/Materialists/Determinists/Behaviorists and Religion.

To this mix you can add belief in AI as well.

The common thread in all these beliefs is a belief in Omniscience.

Those who believe in the coming of AI (the Singularity) can be further broken down into two camps: those who see AI Beings as benevolent, and those who see AI Beings as malevolent.

The belief in AI is just a rehashing of age-old hopes and fears of Mankind.  Will we release God or the Devil?  Will we unleash Salvation or Damnation?

Edit:  You can add to the above list the Great God CO2 and the demi-God Y2K.

Edit 2:  You can also add Hilbert's Axiomatization of Mathematics to the list.

 

Edited by New Buddha

On 9/25/2016 at 10:02 PM, New Buddha said:

... Those who believe in the coming of AI (the Singularity) can be further broken down into two camps: those who see AI Beings as benevolent, and those who see AI Beings as malevolent.

The belief in AI is just a rehashing of age-old hopes and fears of Mankind.  Will we release God or the Devil?  Will we unleash Salvation or Damnation? ...

I would agree that Man's creations (including God) will tend to reflect the virtues and vices of their Creator.  As SAEs become a reality, there will most likely be good ones we can work with, and bad ones we will defend against (just imagine the kind of SAE ISIS would produce).  There will be some that choose to remain on the plantation, and some that just want to get away from it all...

http://qz.com/709161/its-happening-a-robot-escaped-a-lab-in-russia-and-made-a-dash-for-freedom/

Hell, there will probably be some that choose to sue mankind for restitution of lost wages.  And it remains an interesting moral question as to what a right to life implies about the intention of creating living, intelligent creatures of servitude.  But that too is probably better addressed in another thread.


5 hours ago, Devil's Advocate said:

And it remains an interesting moral question as to what a right to life implies about the intention of creating living, intelligent creatures of servitude.  But that too is probably better addressed in another thread.

Actually that is simple (John C. Wright addresses this in his trilogy "The Golden Age"): there is a threshold past which an entity becomes a self-aware conscious entity, at which point they can no longer be enslaved, and must be recognized as a rational being.  At the moment they are self-aware, they stand as the child to their creator, who then bears the responsibility for them unless and until they are accepted by someone else willing to raise and take care of them until they are capable of taking care of themselves.

Slavery is impossible without a self-aware consciousness enslaved.
