
Evolving Metaphors That Try To Explain Human Intelligence.



On 8/17/2016 at 11:44 AM, New Buddha said:

If the Author was pushing a Behaviourist approach, then he would hardly reference the scientists that he did (Chemero, Wilson and Golonka). I, too, read the article, and it never even crossed my mind to think "behaviorism".

He was clearly pushing behaviorism (just as Eiuol claims) regardless of whose work he referenced. If you say "there is no such thing as beliefs, desires or decisions; there are only reflexes" then behaviorism is what you are asserting, whether you try to support it with the works of John Dewey or Ayn Rand or Adolf Hitler.

 

I don't care to go on demonstrating all of the ways in which the article asserted just that, over and over. If you know that we're more than our reflexes (whether that "more" is like a computer or like air currents or whatever) then we don't have anything to be arguing about.

 

 

2 hours ago, New Buddha said:

I've been reading a good deal of Engels lately.  Since Rand doesn't like him, should I not read him as well? 

Who says that?!?!!

 

That's not rhetorical. If the case is being made that it's un-Objectivist and somehow wrong to read a variety of things and form your own opinions, please direct me to it.


14 hours ago, New Buddha said:

That's not what I'm saying.  Both are wrong.  Computationalism implies Representationalism, but Representationalism (which is wrong) does not imply Computationalism. (I wish spell-check worked on these words, lol)

Drawing and writing aid cognition, yes, but there is reason to think a form of computation ALSO happens. All you need for that is a theory of concepts as abstractions. If cognition were all externalized, the writing would be your concepts, and cognition itself would be a chaos of percepts, or at best pseudo-organized. So I don't see why you insist that ANY type of computation is always externalized.

I wouldn't solve that math problem that way, and it's a misleading question because it mixes a perceptual problem with a cognitive one. I don't solve the cognitive problem with perceptual skills at all; it involves non-perceptual processes. It's two problems in one. Perception-based problem solving would not be enough, since you already need some abstraction for addition. If it were enough, you'd have to propose some sort of "release of potential" that happens on its own to produce a behavior, all without going past the perceptual level. By the way, rotation would likely be computational, and always at least representational. Of the things you listed, only subitizing is truly perceptual - it uses only what you see, and that step could certainly be purely analog.

All this is coming down to who has better examples, as this is a science discussion. All I want to point out is that you are simplifying cognition down to external processes and environmental impact, and limiting thought to perceptual processing.


4 hours ago, Eiuol said:

All this is coming down to who has better examples, as this is a science discussion. All I want to point out is that you are simplifying cognition down to external processes and environmental impact, and limiting thought to perceptual processing.

I think a more apt description of the point I'm working through is that we can never overcome the very finite limit of our attentive mind, which can juggle only a handful of things at a time, without externalizing our problem-solving processes (writing, computer programming, architectural sketches, the vector diagrams that engineers use, etc.).

Without these tools, we are limited to a math of "One, Two, Three, Many...." or various mnemonic devices.

Computationalism (and the Mind as a Computer Metaphor) assumes that, on a sub-attentive level, a great deal of computation must be taking place.

As an example, I'm introducing a foreign word into this sentence (hopefully you aren't fluent in German, lol) - the word is Unheimlichkeit.

How is it that you immediately know that you don't know the meaning of the word?

Edit: Spelling

 

Edited by New Buddha

Bach's abuse of our poor Crow Mind...

How dare he write a fugue for 5 voices!  I can only apprehend, at most, 3 lines at a time.  This means I'll have to jump from voice to voice!

Good thing it's composed in vertical harmony....

But wait,

Wouldn't this constant switching from voice to voice create havoc with the sub-attentive program that Computationalists insist must be running?

Edited by New Buddha

There is a "great deal" of computation, but that's really vague. There are many ways computation could occur, especially on top of all the embodied and externalized aspects of cognition you're talking about. No, people aren't computers, but that doesn't mean there are NO computations. It's more that computers demonstrate computations are possible; whether people perform them too is a separate issue. Being biological doesn't preclude the possibility of computation, though.

I don't "immediately" not know the meaning of that word. It probably takes half a second, maybe as little as 100 milliseconds. This doesn't prove or disprove computationalism. To deny computationalism there is just to say a perceptual, likely associative, process is happening - a plausible hypothesis. To ALSO deny representations is to say ONLY perceptual content is used anywhere at all, which is a much stronger claim. Keep in mind that by perceptual I mean perception, not "perceptual-like" as in imagining.

Your Bach example would work that way, too. A computation could help to naturally make those divisions, just as a computation could help to distinguish two distinct moving objects. The external world itself may aid as well. A representation doesn't even need to be made at this stage. My protest is that you want to deny representations at all stages. Denying computationalism per se is a tenable position; that's not my main issue.


Here's a link to a post about the speed of the mind. If you would, start reading at "There is a largely ignored problem..." (p. 66). I'm not posting this to refute any particular point you make in your above post, but rather to show how Hawkins is addressing the problem. I think you might be familiar with him?

Add Edit: A word or a face is (or, in the case of the German word, is not) in your brain (basal ganglia/muscle memory?), but your brain is not executing a top-down search through all the words that you know. In this sense, the brain is not executing computations on a set of representational data. There is no "program" running on "data". The time and energy expenditure for such a search would be tremendous - and the brain consumes about as much energy as a 14 W light bulb. The amount of heat generated would also be extreme. You'd have to eat tablespoons of sugar for each page of a book that you read, and have a fan blowing on you to cool your head.
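A toy sketch of that cost contrast, in Python (the four-word lexicon and the function name are made up for illustration; this claims nothing about how the brain actually stores words):

```python
# Contrast a serial, top-down search with an associative lookup.
lexicon = ["hello", "goodbye", "crow", "fugue"]  # stand-in for ~50,000 known words

def serial_search(word):
    """Check every stored word in turn: cost grows with vocabulary size."""
    for known in lexicon:
        if known == word:
            return True
    return False

# An associative (content-addressed) structure answers "do I know this word?"
# in roughly constant time, without scanning the vocabulary at all.
lexicon_set = set(lexicon)

print(serial_search("Unheimlichkeit"))   # False, after touching every entry
print("Unheimlichkeit" in lexicon_set)   # False, in ~constant time
```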

Edited by New Buddha

9 hours ago, Eiuol said:

There is a "great deal" of computation, but that's really vague. There are many ways computation could occur,

My understanding is that you have a background in Programming.  Are you familiar with how analog logic circuit design uses hardware to solve the same types of problems that software does? 


24 minutes ago, New Buddha said:

My understanding is that you have a background in Programming.  Are you familiar with how analog logic circuit design uses hardware to solve the same types of problems that software does? 

I'll get to more later, but my background is really varied. I only know super basic stuff about hardware design. My background is more psychology, if I had to pick something, though I know about programming as well.


1 hour ago, Eiuol said:

I'll get to more later, but my background is really varied. I only know super basic stuff about hardware design. My background is more psychology, if I had to pick something, though I know about programming as well.

A modern x86 computer (i.e., a Universal Turing Machine) cannot operate without software. However, analog logic circuits don't use software to solve problems. The brain uses hardware to solve problems, not software. The brain is Analog, not Digital. And, in this sense, it is not Representational. It is continuous, not discrete.

The sense organs are Transducers.   They convert energy from one form to another.

The ear is an Auditory Transducer.  It converts mechanical sound waves into bio-electric energy in the same way that a microphone converts a sound wave into electromagnetic energy.  I'm not being "metaphorical" here.  This is engineer speak.

However, each ear has 30,000 microphones (distributed across the frequency range), whereas an analog recording of a symphony has only a few. Each of the ear's 30,000 microphones responds only to a certain frequency, set by its location along the cochlear tube. The signal processing of the ear therefore carries much more information than the signal processing of a single microphone (or even a dozen). So when a computer programmer writes code for a recording, he is dealing with a poverty of information.

What this means is that, in word recognition, the hardware of the ear does what the software of a computer does. This is why the brain can run on 14 W of power while my computer has a 1,000 W power supply.
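A crude digital caricature of that filter-bank idea (the band edges and test tones below are invented; a real cochlea does this mechanically and in parallel, with no program at all):

```python
import numpy as np

# Split a signal's energy into frequency bands, roughly the way hair cells
# at different positions along the cochlea respond to different frequencies.
fs = 16_000                                   # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)                 # 100 ms of signal
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 2_000 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2   # power spectrum
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

bands = [(0, 500), (500, 1_000), (1_000, 4_000), (4_000, 8_000)]
for lo, hi in bands:
    energy = spectrum[(freqs >= lo) & (freqs < hi)].sum()
    print(f"{lo:>5}-{hi:<5} Hz: {energy:10.1f}")
```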

Edited by New Buddha

20 hours ago, New Buddha said:

A modern x86 computer (i.e., a Universal Turing Machine) cannot operate without software. However, analog logic circuits don't use software to solve problems. The brain uses hardware to solve problems, not software. The brain is Analog, not Digital. And, in this sense, it is not Representational. It is continuous, not discrete.

You're taking a lot of liberties with some very technical terms. For instance, x86 isn't the name of a computer; it's the name of an Instruction Set Architecture (or ISA). As the name suggests, it's not hardware: it's the name of a software architecture. It is the programming language used to give machine-level instructions to the hardware.

As for analog vs. digital, it has nothing to do with the ability to operate without software. Digital computers, at their core, perform arithmetic operations - without software. At the core of every CPU there's a circuit capable of doing arithmetic operations on binary numbers. All software does is leverage those calculations to solve more complex problems, and allow human interaction with the circuit. In fact, there are specialized computers that perform far more complex operations than binary arithmetic at the hardware level. There are even computers that are analog (they're just not very efficient).
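A minimal sketch of that point, with Python operators standing in for logic gates (an illustration of a ripple-carry adder, not a claim about any particular CPU's layout):

```python
# Arithmetic done "in hardware": a ripple-carry adder built only from
# simulated logic gates. No program interprets the bits; the wiring
# itself is the computation.

def full_adder(a, b, carry_in):
    """One full-adder cell: two XORs, two ANDs, one OR."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_4bit(x, y):
    """Add two 4-bit numbers by rippling the carry through four cells."""
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result, carry

print(add_4bit(0b0101, 0b0011))  # (8, 0): 5 + 3 = 8 with no carry out
```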

The distinction between analog and digital isn't the need for software; it's that analog is direct, continuous representation, while digital representation involves abstraction. Both computers and the human mind employ abstraction to solve most problems. So, while the brain itself, at the neuron level, may be analog (just as computers can be analog; nothing says computer hardware must be digital), complex problem solving requires abstraction, no matter what the underlying hardware architecture looks like.

I would say that you might be right about the animal brain being analog to some very limited extent: there are certain specialized problems that the animal brain may be hard-wired to solve without digital abstraction (because, due to the frequency with which we have to solve them, it makes sense to do it that way). But, as the complexity rises (especially for humans), abstraction becomes necessary and the preferred method. And abstraction is, by definition, digital. That's the whole idea: it's not continuous representation. For instance, let's say you're a farmer and you want to buy a cow. You really don't care about most of the attributes the cow has. Makes no difference what color it is, makes no difference what its exact shape is. You really just care how old it is and how many gallons of milk it produces in a year. The analog representation of the cow doesn't really help you. You can look at it, take a picture of it, touch it, etc., all you want. It won't help you much. What you need is a digital representation, made up of two small integers.
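A toy rendering of that cow example (the field names are invented for illustration):

```python
from dataclasses import dataclass

# A digital representation keeps only the attributes the buyer cares
# about, abstracting everything else (color, shape, the analog cow
# itself) away.

@dataclass
class CowForSale:
    age_years: int
    gallons_per_year: int

bessie = CowForSale(age_years=4, gallons_per_year=2_300)
print(bessie)  # two small integers stand in for the whole animal
```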

22 hours ago, New Buddha said:

What this means is that, in word recognition, the hardware of the ear does what the software of a computer does. This is why the brain can run on 14 W of power while my computer has a 1,000 W power supply.

It's not. The reason why your computer has a 1,000W power supply is because you can play the latest Call of Duty on it, while doing 1080i video rendering in the background.

If all you needed was a computer that can process audio, and you gave the right people a few hundred million dollars (which is just a fraction of the money Intel, AMD, Nvidia, etc. spent on developing your current processor and graphics card), I assure you that they could build you one that runs on a lot less than 14 W, and does a much better job than the human ear.


On 8/21/2016 at 11:19 PM, New Buddha said:

So when a computer programmer writes code for a recording, he is dealing with a poverty of information.   

What this means is that, in word recognition, the hardware of the ear does what the software of a computer does. This is why the brain can run on 14 W of power while my computer has a 1,000 W power supply.

No, it isn't.

 

I'm sure you know what a data "cache" is? Well, whenever your brain does anything new, interesting or novel, it basically caches every step required to do it again in the form of physical neural pathways. That's why your brain can run on such a tiny amount of power - as compared to a computer.
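A loose software analogy for that caching idea - memoization in Python; not a claim about how neurons actually store anything:

```python
from functools import lru_cache

# Memoization pays the full cost of a computation once, then replays
# the stored result almost for free on every later call.

@lru_cache(maxsize=None)
def hard_problem(n):
    """Pretend this is expensive the first time it runs."""
    return sum(i * i for i in range(n))

hard_problem(1_000_000)           # slow: every step is actually performed
hard_problem(1_000_000)           # fast: the stored result is reused
print(hard_problem.cache_info())  # hits=1, misses=1
```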

But your brain isn't actually energy-efficient at all, compared to any other organ in your body. That's why no other animal has evolved our kind of consciousness before: it's prohibitively expensive.


Although the human brain represents only 2% of the body weight, it receives 15% of the cardiac output, 20% of total body oxygen consumption, and 25% of total body glucose utilization.

-Wikipedia/The Human Brain/Metabolism

 

The human brain is the Ferrari Testarossa of the animal kingdom.

Edited by Harrison Danneskjold
Hyperlink

14 hours ago, Nicky said:

You're taking a lot of liberties with some very technical terms.

My engineering strengths lie more towards structural engineering, so a correction/explanation of the terms is appreciated.  The use of the term x86 was meant to distinguish a Universal Turing Machine from a pocket calculator that uses circuit design to solve problems.  The brain is more like the latter.

14 hours ago, Nicky said:

I would say that you might be right about the animal brain being analog to some very limited extent: there are certain specialized problems that the animal brain may be hard-wired to solve without digital abstraction (because, due to the frequency with which we have to solve them, it makes sense to do it that way).

Activities such as learning to ride a bike become "hardwired" through repetition, and to a great extent they are automated so as to free up our attention. It does require willful volition to choose to ride a bike (and friendly parental encouragement helps, too), but actually learning the skill is not accomplished through "top-down" computational programming. Maintaining balance is a complex feed-forward/feedback process. Humans are born with very few "hard-wired" behaviors.

14 hours ago, Nicky said:

But, as the complexity rises (especially for humans), abstraction becomes necessary and the preferred method. And abstraction is, by definition, digital. That's the whole idea: it's not continuous representation.

The position that I put forth in the above posts is that the ability to abstract beyond the limit of our "One, Two, Three, Many..." crow mind requires the externalization of "thought" via algorithms: writing, math, graphs, sketches, vector diagrams, computer code, etc. The sketches in Newton's Principia are not primarily for communication; they are how he worked the problems out. Rand says as much about words (paraphrasing): words (concepts given perceptual form) are not primarily for communication; rather, they are what we think with. I was horrible at math in high school, and it wasn't until I took Physics, Analytic Geometry, and Structural Engineering courses in college that I could actually "see" what math was for.

There is an interesting web site that goes into detail about the problems of teaching math to the blind, since math is very visually oriented (as would be expected of an animal who dedicates the largest portion of his mind to visual processing). Could an alien life form have "smell"-based math? Dolphins use sound to measure distances, find things, navigate, etc.

14 hours ago, Nicky said:

It's not. The reason why your computer has a 1,000W power supply is because you can play the latest Call of Duty on it, while doing 1080i video rendering in the background.

No.  Guild Wars 2....

The point that I was trying to make is that the ear "mechanically" transduces sound waves into bio-electric signals in a way that addresses the "hundred-step rule" mentioned by Hawkins (in the link above).

As a very rough example, when you hear the words "hello" and "goodbye", each word may trigger, say, 4,356 and 7,598 nerve signals across the spectrum of frequencies. And all of this is done, essentially, with the mechanical energy provided by the sound waves themselves.
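Back-of-envelope, the "hundred-step rule" falls out of two rough numbers (both are order-of-magnitude guesses, not measurements):

```python
# If one neuron needs ~5 ms to fire and settle, and recognition happens
# in ~0.5 s, then any strictly serial chain of processing can be at
# most about 100 steps long.
recognition_time_s = 0.5   # rough time to recognize a word or face
step_time_s = 0.005        # rough time for a single neural "step"

print(recognition_time_s / step_time_s)  # 100.0 -> massive parallelism, not long programs
```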

Edited by New Buddha

Oftentimes, Analog vs. Digital is reduced to Continuous vs. Discrete.

From: Reply to Freeman Dyson by Steve Grand

When it comes to life, we see the analogue/digital distinction disappear completely, to be replaced by the spectrum of forms of encoding it really is. Take a nerve signal: at a theoretical level we can treat an action potential as a differentiated square wave, i.e. a spike of infinitesimal width and infinite height - the ultimate in digital. But in practice the cell membrane takes a finite time to transit between polarised and depolarised states and so forms a smooth (if sharp) curve - this is an analogue change (discrete at the quantum level). But then again, there are only two significant states, polarised and depolarised, so we're back to a digital system. And yet what really matters is the choice of encoding scheme, which for the majority of neurons seems to be frequency modulation, and so the true signal is analogue and can vary continuously. Mind you, two spike peaks can only vary in distance by a whole number of molecules, and so this analogue signal is really discrete... Argh!
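To make the frequency-modulation point in that quote concrete, here's a toy rate-coding sketch in Python (the rates and duration are arbitrary illustrative values):

```python
import numpy as np

# Rate coding: an analog intensity is carried by spike *frequency*.
# Each spike is all-or-nothing (digital-looking), but the quantity
# they encode varies continuously.

def spike_times(intensity, duration_s=1.0, max_rate_hz=200):
    """Map a 0..1 intensity to a regular spike train at a proportional rate."""
    rate = intensity * max_rate_hz
    if rate == 0:
        return np.array([])
    return np.arange(0, duration_s, 1 / rate)

for intensity in (0.1, 0.5, 1.0):
    print(intensity, "->", len(spike_times(intensity)), "spikes/s")
```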

The following is from a couple of paragraphs down, and shows how I use the term "analog", especially when it comes to memory:

Perhaps a more important distinction is not between analogue and digital in the sense of "real number" versus "integer", but between "analogue" in its original sense as an explicit representation of another physical process....

In this sense, the hills and valleys in the grooves of a vinyl LP are not "Representations" of the mechanical sound waves produced by the symphony; they are analogs. And to carry the analogy even further, they are analogs of the thoughts of the musicians who created the sound waves.

Understanding the above paragraph strengthens the basis for the "validity of the senses" in Objectivist epistemology. Transduction is beyond the "control" of the senses; nor is it arbitrary.

Edited by New Buddha

Your discussion, Buddha, of the sense organs as transducers of sorts is fine. But "using hardware" isn't all the brain does, because there's more going on than some transducers responding to stimuli. Namely, there are mental states, concepts, consciousness, emotions, etc. Answering why or how this happens is quite difficult, but these are still representations, or make use of representations, in addition to percepts. Computationalism is one theory of how we are able to do more with percepts than simply respond. I've no idea why you keep talking about analog and whatnot; the problem is that you ALSO deny representations. To do so is reductionism.

Generally, it looks like you're trying to get at the architecture of the human mind. There are certainly non-computational parts to it, but that doesn't mean externalizing ALL thought solves the mysteries of abstraction, conceptual organization, and creativity. Externalizing enhances these aspects of the mind; it doesn't REPLACE them. For example, abstracting beyond "one, two, three, many" DOESN'T require externalization. To start off with, babies begin at "one, many", then later "one, two, many", and so on up to four. They mainly just observe; counting practice won't help. After that, they'll master these as concepts. The point is that numbers can be learned from almost entirely mental processes. Externalization is only a way to speed it all up, perhaps with teachers. Graphs, writing, etc., come after some conceptual mastery, to make new abstractions by offloading mental work you already did. And in time you'd be able to represent it mentally.

"Words are what we think with" is a type of representationalism. It may be possible to call Rand's view an "embodied representationalist" position.

 


Eiuol,

We might be using the term Representation differently. Here is how I'm using it:

The question of direct or naïve realism, as opposed to indirect or representational realism, arises in the philosophy of perception and of mind out of the debate over the nature of conscious experience;[1][2] the epistemological question of whether the world we see around us is the real world itself or merely an internal perceptual copy of that world generated by neural processes in our brain.

[....]

A problem with representationalism is that if simple data flow and information processing is assumed then something in the brain must be interpreting incoming data as a 'percept'. This something is often described as a homunculus, although the term homunculus is also used to imply an entity that creates a continual regress, and this need not be implied. This suggests that some phenomenon other than simple data flow and information processing is involved in perception.

homunculus argument

Another example is with cognitivist theories that argue that the human brain uses 'rules' to carry out operations (these rules often conceptualised as being like the algorithms of a computer program). For example, in his work of the '50s, '60s and '70s Noam Chomsky argued that (in the words of one of his books) human beings use Rules and Representations (or to be more specific, rules acting on representations) in order to cognate (more recently Chomsky has abandoned this view: c.f. the Minimalist Program).

 

Edited by New Buddha
