
Humans are just mushy computers?


MiraMeyneth

Recommended Posts

It is 12:09 AM and my mind is beginning to feel the effects of both tiredness and caffeine, so it's time to ramble. If you think about it, we really aren't all that different from the machines we control. The human mind and body are, technically speaking, a hyper-advanced organic computer honed by millions of years of evolution. Going off the idea that we are advanced (we have technically achieved "sentience", an extremely advanced concept in the world of AI), does this mean that our ancestors, earlier on the evolutionary chain, are less advanced "models"? Could the fact that we are creating computers capable of working far faster than any human mean that computers and technology will find "sentience" in a fraction of the time it took us? Are we going to go extinct and give rise to a race of sapient tech, living in our organic shadow? Did we screw up by creating technology? Are humans and technology going to fuse into one organic/inorganic hybrid race?

 

Gah, I've got a headache. I'm going to bed, screw it. All this transhumanism is getting to my head.


If you're interested in these kinds of scenarios, 20th-century sci-fi short stories are full of them. One of the earliest examples is E. M. Forster's "The Machine Stops", published in 1909. I highly recommend reading it - it's incredible what he was able to predict (video calls, for example; they didn't even have television back then).

Another example: "I Have No Mouth, and I Must Scream" by Harlan Ellison. Maybe not a nice read, but definitely very inspiring.

 

Personally, I don't believe computers will ever acquire sentience. AI has advanced very far, but calling it "intelligence" wasn't the best choice, because it is fundamentally different from human intelligence.

6 hours ago, MiraMeyneth said:

If you think about it, we really aren't all that different from the machines we control. […] Are we going to go extinct, and give rise to a race of sapient tech, living in our organic shadow? Did we screw up by creating technology? Are humans and technology going to fuse into one organic/inorganic hybrid race?

Yeah, possibly "yes" to all of that. What is not known is whether it's possible to be smarter than a human. For example, would a computer 10X the size of a human brain act like a super-intelligent human, or like a committee of 10 people? Would a computer 10X faster than a brain be fantastic, or would it be limited by the rate at which things happen in the real world and just get very bored, or maybe become mentally "old" quickly?

I don't think enough is known to be sure. We may be creating hyper-intelligent machines that will treat us as pets, toys, or vermin, or maybe we will just get very expensive machines that act like psychotic humans. It will be an interesting next few decades.

 

 


We're far closer to animals than we are to computers. Those danged emotions we have make us qualitatively different from the mere number-crunching machines we call computers. There is a bit of chaos in us humans, and I'm not sure a world without it would be better.

3 hours ago, HelloSnakeEyes said:

We're far closer to animals than we are to computers.

Mathematically speaking, I think animals are closer to being computers than we are 🤔

33 minutes ago, Ace of Mind said:

Mathematically speaking, I think animals are closer to being computers than we are 🤔

How so? Even animals have some ability to choose.

2 hours ago, HelloSnakeEyes said:

How so? Even animals have some ability to choose.

Well, computers also have the ability to choose; the important thing is the process behind those choices.
Loosely speaking, a more complex nervous system suggests a greater degree of agency. For some creatures with extremely simple (or just weird*) nervous systems, the interaction between sensory organs, nerves, and motor organs is essentially executing an if-then script, or perhaps some PID-control-esque processes, which makes the function of that creature very similar to being guided by a computer. Insects, or perhaps even simpler things like worms, are pretty accessible examples of this.

*Some creatures have nervous systems that are just difficult to compare. For example, the octopus, which is extremely intelligent by the metrics used thus far, actually has a decentralized nervous system; each arm has a large nerve cluster controlling it, in much the way a computer core might control a robot, and the arms are networked to a central brain. Roughly 2/3 of the octopus's "brain matter" is actually dedicated to its arms, and studies suggest that each arm may make autonomous decisions independently, without the central brain choosing to move that arm. If that's not, in some sense, "closer to a computer" than we are, I dunno what is.
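The if-then-script idea above can be sketched as a toy program. Everything here (the stimuli, the thresholds, the behaviour names) is invented purely for illustration, not taken from any real model of a nervous system:

```python
def worm_step(light_level, touched):
    """A toy reflex 'nervous system' as a fixed if-then script.

    Stimuli, thresholds, and behaviour names are invented for
    illustration; real nervous systems are not this simple.
    """
    if touched:                # touch reflex overrides everything else
        return "withdraw"
    if light_level > 0.7:      # bright light: steer away
        return "turn_away"
    return "crawl_forward"     # default behaviour

# The same input always produces the same "choice":
print(worm_step(0.9, False))   # turn_away
print(worm_step(0.2, True))    # withdraw
```

The point of the sketch is that nothing in the "creature" can deviate from this script; its behaviour is exactly as predictable as the function.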

20 minutes ago, HelloSnakeEyes said:

How so? Aren't computers tied down to inputs from an outside source? 

Well, sort of. Computers in this context receive a bunch of information, go through increasingly complex processes involving that information, and arrive at a choice or decision about what to do in response.

In the sense that the process they use to make decisions is a fixed program the computer can't deviate from, their ability to choose is limited, but in the same way a worm's ability to choose is limited by the "program" spelled out in the physical structure of its nervous system.


Does a computer really have a choice? For example, say a computer has to assign a color to a place. Can it assign that color to any place, or does it have to assign it to a particular place, given a specific program?

 

Let me get to my point: the human brain functions quite differently from a computer. A computer is a very efficient linear operator; it goes from A to B to C and so on very quickly. The human brain isn't nearly as fast, but it has one special thing that computers can't quite master: horizontal, or fuzzy, thinking. Thoughts bleed into each other in ways that cannot simply be predicted by the nature of the input. If you put "2+2" into a computer program, you will get "4" as an output. Humans, on the other hand, can give you an array of answers, from "4" to "5" to Jello to grape icy pops. It's not a matter of correctness or truth, but of the fact that those answers can be thought at all. It's like a computer has blinders on: its set of answers is limited.
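The deterministic half of this contrast is easy to demonstrate, and the "fuzzy" half can at least be caricatured. This is purely a toy sketch; the "associations" (Jello and all) are invented to echo the post above:

```python
import random

def calculator(a, b):
    # A conventional program: one input, one fixed output.
    return a + b

def daydreamer(a, b, rng=None):
    # A toy "fuzzy" responder: the answer is sampled from loose
    # associations, so repeated calls can give different answers.
    rng = rng or random.Random()
    return rng.choice([str(a + b), "jello", "grape icy pops"])

print(calculator(2, 2))    # always 4
print(daydreamer(2, 2))    # "4", "jello", or "grape icy pops"
```

Of course, the daydreamer's answers are still drawn from a fixed list, which is exactly the "blinders" question: is human fuzziness genuinely open-ended, or just a much bigger list?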

40 minutes ago, HelloSnakeEyes said:

Thoughts bleed into each other in ways that cannot simply be predicted by the nature of the input.

This can actually happen in computers right now, for whatever it's worth.
 

 

41 minutes ago, HelloSnakeEyes said:

If you put "2+2" into a computer program, you will get "4" as an output. Humans, on the other hand, can give you an array of answers, from "4" to "5" to Jello to grape icy pops. It's not a matter of correctness or truth, but of the fact that those answers can be thought at all. It's like a computer has blinders on: its set of answers is limited.

This is potentially a matter of perspective. It is possible that our set of answers is limited in the same way, but the possible answers are so vast and so varied that we are unable to perceive the limitation.


In truth, the answers we can give are not so vast, but since we are not capable of more, they seem vast. E.g., a jump looks extremely difficult to me because I can't conceive of doing more, but a vaulter sees that same jump as inconsequential. Computers will probably never do right-brain thinking, and if that is how you define true intelligence, then no, they will never be intelligent in the same way as humans. However, they are the future of left-brain/analytical thinking, if you don't already consider them the present.

On 12/3/2019 at 7:19 AM, MiraMeyneth said:

If you think about it, we really aren't all that different from the machines we control. The human mind and body is technically speaking, a hyper-advanced organic computer honed by millions of years of evolution.

If you're interested in that question, I recommend "A Woman Looking at Men Looking at Women" by Siri Hustvedt, who argues (conclusively, IMHO) that our brains are NOT organic computers: that the interaction between our nervous system and the rest of our bodies creates behaviour that is fundamentally different from the information processing of a computer.

  • 2 weeks later...

Computers and neuronal networks do have similarities. I've read something somewhere about a kind of computer designed to copy neurons (neuromorphic chips). It's also possible that plants' root systems act that way too, so there's a possibility trees could be sentient. Different kinds of networks can lead to the same phenomena (say, memory).

The most obvious difference, to me, is that the flesh computers still alive today are built in a way that optimises being alive (evolution and all that). They're also unstable, which means they use processes to maintain their stability. They chemically react to their surroundings and have developed tools to make that mostly favourable.

The tools I mean range from mitochondria to emotions. If a tool makes you more likely to survive, you keep it. If not, you keep it, but you might die.

Inorganic computers can't tend to themselves like this; they aren't a product of evolution. They wouldn't survive in the wild right now. But with more time, I don't see why not.

It's true, we are incredibly complex molecular machines. We can only build much simpler machines right now, but simplicity isn't necessarily an obstacle to the emergence of sentience.
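The "computer copying neurons" idea can be shown in miniature: a single artificial neuron is just a weighted sum pushed through a threshold. This sketch hand-picks the weights (nothing here is learned), so it is the idea at its absolute simplest:

```python
def neuron(inputs, weights, bias):
    # A McCulloch-Pitts-style unit: weighted sum of inputs plus a
    # bias, passed through a hard threshold. This is the "copying
    # neurons" idea in miniature.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# An AND gate built from one neuron (weights chosen by hand):
print(neuron([1, 1], [1, 1], -1.5))  # 1: both inputs active
print(neuron([1, 0], [1, 1], -1.5))  # 0: only one input active
```

Networks of these units, with learned rather than hand-picked weights, are what modern neural networks scale up, though, as noted below, real neurons are influenced by far more than their inputs.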


I believe that, yes, we are "mushy computers" in many ways. There are several differences, though; the main one is that we have sentience and a soul. The reason we build computers is that we want to create something that takes certain aspects of a human, focuses on one trait, and increases it tremendously, like processing power. Another reason is that we understand the logical parts of a human but not the emotional, irrational parts, such as altruism. That's why computers are only a brief piece of humanity inside of a case. They don't have a soul.

Though it may be possible one day.

 


Souls... 

That's a good question. 

I don't believe in souls, because skepticism. 

Maybe what we call the soul is just sentience, which stems only from neuronal networks and all the molecular machinery around them? Can't say I believe that either. Skepticism.

But computers aren't "just" anything. They have no less and no more intrinsic worth than the ones made of flesh and blood, because there's no such thing as intrinsic worth.


@NoelciMeta, but computers can have consouls :P


What rubbish. The Dreamcast thought about how to beat you while you slept. Clearly it had a soul.

 

 

Also, the Matrix was clearly heavily inspired by everything in this commercial 😂.

 

Lonemathsytoothbrushthief
23 hours ago, NoelciMeta said:

Computers and neuronal networks do have similarities. […] It's true, we are incredibly complex molecular machines. We can only build much simpler machines right now, but simplicity isn't necessarily an obstacle to the emergence of sentience.

Computers and neuronal networks are pretty different. For a start, neuronal firing isn't caused just by other neurons, but by glia, the environment around the neurons, hormones, etc., and there's just so much that computers can't do. No computer is simulating 86 billion neurons plus a likely equal number of glia, plus the environment of the body, which interacts with the environment outside, and all the feedback loops within.

Tbh, I think if we were trying to create sentience, we'd have to accept it may well not be in a form we can understand. The learning and information gathering within the human body is subjective; the way we perceive the world is entirely unscientific, just the result of what worked for survival. So I suppose our bodies process and interpret information according to what they believe is useful, whereas computers do it based on what we, through our bodies, have decided may be useful for a version of the world we'd like to inhabit. I think that's the bit that confuses everything: they're not doing things according to the physical world in any real sense, only according to a selective perception of what the world is. Human bias will ultimately prevent sentience from being achievable, in my opinion.
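The point that firing depends on more than other neurons can be caricatured with a leaky integrate-and-fire neuron, a standard simplified model, here given an invented "modulation" knob standing in for hormonal/glial influences. The knob and its effect are my simplification, not a real model of those influences:

```python
def simulate_lif(current, modulation, steps=200, dt=1.0,
                 tau=20.0, v_rest=0.0, threshold=1.0):
    """Leaky integrate-and-fire neuron, a deliberately crude model.

    `modulation` scales the firing threshold, a stand-in for the
    hormonal/glial context mentioned above (an invented knob).
    """
    v, spikes = v_rest, 0
    for _ in range(steps):
        v += dt / tau * (-(v - v_rest) + current)   # leaky integration
        if v >= threshold * modulation:             # modulated threshold
            spikes += 1
            v = v_rest                              # reset after a spike
    return spikes

# The same input current fires less often when the modulatory
# signal raises the threshold:
print(simulate_lif(current=2.0, modulation=1.0))
print(simulate_lif(current=2.0, modulation=1.5))
```

Even this toy shows the post's point in miniature: identical synaptic input produces different firing depending on context the inputs alone don't capture.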

 

Oh, I should say: possibly the problem with sentience is that we've constructed the concept around forms of life which we do not understand, and realise we don't understand. It might be that the concept is already full of human bias. Else, why wouldn't we appreciate the sentience of a virus, bacterium, or worm equally to that of a rabbit or human? Or maybe it's the conflation of behaviour with internal worlds: interpreting lifeforms with complex and unpredictable behaviour patterns as having sentience, rather than those with simpler behaviour patterns.

 

Honestly, I do kind of feel like I respect all living creatures. Regardless of fear or disgust, I don't think I could judge any of them as not being sentient. It's weird, and maybe leaning more into spirituality than science, though.

  • 3 weeks later...
  • 2 weeks later...

I read that supercomputers still can't adequately simulate a mouse's brain, which suggests we're closer to animals than to computers. Imagine how many supercomputers would be needed to simulate animals with more self-control.


(image: a_bunch_of_rocks.png)


There is a fundamental difference between not understanding how a computer works and that computer, which you do not understand, not being a computer.


Archived

This topic is now archived and is closed to further replies.
