MKJ wrote: "the simplest way to explain why there is no real AI yet (and it will probably take a while) is because a computer doesn't know a 'maybe', only 'yes' and 'no'."
There's always fuzzy logic, of course, but that's just assigning values between m and n (usually zero and one) to desired targets based on fitness and then picking the highest-ranking one from the set.
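A minimal sketch of that fuzzy-scoring idea in Python (the targets, readings, and ramp-style membership function are made up for illustration):

```python
def fuzzy_score(value, lo, hi):
    """Map a raw reading onto a membership grade in [0, 1] via a linear ramp."""
    if value <= lo:
        return 0.0
    if value >= hi:
        return 1.0
    return (value - lo) / (hi - lo)

# Hypothetical targets with raw "fitness" readings.
targets = {"alpha": 3.2, "beta": 7.9, "gamma": 5.5}

# Grade each target between 0 and 1, then pick the highest-ranking one.
scores = {name: fuzzy_score(v, lo=0.0, hi=10.0) for name, v in targets.items()}
best = max(scores, key=scores.get)
print(best)
```

Instead of a hard yes/no, every target gets a graded "maybe", and the decision falls out of ranking those grades.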
Random Thought #25
Re: Random Thought #25
I miss Kracus's random thoughts. So insightful...
Re: Random Thought #25
lulz.... I actually elaborated further on that thought by suggesting prosthetics as a path to AI.
There's been development on prosthetic implants that can either give mice long-term memory or even implant new memories they didn't have previously. In the study, they used a maze and somehow programmed the solution to the puzzle into a chip they implanted in the mouse. When they turned it on, the mice knew right away how to get to the cheese; if it was off, it took them longer.
If you think about that in terms of a prosthetic, say you have Alzheimer's or something, you could get this prosthetic implanted and it would give you back your long-term memory. Let's assume for a minute that over time, humans will learn to perfect a prosthetic. We did start out with clubs for limbs, and now we've got robotic arms, eyes, ears, etc. We're slowly getting closer and closer to being able to perfect a prosthetic. Let's assume a bit further and also think of brain functions as potentially replaceable prosthetics. You go to the hospital, they put you under, and you have an operation that replaces a part of your brain, but the piece they fit you with works exactly like the old one. So you wake up and feel no different, except you're now able to remember properly, or you no longer have a headache you used to have.
Over time, as more and more of your brain is replaced with prosthetics, you're suddenly more technological than biological, and eventually you're entirely technological. At what point in this process do you stop being you and start being considered an AI?
It also calls into question the nature of consciousness and how that works. Is it tied to your body? My thought is that it isn't and could potentially be transferred.
Of course, that also implies that you're just a collection of labels your brain has attached to itself. Strip those away, and are you dead, or are you someone else? So if labels aren't really who you are, then what are you? Are you really unique in a sea of other consciousnesses, or are you the sea?
Re: Random Thought #25
TLDR: Make robot brain and upload your own fucked up head into it.
1st up, I know memories aren't actually 'stored' in our brain. I'm not a professional, but from what I understand it's a pattern of impulses through the neurons; it's those patterns that form our memories, and there's no guarantee our brains interpret those patterns the same way as each other.
BUT I always found it fascinating that you could upload your brain to a computer that can interpret or mimic those impulses. I mean, hypothetically speaking, if you were to switch it off, it would be murder.
[color=red] . : [/color][size=85] You knows you knows [/size]
Re: Random Thought #25
Memphis wrote: "Worth the bump for the dank mimes."
don't you mean dark mines, welshie?
Re: Random Thought #25
Consciousness is a byproduct of subconscious thoughts. It's there to sort through the tons of data that your brain is processing. Maybe it doesn't really matter if you're biological or not. It's not like you have free will or are even aware of what the fuck is going on most of the time.
http://gizmodo.com/new-time-slice-theor ... 1770950927
http://blogs.scientificamerican.com/min ... free-will/
Re: Random Thought #25
what do you mean by "free will"?
Re: Random Thought #25
DooMer wrote: "Consciousness is a byproduct of subconscious thoughts. It's there to sort through the tons of data that your brain is processing. Maybe it doesn't really matter if you're biological or not. It's not like you have free will or are even aware of what the fuck is going on most of the time.
http://gizmodo.com/new-time-slice-theor ... 1770950927
http://blogs.scientificamerican.com/min ... free-will/"
That's just kinda like saying your mind remembers what's pleasurable and what causes discomfort, though? Like, a pattern of impulses in the brain is remembered by the interactions it has with the neurons that release chemicals. Our entire thought process is governed by chemical processes, no different from animals. There have been many theories on what can be classified as higher intelligence, from self-awareness to forward thinking, but all (most?) of it can be exhibited in the animal kingdom.
Humans evolved higher brain functions, and that gave us the evolutionary edge. Over the generations it raised our mental capacity to do all the things animals could, plus more. We have been bred for the retention of memories, because it's these memories that form the way we cook, hunt, talk, swim, etc., and over time we learned how to store useless shit like algebra in them and refine the way those memories are searched, through better chemical interactions within the brain.
Computers are still nowhere near that kind of capacity, and not just in terms of hardware but everything, like our search algorithms and the entire way they're implemented. Who knows what the future will bring. I mean, I can't see why a computer couldn't mimic a human in the far and distant future. It's going to be an evolutionary process, though, and we've already made some good progress, but it's like theorising about faster-than-light travel. We won't be seeing any of it regardless.
Re: Random Thought #25
Guest wrote: "I was thinking about artificial intelligence a moment ago and I realized"
Fuck u...
Re: Random Thought #25
"Unbeknownst to participants, the circle that lit up red on each trial of the experiment was selected completely randomly by our computer script. Hence, if participants were truly completing their choices when they claimed to be completing them—before one of the circles turned red—they should have chosen the red circle on approximately 1 in 5 trials. Yet participants' reported performance deviated unrealistically far from this 20% probability, exceeding 30% when a circle turned red especially quickly. This pattern of responding suggests that participants' minds had sometimes swapped the order of events in conscious awareness, creating an illusion that a choice had preceded the color change when, in fact, it was biased by it.
Importantly, participants' reported choice of the red circle dropped down near 20% when the delay for a circle to turn red was long enough that the subconscious mind could no longer play this trick in consciousness and get wind of the color change before a conscious choice was completed. This result ensured that participants weren't simply trying to deceive us (or themselves) about their prediction abilities or just liked reporting that they were correct."
This... explains a lot.
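The 20% baseline in that study is just the chance of matching one circle out of five. A quick simulation (hypothetical, not the study's actual code) shows why honest guesses can't sit near 30%:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def trial():
    # Participant commits to one of five circles...
    guess = random.randrange(5)
    # ...then the script picks the red circle completely at random.
    red = random.randrange(5)
    return guess == red

n = 100_000
rate = sum(trial() for _ in range(n)) / n
print(round(rate, 3))  # hovers around the 1-in-5 chance level
```

Any sustained reported rate well above 0.2 has to come from somewhere other than prediction, which is the paper's point about consciousness reordering events after the fact.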
[quote="YourGrandpa"]I'm satisfied with voicing my opinion and moving on.[/quote]