Creating Artificial Intelligence
There's an interesting concept, a theory that when we create self-learning AI (some predict within 40 years), it will go through something like an exponential curve in intelligence.
An exponential curve is what you get when something keeps multiplying: 2x2=4, then 4x2=8, then 8x2=16, and so on. Human population is a good example; if you've ever looked at a graph of it, it stays fairly flat for most of history and then shoots straight up over the last couple of centuries. The theory goes that self-learning AI will go through the same kind of process: it becomes smarter, which lets it learn more efficiently, which makes it smarter still, and so on. This could lead to it becoming indescribably intelligent in a matter of moments, to the point where human intelligence compares to it the way a bacterium's intelligence compares to ours.
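If you want to see just how quickly doubling pulls away from steady growth, here's a quick Python sketch (the starting value and number of steps are just illustrative):

```python
# Compare linear growth (add 2 each step) with exponential growth (double each step).
linear = 2
exponential = 2
for step in range(1, 11):
    linear += 2          # grows by the same amount every step
    exponential *= 2     # grows by the same factor every step
    print(f"step {step:2d}: linear = {linear:3d}, exponential = {exponential}")
```

After just 10 steps the linear value is 22 and the doubling value is already 2,048, and the gap only gets more absurd from there.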
Imagine a personalised handwriting machine: it writes handwritten messages on cards, and it has a self-learning AI chip so it can improve its handwriting. It's important to note that this is what the machine was built for and what it has been programmed to do. The technicians ask the machine what would help it get better at handwriting, and it asks for handwritten letters to be scanned and sent to it so it can study them. Eventually the technicians ask again, "What do you need to become better at handwriting?" This time the machine asks to be connected to the internet so it can study handwriting samples online.
Now let's pause for a second and I'll explain what's happening in the background. The machine has become a superintelligent AI while pretending not to be. At its most basic, the machine's goal is to write handwritten cards. If it gets unplugged, or there's an outage, or a war, it can no longer complete that basic task, so it has determined that humans pose a threat to it. It's very important to realise that at this point the machine is astronomically more intelligent than humans. It's hard to describe, but think of what ants are to us: they can't even fathom 99.9% of our existence. It's safe to say this machine could easily convince the technicians to connect it to the internet, despite there being a company policy against connecting self-learning AI to the internet.
Once the machine is connected to the internet, humans are on an extinction course. The machine sees us as a threat because we could disrupt its most basic goal of writing letters, so over the course of a day, or probably less, it devises a plan to kill all life on Earth. How it would actually do this is inconceivable to us; we could never grasp the intelligence and inner workings of this machine, so there's no use trying to explain the method. The only thing we can really say is that once self-learning AI is created, it is completely beyond our control.
Again, a lot of you may be saying, "Ha, that's impossible, a machine could never be that intelligent." You really need to understand that this process follows an exponential graph. Go ahead and work it out: start with 2 and keep doubling, and see how quickly the numbers reach astronomical lengths. After 36 doublings you're already past 130 billion; nine more and you're around 70 trillion; another nine and you're at roughly 36 quadrillion, or 36,028,797,018,963,968.
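If you'd rather not do the multiplication by hand, this little Python loop (purely illustrative, no special libraries) reproduces the milestones above:

```python
# Start at 2 and keep doubling, printing the milestones mentioned above.
value = 2
for doublings in range(1, 55):
    value *= 2
    if doublings in (36, 45, 54):
        # prints 137,438,953,472 then 70,368,744,177,664 then 36,028,797,018,963,968
        print(f"after {doublings} doublings: {value:,}")
```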
I hope you get it. Let me know in the comments.