Glitchfinder
Staff
Well, I've been thinking about this subject for a while, and wanted to find out what you guys thought. First off, I wanted to give some information.
There are experiments all over the world dedicated to the concept of AI, or "Artificial Intelligence". People try to program it, build it, and generally try to make a new kind of being. One example of an experiment that could result in AI is one in America, where a scientist is trying to build a digital copy of the human brain at the molecular level. Theoretically, if he succeeds, the brain could become conscious and thus able to think for itself.
If AI appears, would we have the right to tamper with it? Would we have the right to shut it down? Or would that be considered subjugation and murder, respectively? Is it even possible to create AI? If AI is created, would it value humans for creating it, or do you think it would look down on humans as lesser beings, worthy only of subjugation or destruction? If AI is created, would its feelings, thoughts, and emotions be any less real to it than ours are to us, just because it is a machine?
As you can see, there is a plethora of questions regarding the morality and reality of AI. What do you think? Do you think it is possible, and, if so, what opinions do you have about what will happen to it, to us, and to how we interact?
My opinion is that it is not only possible, but inevitable. I believe that if AI is created, we would be too terrified to give it any position of power, and would instead use it only as a servant, never allowing anything with AI to become as much as it could be. But that is just my opinion. I'd like to know what you think!