In a letter presented at the International Joint Conference on Artificial Intelligence in Buenos Aires, the group wrote that "AI technology has reached a point where the deployment of [autonomous weapons] is – practically if not legally – feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."
The letter asks the United Nations to ban the use of autonomous weapons.
The argument, as The Guardian points out, is that going to war would be an easier decision if robots do the fighting.
Drone strikes are already a contentious issue in the U.S., but reliable statistics for how many are killed by those strikes overseas every year are tough to come by.
Civilian deaths caused by drones are also an issue, though President Obama has defended their use.
"Actually, drones have not caused a huge number of civilian casualties. For the most part, they've been very precise precision strikes against Al-Qaeda and their affiliates, and we're very careful in terms of how it's been applied," Obama said.
Musk has warned of this kind of AI takeover before, including this August 2014 tweet reading, "We need to be super careful with AI. Potentially more dangerous than nukes."
Aunt Jeannie, mankind likes wars; it has been like that since the beginning of time.
Six of one, half a dozen of the other. What do you think?
Your junior scientists: Jonny and Chris
Hello my favorite junior scientists
Thank You
I love this article
Wow! What a great topic! You could make such a fantastic debate out of this.
Autonomous Weapons Systems (AWS) are defined by the U.S. Department of Defense as weapons that, once activated, can select and destroy targets without intervention by a human operator. This will change the entire structure of war. Worldwide concern has been growing about the idea of developing weapons systems that take human beings “out of the loop.”

An AI cannot make ethical and moral judgments. It has no intuition to perceive when a change of situation might call for a different plan. It has no compassion for the people it kills. When activated, it has one goal, one prime directive, and it may not be so easy to cancel the mission or call back the AWS in the future.

I believe it is ethically necessary to keep that human factor in warfare (although I do not agree with warfare at all, I realize it is not going away any time soon). It is too easy to push a couple of buttons and let robots take over the battle. Then one does not have to feel remorse or responsibility for the lives that are taken. And one could easily lose touch with the reality that human beings are being annihilated.
If a given AWS is merely applying a set of preprogrammed instructions, then, presumably its designers and operators are the ones morally responsible for its behavior. But if the AWS in question is capable (at some future date) of making genuine moral judgments in its own right, then that would appear to shift the weight of responsibility to the AWS itself. And if this is the case, who is legally and morally responsible for the decisions the AWS makes and the people it kills?
We are going in a bad direction but I don't see how we can stop the progress.
The future holds even greater fears. One possibility is antimatter weapons: scientists are already working with antimatter and have isolated antimatter particles at CERN.
When matter and antimatter collide, they completely annihilate each other, so you can see the military potential. One gram (about a quarter of a teaspoon) of antimatter annihilating with one gram of matter would release the equivalent of 42.96 kilotons of TNT, approximately three times the yield of the bomb dropped on Hiroshima. There would be no radioactive fallout or nuclear winter, though the blast itself would still be devastating.
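That 42.96-kiloton figure falls straight out of E = mc². Here is a quick sketch of the arithmetic, using the standard definition of a kiloton of TNT (4.184 × 10¹² joules) and the commonly cited ~15 kt estimate for the Hiroshima bomb:

```python
# Energy released when 1 g of antimatter annihilates with 1 g of matter.
C = 2.99792458e8      # speed of light, m/s
KT_TNT_J = 4.184e12   # joules per kiloton of TNT (standard definition)
HIROSHIMA_KT = 15.0   # commonly cited yield estimate for the Hiroshima bomb

mass_kg = 2e-3        # 1 g antimatter + 1 g matter, all converted to energy
energy_j = mass_kg * C**2
yield_kt = energy_j / KT_TNT_J

print(f"{energy_j:.3e} J = {yield_kt:.2f} kt TNT "
      f"(~{yield_kt / HIROSHIMA_KT:.1f}x Hiroshima)")
# prints: 1.798e+14 J = 42.96 kt TNT (~2.9x Hiroshima)
```

So the number in the text checks out: two grams of mass-energy comes to about 1.8 × 10¹⁴ joules, or just under 43 kilotons.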
The delay in this area of research is that antimatter is very hard to produce, even harder to contain, and very dangerous. The expense would also be sky-high, with enormous amounts of energy used: with the knowledge in use today, it would take thousands or even a million years to produce one gram of antimatter. Our present technology is just not sophisticated enough to handle this ... yet. But you and I know they will continue to research. For now, they are still working on fusion bombs. If these kinds of weapons were loaded on an AWS, can you imagine the enormous responsibility placed on an AI? Would you want the future of the world to rest on a bunch of printed circuits?
As for whether mankind is naturally or genetically warlike: this is another topic I like, and I bet you could make an awesome debate about the pros and cons.
This is the way I see that situation. If we all enjoyed war, there would be nobody left and efforts at peace would be futile. We all would have just given up and shot each other by now. What we have is a balance of war mongers and peacekeepers. Sometimes the balance tips one way and sometimes the other way.
When we became a species, we were not warlike. We were driven by a biological imperative, which means just surviving and reproducing. So we were dependent on each other to watch each other's backs, hunt together and huddle together in the communal cave for warmth. Of course we had to be a little aggressive and up for a challenge, so we could hunt and explore and defend ourselves from a world full of predators. So we had no time or need for war.
So, what changed us? Scientists think war came into being about ten thousand years ago, when we discovered agriculture. We began to acquire things that other people wanted and tried to steal: stored food, domesticated animals, tools, and little plots of cultivated land. Men became territorial about their plots of land and possessions and defended them. This appears to be when hostilities between men became a part of our history.
The danger lies in believing that the violence we see now is an innate or natural instinct for war. When you believe that, it can become a highly destructive, self-fulfilling prophecy, not only closing off possible avenues of peaceful conflict resolution, but actually making war more likely.
If you have an argument that can change my mind then 'come on back'.
Much love,
Peace Keeping Genie
PS: Sorry guys, I got pretty long winded. But these are things I like to discuss.
Through this ever open gate
None come too early
None too late
Thanks for dropping in ... the PICs