Bishop should go. Good idea.
April 13, 2004 3:01 PM
The robot should go in first. Between 50 and 100 Packbot [13MB wmv] unmanned ground vehicles (UGVs) are currently being used for battlefield reconnaissance. One proved its worth last week when it uncovered a bomb and was destroyed in the process. Colin Angle, CEO of Packbot maker iRobot, doesn't rule out the eventual weaponizing of UGVs and quips, "we're not using these robots to hand out flowers".
The CEO began by talking about the cool robot vacuums and how there's a huge Roomba hacking community that the company supports. Then she talked about loving to build robots as a kid. So then she segued into the military robot research.
And then she scared the shit out of everyone in the room. She showed robots that could be thrown off buildings and right themselves; they were strong enough to be thrown through windows into fire. And they had military simulations from the year 2015 where robots basically get air-dropped and then some tech controls what it sees and shoots, essentially becoming a true nintendo war, and haves vs. have-nots where robots kill humans at the touch of a button.
During the questioning, many raised the ethical issues, but the CEO dodged them. She said they would never sell robots to terrorists and that robots didn't make any decisions, only the humans controlling them did. Then someone asked what happens when the robots are subject to human error, when someone accidentally shoots folks, or maybe a corrupt police dept in the US gets one and uses it on a crowd. But it seemed like those situations never occurred to anyone at iRobot.
I love technology and the latest gadgetry, but when it's engineered to efficiently kill humans, it's troubling.
posted by mathowie at 3:16 PM on April 13, 2004
There's no stopping it mathowie... seriously. I mean, robots that kill just seem to be part of the "natural" evolution of said technology.
posted by Witty at 3:30 PM on April 13, 2004
No points for dropping the original story title (which is still visible in the page header)... :)
Then someone asked what happens when the robots are subject to human error, when someone accidentally shoots folks, or maybe a corrupt police dept in the US gets one and uses it on a crowd.
How is that any different from existing technology used by law enforcement? Should the robots be responsible in any way for correcting human errors? Is that a road we really want to go down?
posted by elvolio at 3:51 PM on April 13, 2004
I love technology and the latest gadgetry, but when it's engineered to efficiently kill humans, it's troubling.
Well, the U.S. military budget alone is about $2 billion per day. I don't think there's any other industry that has that huge of a budget to develop technology. Thus I imagine we'll see a lot of the newest gadgets coming in the form of "killing machines" of one sort or another.
posted by sixdifferentways at 3:59 PM on April 13, 2004
No chance this is viral marketing?
On a semi-related but more serious note, what does the "i" stand for? The first thing I noticed with it was the iMac, and then you had the iPod, so it struck me as an Apple branding thing, but then it started popping up all over the place. Given that Apple isn't crying trademark infringement left and right, where does the "i" come from?
posted by rafter at 4:23 PM on April 13, 2004
Given that Apple isn't calling trademark infringement left and right, where does the "i" come from?
Isaac Asimov's 1950 short story collection I, Robot (on which the film you linked to is based). The book included the Three Laws of Robotics:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
So iRobot would seem to be violating I, Robot's first law.
posted by donovan at 4:46 PM on April 13, 2004
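Donovan's Three Laws read as an ordered veto list, and the precedence is easy to make explicit. Below is a minimal toy sketch, not any real robotics API: the Outcome fields are hypothetical stand-ins for the genuinely hard part, namely knowing whether an action would harm a human.

```python
# Toy encoding of Asimov's Three Laws as ordered checks; the Outcome
# fields are hypothetical stand-ins, not a real robotics interface.
from dataclasses import dataclass

@dataclass
class Outcome:
    harms_human: bool       # action would injure a human
    neglects_human: bool    # inaction would let a human come to harm
    ordered_by_human: bool  # a human commanded this action
    destroys_self: bool     # action would destroy the robot

def permitted(o: Outcome) -> bool:
    # First Law: never injure a human, nor allow harm through inaction.
    if o.harms_human or o.neglects_human:
        return False
    # Second Law: a human order (already First-Law-safe) must be obeyed.
    if o.ordered_by_human:
        return True
    # Third Law: otherwise, self-preservation rules out self-destruction.
    return not o.destroys_self

# A weaponized UGV fails at the first check: any "fire on a person"
# action has harms_human=True, so no chain of orders can permit it.
print(permitted(Outcome(harms_human=True, neglects_human=False,
                        ordered_by_human=True, destroys_self=False)))  # False
```

The ordering is the whole point: obedience can never override safety, and self-preservation can never override either.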
So iRobot would seem to be violating I, Robot's first law.
Now, my entire understanding of the finer plot points of Asimov's I, Robot comes from watching the trailer on Quicktime, but is that really where the "i" comes from and why it is used so widely? I really can't see that being a positive brand-building image for the iMac, et cetera...
posted by rafter at 5:39 PM on April 13, 2004
I'm sorry, does our military need robots now? It seems like technology is making war easier to stomach for those who have it, and more deadly and (for lack of a better word) unfair to those who don't.
The nature of war changes when there is a reasonable expectation on one side of being totally annihilated on the battlefield, and on the other side, the reasonable expectation that you won't lose any soldiers on the battlefield.
What it changes into is a situation where the powerful side no longer has any fear of going to war, and the weak side no longer has any options but to pursue terrorism.
No apologies for "going there" -- I think this technology has overtly political implications.
posted by Hildago at 5:44 PM on April 13, 2004
iRobot... doesn't rule out the eventual weaponizing of UGVs and quips "we're not using these robots to hand out flowers".
It's a damned shame they won't consider it, though, because just try to imagine how much more efficiently robots could hand out flowers. WOW.
posted by Hildago at 5:46 PM on April 13, 2004
Maybe a more apt reference than the (admittedly classic) I, Robot corpus is the e-sheep comic "Spiders". Not only does it take place in a modern mid-east battlefield (Afghanistan IIRC), but the bots are much closer: small remote-controlled recon drones, not truly autonomous "robots". Good stuff - without giving too much away, they play some very clever games with the "observers".
posted by freebird at 5:51 PM on April 13, 2004
Some notes from a roboticist-in-training:
First, there's no chance that this is viral marketing for the movie. iRobot has been around for a very long time; my grad school owns many of their robots. They're hardly a huge, sinister corporation either, though I'm sure they do a brisk business with people doing military research.
Second, the technology necessary for creating fearsome autonomous killbots is a long way out. A lot of people don't realize this. As an example, my officemate is working on code that recognizes buildings in images--that we're still working on such a specific task hints that we're far away from the general vision system that any thriller movie juggernaut would need in the first place. In short: robots are remarkably stupid things, and they'll be stupid for many years to come. (If you like, I can continue in great detail on just how dumb they are.)
For that reason, robots won't be making many important decisions on their own for some time. They just can't do it. There will still be a human being close behind the scenes for a while.
We're in a good position right now to discuss the ethical issues surrounding military robots. We should certainly be doing it. All I'm saying is that we have some time.
Personally, I can't foresee robots killing more people, or making death any more abstract and distant, than current innovations like the atomic bomb or landmines. Nobody can truly understand the wholesale, instant destruction of an entire city, and nobody seems to think much about the person who will be unlucky enough to lose a leg to a mine. The "nintendo war" argument is a bit misleading that way---it seems to me that we're already quite good at ignoring the human significance of our targets.
The one thing that makes robots different is that deploying them will likely be politically cheap, and here's where the nintendo war argument really comes to bear. Hildago makes the point well.
posted by tss at 5:58 PM on April 13, 2004
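tss's building-recognition example is worth making concrete. The sketch below is a hypothetical toy, assuming OpenCV is installed, and emphatically not anyone's actual research code: it scores an image for "building-ness" by measuring straight-line density, and it fails constantly, which is exactly the narrowness tss is describing.

```python
# Toy illustration of how narrow task-specific vision code is: a crude
# "building-ness" score from straight-edge density. Not a real building
# detector; real research systems are vastly more involved.
import cv2
import numpy as np

def buildingness_score(image_path: str) -> float:
    """Return a crude score in [0, 1] based on long straight line segments."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    edges = cv2.Canny(img, 100, 200)  # binary edge map
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    if lines is None:
        return 0.0
    # Buildings tend to produce many long straight segments; trees and
    # faces mostly don't. The normalization constant is arbitrary.
    total_len = sum(np.hypot(x2 - x1, y2 - y1)
                    for x1, y1, x2, y2 in lines[:, 0])
    return min(1.0, total_len / (img.shape[0] * img.shape[1]) * 10)
```

A heuristic this brittle misfires on fences, bookshelves, and window blinds, and that gap between "detects straight lines" and "understands a scene" is the distance between today's robots and any movie juggernaut.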
Maybe more apt a reference than the (admittedly classic) I, Robot corpus is the e-sheep comic "Spiders".
Freebird, Spiders is awesome and chilling--you're right--much closer to the way things seem to be developing with a twist that recalls Ender's Game.
It's been years since I've read I, Robot but I wonder if completely remote-controlled (very little autonomy) 'bots would qualify as robots in Asimov's thinking.
posted by donovan at 6:51 PM on April 13, 2004
If only the Colonial Marines had had enough sense to have Bishop do all the recon for them in Aliens.
posted by alumshubby at 7:02 PM on April 13, 2004
Robots don't kill people. People kill people.
Oh, and alumshubby, he may be synthetic, but he's not stupid...
posted by inpHilltr8r at 7:17 PM on April 13, 2004
I don't really see any new ethical problems. How is looking at a monitor and firing at people from a robot different from looking at a monitor and firing at people from an Apache helicopter? In one case you're outside the machine doing the firing and in another case you're inside.
As long as they're not autonomous, I'd guess the robots would be used for sniping by proxy.
posted by bobo123 at 8:05 PM on April 13, 2004
I wonder how well these things would cope with a good-sized electromagnetic pulse?
posted by arto at 9:53 PM on April 13, 2004
we're not using these robots to hand out flowers
Japanese Prime Minister uses robot to hand out flowers
posted by eddydamascene at 11:28 PM on April 13, 2004
What interested me was this:
"One robot was blown up," said retired Vice Adm. Joe Dyer, general manager of iRobot's government and industrial robotics division...
...in relation to this:
The Army will be able to deploy these units "from bases in the United States directly into the open desert," said retired Lt. Gen. Daniel Zanini, corporate vice president at Science Applications International.
The rules that allow Pentagon officials to accept defense industry jobs are complex and vary depending on seniority and position. Before government officials can begin negotiating jobs in the private sector, they must recuse themselves from making decisions that could have a financial impact on their potential employers. Also, officials who take jobs with contractors are prohibited from representing those companies on projects they supervised or worked on while in the government.
I suppose there are only so many jobs for retired generals and vice-admirals on 24-hour news channels...
But the defense companies are so huge and the rules so elastic that loopholes exist. For instance, in most cases, if a contracting official has awarded a significant deal to a company, the official can't accept compensation from that company for a year after leaving the government. But the official can avoid the restriction by taking a job at an unrelated unit of the same company, industry analysts say. Also, senior government officials who move into the business world are prohibited from lobbying their former agencies for a year, but they can work behind the scenes to help a company develop strategies to pursue federal contracts.
posted by riviera at 1:14 AM on April 14, 2004
So iRobot would seem to be violating I, Robot's first law.
Nope, because the Packbot is not a robot; it's an RC (remote control) device. In order for a machine to be a robot, it has to be autonomous, which as tss points out, is still in the future.
How is looking at a monitor and firing at people from a robot different from looking at a monitor and firing at people from an Apache helicopter?
The heli needs to be in the area and the people inside it are still risking their balls. But it is true that there are other existing forms of remote warfare, such as a variety of long range missiles. Very cowardly, if you ask me. Then again, as a soldier in Iraq shouted into the TV the other day "We're marines; we don't fight a fair fight; we want to win every time".
posted by magullo at 4:43 AM on April 14, 2004
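magullo's RC-versus-robot distinction comes down to who closes the sense-decide-act loop, and a minimal sketch makes it concrete. Every class here is a made-up stand-in, not a real robotics API:

```python
# Minimal sketch of teleoperation vs. autonomy; all classes below are
# hypothetical stand-ins invented for illustration.

class RadioLink:
    def receive(self) -> str:
        return "forward"              # pretend a human operator sent this

class Sensors:
    def read(self) -> dict:
        return {"obstacle": False}    # pretend onboard sensor reading

class Policy:
    def decide(self, obs: dict) -> str:
        return "stop" if obs["obstacle"] else "forward"

class Motors:
    def execute(self, command: str) -> None:
        print(f"executing: {command}")

def teleoperated_step(link: RadioLink, motors: Motors) -> None:
    # RC device: every action originates with a human on the radio link.
    motors.execute(link.receive())

def autonomous_step(sensors: Sensors, policy: Policy, motors: Motors) -> None:
    # Robot in the strict sense: the machine itself chooses the action.
    motors.execute(policy.decide(sensors.read()))

teleoperated_step(RadioLink(), Motors())
autonomous_step(Sensors(), Policy(), Motors())
```

The Packbot today runs the first loop; Asimov's Laws only become meaningful for a machine running the second.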
The nature of war changes when there is a reasonable expectation on one side of being totally annihilated on the battlefield, and on the other side, the reasonable expectation that you won't lose any soldiers on the battlefield.
What it changes into is a situation where the powerful side no longer has any fear of going to war, and the weak side no longer has any options but to pursue terrorism. Posted by Hildago.
That's probably the most astute observation pertinent to the US war on terrorism that I've read, on MeFi or elsewhere. I'd like to see this theory fleshed out more fully, so it may inform public policy.
posted by yesster at 6:32 AM on April 14, 2004
This thread has been archived and is closed to new comments