View Full Version : Cybernetic killing machines: have these people not seen Terminator?
English assassin
02-16-2009, 23:36
http://technology.timesonline.co.uk/tol/news/tech_and_web/article5741334.ece
People, we need to sit down and think here. I know, I know, killer robots are way cool. IN COMICS. Real killer robots may not be so cool. And where I start to get just a little bit worried is when the US Navy is getting worried (or at least, paying someone else to worry). That is because (1) they probably know about robots and (2) they're basically on board with the idea of blowing stuff up already.
I'm hoping this might all be some inter-service rivalry thing, and the Navy is way behind the Army in using robots so is trying to pretend that robots are a bad thing, because if it's not that, this is a teeny tiny bit scary. What is the sense in banning poison gas and cluster bombs, but not banning terminators?
I just think that a world without machines designed and programmed to kill humans is probably better than a world with machines designed and programmed to kill humans.
Besides which, imagine what the world would be like if a government could declare war without having to explain to the people why it was sending their sons (and daughters) to be killed? We don't seem too backward in having wars as it stands, once politicians can pretend there's little risk to real human soldiers it's going to be mayhem.
Sasaki Kojiro
02-16-2009, 23:46
I approve of killer robots as long as they're like the one in Terminator III...
There is no way a robot could ever weigh up the various ethical, political and human concerns of killing someone. As such, computers should never be built to take human life.
InsaneApache
02-17-2009, 00:53
Where's Sarah Connor when you need her? :inquisitive: :yes:
Beefy187
02-17-2009, 01:00
Where's Sarah Connor when you need her? :inquisitive: :yes:
You better not spoil it... I haven't seen it yet..
So as we start making killer robots, we'll start having an absolute law. Rich people will make better robots. Therefore rich people win..
I suggest killer robots be exterminated immediately..
I'm hoping this might all be some inter-service rivalry thing, and the Navy is way behind the Army in using robots so is trying to pretend that robots are a bad thing, because if it's not that, this is a teeny tiny bit scary.
I doubt there is gonna be any rivalry to slow it down. Even if there is, it will quickly go away when China and Russia start using them.
What is the sense in banning poison gas and cluster bombs, but not banning terminators?
That is assuming robots are gonna be just as indiscriminate in killing civilians, which they most certainly won't be.
Besides which, imagine what the world would be like if a government could declare war without having to explain to the people why it was sending their sons (and daughters) to be killed? We don't seem too backward in having wars as it stands, once politicians can pretend there's little risk to real human soldiers it's going to be mayhem.
That might be a risk, but only for minor wars. Any war against major powers won't be more tempting because of WMD and other stuff. Maybe robots enable us to intervene in a future Rwanda and all the other small stuff we don't want to do anything about while hundreds of thousands of civilians die.
Plus robots don't have a habit of panicking and spraying bullets around them killing innocent bystanders. Neither do they go crazy and raze villages or rape little girls. So maybe robots won't be such a bad thing after all.
CBR
Hooahguy
02-17-2009, 02:43
have you seen Robocop?
have you seen Robocop?
I've seen it. But what does an '80s Hollywood scifi movie have to do with real life? We still haven't been invaded by giant women or ants, so excuse me if I don't see screenwriters as oracles.
CBR
Where's Sarah Connor when you need her? :inquisitive: :yes:
Singing songs most likely. (http://en.wikipedia.org/wiki/Sarah_Connor_(singer))
I agree with a lot of what the article says. It should be pretty hard to make a robot differentiate between a fighter and a civilian as well; you need appropriate sensors and programming, and there are quite a lot of things that could go wrong.
tibilicus
02-17-2009, 03:16
Awesome. I'm converting my cellar into a bunker for when the bombs hit, just like the start of Terminator, and maybe I might form a resistance movement when it all goes pear-shaped as well. I'm sure I'll have a whale of a time, it'll be just like the movies!
Anyway, I don't see what the big deal with the military being increasingly reliant on AI technology is. If we had a whole army of super robots to do our battles for us, not only would it save a lot of human life but we also wouldn't have to deal with war crime courts and the like.
Unless we hear about the robots setting up their own Guantanamo. Then we're in big trouble..
Gregoshi
02-17-2009, 05:55
We can take great comfort in the knowledge that these mechanical warriors will be as indestructible as possible.
To paraphrase Jeff Goldblum's character in Jurassic Park: "They were so focused on whether it could be done, they didn't stop to think if it should be done."
My suggestion for development of these robotic weapons if it must continue is that the AI programmers be required to "volunteer" for the "civilian recognition" testing. And none of that sissy body armour either. :yes:
Sasaki Kojiro
02-17-2009, 06:01
Seriously though, killer robots would be remote controlled--any terminator types would be controlled by a human in a virtual reality type setup. That would be about 10x* easier from a technology perspective.
*I made all this up
Gregoshi
02-17-2009, 06:09
Remote controlled would be preferable. Imagine an army barracks full of men wearing virtual reality headsets and armed with Wiimotes.
The weakness with the remote control idea is that the controllers become the target.
InsaneApache
02-17-2009, 06:13
Aye but you've got to battle through the bloody robots to get to them.
A small criticism of your overview. :sweatdrop:
For now they will be remotely controlled. But I doubt it will be more than a few decades before we have robots as a more or less permanent part of squads and/or platoons. At first just for carrying stuff and medevac and capable of understanding simple orders.
But since they have steadier hands and better eyesight than humans it won't be long after that before they do the shooting too. Human controllers/supervisors would handle a squad of them instead of only one. Or escort vehicles with several AI controlled weapons which can respond quickly to enemy gunfire.
They would also need a lot of autonomy, as radios can be jammed; that might not be important against current low-tech insurgents, but they won't stay low-tech forever.
The big problem is to get an AI to process visual information from cameras, but that is something that will be solved over time with better AI programming and hardware. In the end they will be able to spot and identify targets faster than a human and will have dealt with them before a controller has time to decide.
CBR
Shaka_Khan
02-17-2009, 06:36
Awsome. I'm converting my cellar into a bunker for when the bombs hit just like the start of terminator, maybe I might form a resistance movement when it all goes pear shaped as well. I'm sure I'll have a whale of a time it'll be just like the movies!
But if they find a way to make a time machine, they'll send a terminator to get you.
InsaneApache
02-17-2009, 06:47
I hope the AI is better than the one in RomeTW. :wall:
Strike For The South
02-17-2009, 06:49
Does anyone else want this to happen? I mean, killing robots would be the bee's knees. Me, my dog, my black friend who says witty things, and my scantily clad maiden who has fallen madly in love with me.
That's all kinds of win.
Gregoshi
02-17-2009, 07:09
I hope the AI is better than the one in RomeTW. :wall:
And the voice acting too. We can't have these robots spouting off I'm-reading-cue-cards-with-an-awful-accent level vocals.
Gregoshi
02-17-2009, 07:13
Aye but you've got to battle through the bloody robots to get to them.
Not if you are partisan to wearing civilian clothing.
Alexander the Pretty Good
02-17-2009, 08:52
If robots get smart enough to turn on us, we probably have no right forcing them to fight for us.
/only partially kidding
The problem is making them identify a threat and not kill a civilian.
If they can identify a guy holding a gun, which sounds pretty hard in itself already, then what if he has covered half his gun in cloth, or what if he has a bomb belt?
What if a scientist walks around with a crowbar in his hand?
Will the robot be able to differentiate all that? What if it gets attacked by dogs? Dogs with bombs strapped to them? Suicide-bomb-donkeys? Threats from behind? Shooting out of bushes, windows?
If it's all pre-programmed, it can likely be exploited, or it will have to kill so indiscriminately that it might just mow down everything including civilians.
Oh, and they won't exactly make war cheaper either, and a bit of colour sprayed onto the camera by "harmless civilians" makes them worthless, unless they shoot everyone with colour in hand as well. :dizzy2:
tibilicus
02-17-2009, 13:31
But if they find a way to make a time machine, they'll send a terminator to get you.
No problem, I'll just send one myself from the future to protect my younger self. See, I'm prepared!
rasoforos
02-17-2009, 13:52
The problem is making them identify a threat and not kill a civilian.
That does not seem to be stopping human soldiers lately...
...you call em side losses, you claim they were militants, you say the enemy killed them and you're fine!
InsaneApache
02-17-2009, 14:02
Not if you are partisan to wearing civilian clothing.
Oh you...you punny bugger! :laugh4:
...Will the robot be able to differentiate all that?
Eventually it will. And having pure robotic units will bring the advantage that they don't even have to react quickly while trying to protect fellow human soldiers. Sure, you lose more robot soldiers that way, but a dead robot does not come back in a body bag, so no big problem there.
Robots doing house clearing, driving supply convoys and basic patrol duty. Suddenly fanatic insurgents, ambushes or roadside bombs only cause material losses.
CBR
rory_20_uk
02-17-2009, 16:22
Well, with weapons, unless we research them we might get a rather nasty surprise down the line. The 1930s showed where pacifism gets you.
~:smoking:
"Drop your weapon, you have 15 seconds to comply!"
Sentient, armed robots are all well and good, but as we all know the true future is in Mechs.
Hooahguy
02-17-2009, 17:14
I've seen it. But what does a 80's Hollywood scifi movie have to do with real life? We still haven't been invaded by giant women or ants so excuse me if I don't see screenwriters as oracles.
CBR
well Robocop had a moral system in it and that seemed to work out pretty well, IIRC.
Gregoshi
02-17-2009, 19:40
well Robocop had a moral system in it and that seemed to work out pretty well, IIRC.
~:doh: How silly of us. Forget the AI developers, we'll just have Hollywood screenwriters program the things! ~D
well Robocop had a moral system in it and that seemed to work out pretty well, IIRC.
Was that Robocop himself or that other monster police robot? Robocop was not really a robot but more some cybernetic freak with a half working brain, heh
CBR
Ironside
02-17-2009, 20:06
Eventually it will. And having pure robotic units will bring the advantage that they don't even have to react quickly while trying to protect fellow human soldiers. Sure, you lose more robot soldiers that way, but a dead robot does not come back in a body bag, so no big problem there.
Robots doing house clearing, driving supply convoys and basic patrol duty. Suddenly fanatic insurgents, ambushes or roadside bombs only cause material losses.
CBR
The downside with very smart robots is that they might get a bit too close to sentience, something you definitely do not want combat robots to develop, at least not before you've seen how compatible our logic is with theirs. Not that this is a problem until at least a few decades from now.
Anyway, I'm gonna get my EMP grenades ready soon.
Edit: If an easy EMP gets into development, then robotic armies will mainly be used against low-tech enemies and be almost useless in an equal-level war.
well Robocop had a moral system in it and that seemed to work out pretty well, IIRC.
Robocop was a cyborg, thus having a human moral framework to build on.
And while we're at it, I thought the people here were geeky enough to know that the title is wrong. They don't talk about cyborgs or even android infiltrator units, so what's up with the cybernetics?
Remote controlled would be preferable. Imagine an army barracks full of men wearing virtual reality headsets and armed with Wiimotes.
Soldiers of the future:
Twenty nerdy, glassy-eyed men with pimples, in their 20s, leave the remote control room in which they controlled the robot soldiers. One turns to the other and says: "hahaha!11 I pwned dat n00b right in teh head lolz. u suck u only killed 1 terrorrist omgroflzorn00b111!1"
:P
I can already imagine NATO interventions being some kind of LAN tournament with the best players in the world controlling the Robot Soldiers. :P
The downside with very smart robots is that they might get a bit too close to sentience, something you definitely do not want combat robots to develop, at least not before you've seen how compatible our logic is with theirs. Not that this is a problem until at least a few decades
I'd say we will develop some very smart robots without going anywhere near sentience. There are loads of instincts/emotions that are an important part of who we are. Such instincts have no place in a robot and will not pop up by itself either. Such stuff is better left to scifi movies.
CBR
Ironside
02-17-2009, 20:32
I'd say we will develop some very smart robots without going anywhere near sentience. There are loads of instincts/emotions that are an important part of who we are. Such instincts have no place in a robot and will not pop up by itself either. Such stuff is better left to scifi movies.
CBR
Sentience might be too strong a word, but those robots require something that can very accurately analyse evolving complex situations in a "game" where the opponent is human and the rules are about infinite. They might very well use unintended solutions or get buggy.
The scary part about advanced robotics is that we don't know what happens with a system that can fully mimic a human but still lacks those instincts and emotions.
Sasaki Kojiro
02-17-2009, 21:41
I'd say we will develop some very smart robots without going anywhere near sentience. There are loads of instincts/emotions that are an important part of who we are. Such instincts have no place in a robot and will not pop up by itself either. Such stuff is better left to scifi movies.
CBR
Exactly...people worry about an army of robots taking over the world, but all one would have to do is program the robots so they don't know how to load ammo into themselves or recharge themselves...there would be no point in making a robot warrior who could think just like a person.
Gregoshi
02-17-2009, 21:44
Soldiers of the future:
Twenty nerdy, glassy-eyed men with pimples, in their 20s, leave the remote control room in which they controlled the robot soldiers. One turns to the other and says: "hahaha!11 I pwned dat n00b right in teh head lolz. u suck u only killed 1 terrorrist omgroflzorn00b111!1"
"Well, sonny, I was a proud member of the battle hardened Pocket Protector Brigade in the Geek Guard Division during the war. Did my boot camp at Fort Gates. Back then we jokingly referred to it as 'reboot camp'. Got my purple heart in '32 for carpal tunnel." :army:
"Well, sonny, I was a proud member of the battle hardened Pocket Protector Brigade in the Geek Guard Division during the war. Did my boot camp at Fort Gates. Back then we jokingly referred to it as 'reboot camp'. Got my purple heart in '32 for carpal tunnel." :army:
"I keep telling ya..kids nowadays have it easy with all those fancy neural sockets. When I did my tour of duty in Emote Company in WWW 2.0, not many of us came back. 14 BSoD, 40 CTD and 3 DoS. It was just horrible"
CBR
Gregoshi
02-17-2009, 23:26
"I keep telling ya..kids nowadays have it easy with all those fancy neural sockets. When I did my tour of duty in Emote Company in WWW 2.0, not many of us came back. 14 BSoD, 40 CTD and 3 DoS. It was just horrible"
:laugh4:...ahem, sorry, respect...:bow:
Eventually it will.
I wouldn't be surprised if that takes quite a while. I'm not even sure whether current sensors would be sufficient for the scenarios I mentioned.
Concerning EMP, would that go through a Faraday cage? I guess not, so it doesn't sound all that useful to me.
:laugh4:...ahem, sorry, respect...:bow:
You are still the master though :bow:
CBR
I wouldn't be surprised if that takes quite a while. I'm not even sure whether current sensors would be sufficient for the scenarios I mentioned.
Sensors are not the problem though. A good HD camera and some microphones can deliver all the information needed. It is all about the AI processing it to know what it means and then how to react. It certainly won't be happening tomorrow and will most likely require some hefty CPU power, but that is just a question of time. And perhaps even a better understanding of how our own brain processes information and makes decisions.
Concerning EMP, would that go through a Faraday cage? I guess not, so it doesn't sound all that useful to me.
Electronics can be hardened, but I'm not sure how well for such a system.
CBR
Alexander the Pretty Good
02-18-2009, 01:47
Soldiers of the future:
Twenty nerdy, glassy-eyed men with pimples, in their 20s, leave the remote control room in which they controlled the robot soldiers. One turns to the other and says: "hahaha!11 I pwned dat n00b right in teh head lolz. u suck u only killed 1 terrorrist omgroflzorn00b111!1"
:P
I can already imagine NATO interventions being some kind of LAN Tournment with the best players in the world of the Robot Soldiers. :P
Forever Peace by Joe Haldeman has this (though through neural "jacking" and not just mice&keyboards).
Sensors is not the problem though. A good HD camera and some microphones can deliver all the information needed. It is all about the AI processing it to know what it means and then how to react. It certainly won't be happening tomorrow and most likely will require some hefty CPU needs but that is just a question of time. And perhaps even a better understanding of how our own brain processes information and makes decisions.
Well yes, but how does the HD camera prevent a perfectly harmless person from walking by and spraying paint onto the lens? Should the person be shot by other robots? What if some kid does it just for fun, or appears to do it just for fun?
I think a human soldier could react to that in a better way; not just react, but arrest the person. Maybe a robot could too, with a helluvalot of work, but that sounds like a really, really expensive robot, and the same goes for cleaning that lens automatically. Without a clear lens the robot would be almost useless unless it had other sensors or a protective cap it can remove. But some creative people might be able to come up with more ways to cripple a really expensive robot without appearing dangerous to it. Simulating a human brain to prevent that might work somewhat if the robots work in squads, but once the media robots see a lot of dead civilians with only paint in their hands you get some really bad press. ~;)
Well yes, but how does the HD camera prevent a perfectly harmless person from walking by and spraying paint onto the lens? Should the person be shot by other robots? What if some kid does it just for fun, or appears to do it just for fun?
"..but how does the Human eye prevent a perfectly harmless person from walking by and spraying paint into the eye?"
How would you answer that question? What makes you think robots can only perform lethal action?
Plus there can be several kinds of robots. Current-day tanks and vehicles use cameras too, and so far kids with spray cans are not what modern armies worry about.
And yes, people are creative and come up with new stuff. The difference is that today it means more dead soldiers instead of damaged/destroyed robots. So soldiers adapt and change their tactics, just as programmers/controllers will adjust the AI.
CBR
I hope the AI is better than the one in RomeTW. :wall:
What are those machines expected to do? Say "Ander muv, ser"?
Major Robert Dump
02-19-2009, 06:11
I'm all for killer robots if you can also have sex with them.
Lord Winter
02-19-2009, 06:59
The problem with robots is cost, face it. Sending 1,000 flesh soldiers will probably be cheaper than sending in even one robot.
Major Robert Dump
02-19-2009, 07:24
I doubt that.
Robots don't have to be trained, fed and housed. Robots don't have to be insured. Robots don't have to recover in a hospital and learn to walk and speak all over again.
I doubt that.
Robots don't have to be trained, fed and housed. Robots don't have to be insured. Robots don't have to recover in a hospital and learn to walk and speak all over again.
They still need maintenance, huge amounts of energy, transport, repairs or replacements etc.
And you have to train experts to maintain and repair them, to control them etc.
Unless of course we establish a robot slave race that does all that for us automatically.
Vladimir
02-19-2009, 14:19
They still need maintenance, huge amounts of energy, transport, repairs or replacements etc.
And you have to train experts to maintain and repair them, to control them etc.
Unless of course we establish a robot slave race that does all that for us automatically.
Robots can be mass-produced, dramatically reducing cost. Personnel, or human factors, comprise the bulk of Army spending. Robots/androids/machines are a cheaper alternative. Look at the UAV craze for comparison.
Well, UAVs have other advantages over piloted planes, aside from the pilot training/care/feeding, the whole design of the craft can be cheaper. No life support/environmental controls. No displays, no oxygen systems, no seats, no control interfaces. The craft can be made smaller without a cockpit, thus lighter, smaller radar signature, less susceptible to enemy fire, it can pull more Gs, it can stay on station longer. And if it gets shot down, no biggie and no dangerous SAR mission to recover the pilot/dead body.
Yoyoma1910
02-19-2009, 18:27
As long as they have a final battle with the Killer Dolphins the navy has already developed.
BTW, they got loose in 2005 due to Katrina. Good luck scuba divers, they could be anywhere by now.
Gregoshi
02-19-2009, 20:35
Well, UAVs have other advantages over piloted planes...
You would say that... ~D
You would say that... ~D
:bow:
English assassin
02-20-2009, 14:58
And while at it, I thought the people here are geeky enough to know that the title is wrong. They don't talk about cyborgs or even android infiltrator units, so what's up with the cybernetics?
Yeah, sorry, I'm more of a giant squid expert, I'm a bit off my usual territory with robots.
Wow, people are a lot cooler with the idea of designing killer robots than I expected. I'm still not convinced.
The issue isn't with remote controlled killer robots: we have missile-firing drones already. A human still has to press the button, and it doesn't seem to me this is different in kind from replacing a spear with a rifle. The rifle makes it easier to kill, but it's plain who is responsible.
Likewise, unarmed robots. Automated trucks to convoy supplies: no problem.
But a system designed to decide for itself when to kill a human? Seriously, that is scary. Talk of sentience/non-sentience is a side issue: presumably it will be programmed to try to preserve itself, and to eliminate whatever its programming identifies as threats. Given the impossibility of knowing how it will really behave, that's all you need for a major worry. It doesn't matter whether it knows it is killing humans or has a view on whether it should kill humans; in fact, the probability that it will have no such insight at all is what is worrying.
And, call me Mr Paranoid, but very similar systems could be used for law enforcement, surely? Given the UK government's current desire to outdo Stalin in surveillance of its citizens, there is just no way they will be able to resist patching all their CCTV cameras into all the email/phone monitoring that goes on at GCHQ, and using all that data to drive robot police. Think how "safe" we would all be with robot police watching our every move. After all, "if you've got nothing to hide, you've got nothing to fear" (tm). Then we really will be able to do only what Big Brother thinks is good for us.
Given the impossibility of knowing how it will really behave, that's all you need for a major worry.
I'd say it is easier to predict AI behavior than human behavior in a combat situation. No matter how good combat training is, a human will still feel different when it's real bullets and enemies. A robot does not know the difference, so tests and exercises will go a long way in predicting its behavior.
In the end, we have already had autonomous weapon systems that can kill people for quite a while now, but naval air defense systems face a targeting problem that is a lot easier than land systems'. Plus most of their potential targets are unmanned missiles, yet apparently the risk of pilots being killed by automatic systems has so far not caused much debate. Robots in land combat are just a more advanced system really.
And, call me Mr Paranoid, but very similar systems could be used for law enforcement, surely?
Technology might make it easier to create the Big Brother society but ultimately that is the responsibility of governments. And I think there are all kinds of nano techs to go paranoid over a lot more than advanced robots.
CBR
English assassin
02-20-2009, 15:53
I'd say it is easier to predict AI behavior than human behavior in a combat situation. No matter how good combat training is, a human will still feel different when it's real bullets and enemies. A robot does not know the difference, so tests and exercises will go a long way in predicting its behavior.
We don't seem to be able to predict how Windows will behave yet, so I'm less optimistic than you, I'm afraid.
You can't predict individual human behaviour, possibly. But en masse and on average the past 2000 years experience does mean you can predict the sorts of behaviours. Mostly, soldiers will obey orders. Sometimes they won't, for a short period. Very rarely, they may be out of control for some time, although query whether that may be because no real effort is being made to bring them under control.
Technology might make it easier to create the Big Brother society but ultimately that is the responsibility of governments
Yes, and experience shows that governments take exactly the opposite view to Hume, and assume if they can do something they should. Which is good reason not to give them the capability in the first place. I go back to one of my first points, what happens when governments don't even need the limited consent of the people for a war implied by the fact that people have to enrol in the forces?
You can't predict individual human behaviour, possibly. But en masse and on average the past 2000 years experience does mean you can predict the sorts of behaviours. Mostly, soldiers will obey orders. Sometimes they won't, for a short period. Very rarely, they may be out of control for some time, although query whether that may be because no real effort is being made to bring them under control.
And yet we have blue on blue or civilians getting killed in the heat of combat. It is unfortunate and we try to learn from it to prevent it, but it still happens and we do accept it. The advantage of robots is that we can improve them in a way that we cannot improve humans.
It is of course a very simplified example, but aimbots in first-person shooters do not kill team members and are faster than humans. A robot will be able to deliver more accurate and faster fire than a soldier. It is all about target recognition, which in real life is more complex than in a computer game. Eventually hardware and AI programming will get there.
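To make the aimbot comparison concrete, here is a toy sketch of the kind of rule-based engage/hold logic being discussed. Everything in it is invented for illustration (the feature names, the confidence numbers, the thresholds); real target recognition is the hard, unsolved part that would have to produce those features in the first place.

```python
# Toy sketch only: a naive threat classifier with a confidence threshold.
# All feature names and threshold values are made up for illustration.

def classify(detection):
    """Return 'engage', 'hold' or 'escalate' for one detected object.

    `detection` is a dict of hypothetical features a sensor pipeline
    might emit, e.g. {"carrying_weapon": 0.9, "in_friendly_db": False}.
    """
    weapon_conf = detection.get("carrying_weapon", 0.0)
    if detection.get("in_friendly_db", False):
        return "hold"       # never fire on an identified friendly
    if weapon_conf > 0.95:
        return "engage"     # clear-cut case: faster than any human
    if weapon_conf > 0.5:
        return "escalate"   # ambiguous: defer to a human controller
    return "hold"           # default to not shooting

# Clean, game-like inputs are the easy part...
print(classify({"carrying_weapon": 0.99}))                          # engage
print(classify({"carrying_weapon": 0.99, "in_friendly_db": True}))  # hold
# ...but the half-covered gun, the crowbar or the paint-sprayed lens
# all land in the ambiguous middle band, where a human must decide.
print(classify({"carrying_weapon": 0.6}))                           # escalate
```

The point of the middle band is exactly the thread's argument: an aimbot-style rule is trivially reliable when the "world" hands it clean labels, and the whole difficulty moves into the perception step that produces them.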
We might even reach a point where we don't want humans to do the actual shooting in war but only let humans handle decisions related to when and where.
CBR
English assassin
02-20-2009, 17:18
We might even reach a point where we don't want humans to do the actual shooting in war but only let humans handle decisions related to when and where
We MIGHT reach a point where we don't want to have a war, but that would be silly :beam:
Without wanting to offend anyone who has had a relative killed or injured in service, I still say it's not obvious that, even if you could design 100% reliable robot soldiers who only killed other robots (or, anyway, other soldiers), which I doubt, that would be a good thing.
Anything that makes it easier to get what you want by force, eg by removing the fact that to use force you have to risk your own citizens' (read voters') lives, is at best a mixed blessing.
In order not to derail the thread with modern examples, look what happened when the UK encountered various less developed people, where our better technology made it relatively easy to take what we liked by force. The British Empire, that's what.
Anything that makes it easier to get what you want by force, eg by removing the fact that to use force you have to risk your own citizens' (read voters') lives, is at best a mixed blessing.
In order not to derail the thread with modern examples, look what happened when the UK encountered various less developed people, where our better technology made it relatively easy to take what we liked by force. The British Empire, that's what.
Yes and one could look at the last few thousand years of history to see similar things. And yet the West is still very strong but we don't have colonies anymore. We might still not be behaving as well as we could but we have IMO certainly moved away from good old Rudyard Kipling and White Man's Burden.
So even though we might still display arrogance today it is nothing compared to a century ago, even though we now have the capability to kill "brown people" while sitting in a comfy chair watching the live video feed from a GPS guided bomb.
So maybe weapon capability alone is not something to worry about?
We could of course stop any development in new weapon techs but I doubt the world would be a better place when even the Taliban etc start wielding more fancy stuff than us.
CBR
Vladimir
02-20-2009, 18:05
You can't predict individual human behaviour, possibly. But en masse and on average the past 2000 years experience does mean you can predict the sorts of behaviours. Mostly, soldiers will obey orders. Sometimes they won't, for a short period. Very rarely, they may be out of control for some time, although query whether that may be because no real effort is being made to bring them under control.
In picking your posts apart I'd say that soldiers will generally follow orders. They're not dogs of war in the literal sense. In the U.S. Army officers tell NCOs what to do, NCOs tell the soldiers how to do what the officer said, and soldiers make it happen. It's like "telephone" with guns, explosives, and a lot more people.
Robots on the other hand obey laws, often to a fault, unless the laws conflict (perhaps because of information received); in that case you get Windows. In the next 100 years you won't see autonomous killing machines employed in urban combat. You'll most likely see what you do today: machines controlled by computers killing machines controlled by people.
Without wanting to offend anyone who has had a relative killed or injured in service, I still say it's not obvious that, even if you could design 100% reliable robot soldiers who only killed other robots (or, anyway, other soldiers), which I doubt, that would be a good thing.
Anything that makes it easier to get what you want by force, eg by removing the fact that to use force you have to risk your own citizens' (read voters') lives, is at best a mixed blessing.
In order not to derail the thread with modern examples, look what happened when the UK encountered various less developed people, where our better technology made it relatively easy to take what we liked by force. The British Empire, that's what.
Or you might want to say anyone here who has killed or was injured in service (not me though :sweatdrop:). The book On Killing (http://www.amazon.com/Killing-Psychological-Cost-Learning-Society/dp/0316330116/ref=sr_1_1?ie=UTF8&s=books&qid=1235148988&sr=1-1) is a look at the effect killing has on the human mind. I believe that with every person you kill a piece of yourself dies. I hope to spare as many as I can from that fate.
A society increasingly removed from the horrors of war will be a society more horrified by war. History shows that the harsher your existence is, the more indifferent you are toward the suffering of others. Once the inevitable combat footage from our terminators is reviewed, the destruction will still strike a chord with people (likely more so on the side that employs the machines).
You won’t find me lamenting the creation of the British Empire, nor a great percentage of those from India. Perhaps that’s not the best example. Don’t demonize technology.
English assassin
02-20-2009, 18:29
You won’t find me lamenting the creation of the British Empire, nor a great percentage of those from India. Perhaps that’s not the best example. Don’t demonize technology.
I think you are slightly missing my point. Technology per se is neutral. But in as much as it enables us to get what we want, as technology develops, we need to be more and more careful about whether we are the sort of people who ought to have the capabilities that that technology gives us.
I happen to think we are already not the sort of people who should have some of the destructive capabilities we do. I know very well that you cannot put the genie back in the bottle, so we have to live with what we have, but that is not to say, before the genie comes out of the bottle, you ought not to think long and hard about what you are doing. As for the danger that we will be swamped by Taliban designed robots, somehow, I think not. A more plausible scenario is being swamped by Taliban inspired brown people, who our robots will kill for us.
Come on, is this REALLY not raising even a slight concern out there?
As for the British Empire, I chose an example against myself to avoid a nationalist diversion. But, the Congo, then, if you prefer. Or (here goes the diversion) Gaza.
Vladimir
02-20-2009, 18:35
As for the British Empire, I chose an example against myself to avoid a nationalist diversion. But, the Congo, then, if you prefer. Or (here goes the diversion) Gaza.
:laugh4: Getting the firehose.
Banquo's Ghost
02-20-2009, 19:44
A society increasingly removed from the horrors of war will be a society more horrified by war.
This is a fascinating discussion and it seems a little trite for me finally to take the opportunity to rebut an argument through a Star Trek reference alone, but (does happy dance)...
A Taste of Armageddon (http://memory-alpha.org/en/wiki/A_Taste_of_Armageddon_(episode)).
I would argue that modern societies have already shown a significant psychological dissonance from the horrors of war because they don't actually see or experience those horrors themselves. Worse, soldiers returning from those wars are misunderstood. Robots are not going to be killing robots but other human beings. They may not be the human beings you care about, but they are dying nonetheless.
Come on, is this REALLY not raising even a slight concern out there?
It raises huge concerns. None that I care to discuss here any more.
Vladimir
02-20-2009, 20:12
This is a fascinating discussion and it seems a little trite for me finally to take the opportunity to rebut an argument through a Star Trek reference alone, but (does happy dance)...
A Taste of Armageddon (http://memory-alpha.org/en/wiki/A_Taste_of_Armageddon_(episode)).
I would argue that modern societies have already shown a significant psychological dissonance from the horrors of war because they don't actually see or experience those horrors themselves. Worse, soldiers returning from those wars are misunderstood. Robots are not going to be killing robots but other human beings. They may not be the human beings you care about, but they are dying nonetheless.
I'll never criticize an argument that references Star Trek, but that's far from a modern reference. When you speak of modern societies I believe you're thinking in terms of "modern" warfare, i.e. industrial-era warfare. The creation of the 24-hour news cycle places the war in our living rooms, our cars, and our MP3 players. Government censorship is ineffective, the world feels much smaller, and people don't want to be bothered by pleasantries. The war is in your pocket. Look at how the rules of engagement have changed over the years. They largely change to bring warfare in line with social norms.
Robots can already kill people, and someone always cares. That episode was filling in the gaps, much the same way Dr. Strangelove did.
Banquo's Ghost
02-21-2009, 14:19
I'll never criticize an argument that references Star Trek, but that's far from a modern reference. When you speak of modern societies I believe you're thinking in terms of "modern" warfare, i.e. industrial-era warfare. The creation of the 24-hour news cycle places the war in our living rooms, our cars, and our MP3 players. Government censorship is ineffective, the world feels much smaller, and people don't want to be bothered by pleasantries. The war is in your pocket. Look at how the rules of engagement have changed over the years. They largely change to bring warfare in line with social norms.
Fascinating. Do you really think so?
I would argue that government (and by extension, the corporate media) censorship is widely practised and has desensitised us to all manner of suffering. Very few channels show the real horror of war through death and dismemberment and there has been much discussion on how images of the coffins of the dead soldiers coming home from Iraq have been banished from mainstream media. How much television time is devoted to the suffering of the veterans of that war now returned (or indeed the veterans of any modern war)? Much less the graphic suffering of those where the war is being fought.
As societies, I believe we are inured to the famines, brutal civil wars, and disasters of the world by constant, sanitised exposure through the idiot's lantern.
How possible is it for anyone who has not experienced war directly and in all its ugliness to actually understand it through the medium of print or television anyway? I cannot say I had any inkling of the true suffering of war until smelling that terrible mix of blood, terror-sweat, excrement and decay and seeing the light extinguished in another's eyes. I watch television reports through that lens nowadays, so I cannot remember what it was like to be without the memory of that smell.
I'm therefore very interested in your opinions as to how such televisual impressions affect you (and all commentators here, of course). :bow:
KukriKhan
02-21-2009, 16:12
Yeah: the smell.
The scent, the stench of fear, suffering, and death. At once oddly attractive, then utterly abhorrent, by turns, to our less-than-human lizard brains.
If that could be transmitted over the electronic airwaves...
Louis VI the Fat
02-21-2009, 20:43
I just think that a world without machines designed and programmed to kill humans is probably better than a world with machines designed and programmed to kill humans.
Pah! If these cybernetic killing machines are Vista run, my money's on the starving, rock-armed humans. :sweatdrop:
Gregoshi
02-21-2009, 21:50
Pah! If these cybernetic killing machines are Vista run, my money's on the starving, rock-armed humans. :sweatdrop:
At the very least, the constant "confirm authorization" prompting has got to hamper the cyber-army.
English assassin
02-22-2009, 13:03
Pah! If these cybernetic killing machines are Vista run, my money's on the starving, rock-armed humans. :sweatdrop:
I see what you did there... :clown:
vBulletin® v3.7.1, Copyright ©2000-2025, Jelsoft Enterprises Ltd.