wombat_socho: the mark
[personal profile] wombat_socho
You know, if you take this to its logical (if perhaps absurd) conclusion, the use of unintelligent software packages to guide cruise missiles may someday be regarded as morally equivalent to teaching kids with Down's syndrome to become tokkotai pilots. Come to think of it, I seem to recall a story in Heavy Metal years ago about this.

(Instapundit)

(no subject)

Date: 2007-03-07 11:37 pm (UTC)
From: [identity profile] nornagest.livejournal.com
This seems like jumping the gun to me. We don't even have a satisfactory definition of "intelligence" yet, and, as far as we can tell, the machine intelligence we have created thinks in a very different way than we do.

The inclusion of "futurists" and a sci-fi writer on the committee is a big red flag, anyway.

(no subject)

Date: 2007-03-08 12:40 am (UTC)
From: [identity profile] wombat-socho.livejournal.com
The inclusion of "futurists" and a sci-fi writer on the committee is a big red flag, anyway.
Eh, maybe not. If the rest of the panelists are engineers and cybernetics types, they may not be familiar with the ethics surrounding the question, which is a really tricky one once you get into it. If you can "back up" your consciousness before going on a suicide mission, is it really a suicide mission? What if you can duplicate your consciousness into a company of troops? (Both points avoided in Glasshouse, BTW.) Are self-aware robots going to have the same opinions on the matter? Somebody ought to start thinking about those questions before they come up.

(no subject)

Date: 2007-03-08 02:29 am (UTC)
From: [identity profile] nornagest.livejournal.com
The ethics question was given some time when I studied AI. I suspect the only reason it isn't given more is that our current technology doesn't demand it; otherwise classes in cyberethics would be required for programmers, just as a bioethics class was required for my friends in neuroscience.

I suspect the question of duplicating consciousness already has an answer, anyway -- you're effectively creating a new consciousness, as the instances will be measurably different from each other from the moment you make the copy. Granted, the cost/benefit analysis of terminating one copy isn't the same as killing a discrete being would be (at least from an information-theory perspective), but that's a topic for our collective morality to hash out if and when it becomes possible.

The philosophical implications of combining consciousness seem much more interesting to me, since there isn't a biological analogue to the process.

(no subject)

Date: 2007-03-08 02:45 pm (UTC)
From: [identity profile] wombat-socho.livejournal.com
I suspect the question of duplicating consciousness already has an answer, anyway -- you're effectively creating a new consciousness, as the instances will be measurably different from each other from the moment you make the copy.

Is that actually the case? I know that in SF there are several answers to that question. Algis Budrys' classic "Rogue Moon" agrees with you (even though the duplicates have all the memories of the originals), but more recent works (the Star Trek series, the novels of Charles Stross and John Scalzi) don't.