The ethics question was given some attention when I studied AI. I suspect the only reason it isn't given more is that it doesn't need to be, given our current technology; otherwise classes in cyberethics would be required for programmers, just as a bioethics class was required for my friends in neuroscience.
I suspect the question of duplicating consciousness already has an answer, anyway -- you're effectively creating a new consciousness, as the instances will be measurably different from each other from the moment you make the copy. Granted, the cost/benefit analysis of terminating one copy isn't the same as killing a discrete being would be (at least from an information-theory perspective), but that's a topic for our collective morality to hash out if and when it becomes possible.
The philosophical implications of combining consciousness seem much more interesting to me, since there isn't a biological analogue to the process.