“How is having a cochlear implant that helps the deaf hear any different than having a chip in your brain that could help control your thoughts?” —Michael Goldblatt, former director of DARPA’s Defense Sciences Office, quoted in the Atlantic
What’s the difference between reading books all day and playing video games?
Come on, what’s the difference between spending your time with friends and family “in person” and spending your time with them virtually?
How is having a child through cloning any different from having a child the old-fashioned way?
Why is the feeling of happiness that you have after a good day any different from the feeling of happiness I have after I take this drug?
Why is talking with your spouse and children using your mouth and ears different, in any way that counts, from communicating with them through brain chips that link your minds directly?
We already pick our mates with some idea of what our kids might look and act like. How is that any different from genetically engineering our children so they look and act the way we want?
Don’t we already send our children to school to make them smarter? How is that any different from just downloading information straight into their brains?
If your grandmother is already in a nursing home, what’s the difference if the nurses are robots?
Memory is already so fluid and fallible that we forget things all the time; what’s the difference if we just help people forget things they would rather not be stuck remembering?
What does it matter, in the end, if a soldier is killed by another soldier or by a robot?
How is it really different if, instead of marrying other human beings, some people choose to marry and have sex with machines programmed for obedience and pleasure?
What’s the difference if our bodies are replaced by machines?
In the scheme of things, what’s the difference if humanity is replaced with artificial life? The persistence of intelligence is all that matters, right?
What’s the difference?
6 Comments
Well put! Indeed, what is the difference if we lose the willingness to tell the difference?
The original quote is a bit ambiguous. If someone is suffering from Tourette's syndrome, for example, "a chip in your brain that could help control your thoughts" might be regarded as a form of therapy.
In general, we can distinguish therapy from modification ("enhancement" is a loaded term), and those who claim we can't usually refute their own case by citing examples that everyone recognizes as falling into one category or the other, and then asking "What's the difference?"
We know which is which, and the fact that we might not immediately know how to say why does not prove that we can't say why. We have a general standard of normal human health, form and capability, and then we have people who, in comparison with this norm, are clearly sick, deformed or disabled, and suffering as a result. Interventions (of whatever form, including drugs and technical implants) which are intended to relieve such suffering, relative to the norm, can be called therapy.
I think the answers to your examples are varied, but all stem from the issue pointed to most directly by your last question: "What’s the difference if humanity is replaced with artificial life?" The difference is that, being human, we should not desire that humanity be replaced by anything. We should desire that humanity continue the chain of human life. Why should we? Because that "should" is intrinsic to being human, and any defiance of it is pathological.
No, the persistence of intelligence is not all that matters. Those who think that way have been brainwashed by a culture and educational system that elevates "intelligence" and with it human competition to a pathological degree. Intelligence is no more a moral end in itself than any other natural phenomenon. Our intelligence evolved to support survival, and if it now works to undermine human survival it has become pathological.
My guess is that by "a chip that could help control your thoughts" Michael Goldblatt meant something that would improve the performance of soldiers; this falls into another category of pathological thinking, along with the notion of killer robots. Super-soldier fantasies appeal to certain typically male insecurities and the desire for power over others, but most notions of chip implants and other "enhancements" are wildly impractical, would probably impede rather than enhance human performance, and would in any case be irrelevant to future war scenarios: by the time those first two objections no longer hold, technology will have evolved to the point where "killer robots" completely overmatch human soldiers.
As we face the prospect of technology that equals and exceeds human capabilities in general, the crisis we face is the need to transcend human competition both as an economic organizing principle and in national and global security, because in both domains competition will become technology pitted against technology, with humans left to become collateral damage.
Classic Moves in Transhumanist Rhetoric.
1. Why is it okay to change X about humans into Y?
Well, consider:
1 -> 2: No real difference, we do it all the time already!
2 -> 3: No real difference, we do it all the time already!
…
∴ X -> Y: No real difference!
2. Ah. So why should we make this change?
Because it's never been done before and it's our mission to make ourselves new and it will change everything.
A major problem with both transhumanism and anti-transhumanism: the belief that "we" change. In the absence of a State commanding a change, only a few people will change, and if that goes awry, it will stop there.
While many transhumanists tend to be libertarians, not all are, so the question of state command cannot be assumed away. And even those who reject it seem to expect that social/economic pressures will force change even when the state does not act; the case they make is not unreasonable, assuming a vaguely market-based technical meritocracy. I am less worried about the risk of obvious failure that early adopters take, failures that would make something "stop there," than about successes that "progressively" erode our humanity.
“How is having a cochlear implant that helps the deaf hear any different than having a chip in your brain that could help control your thoughts?”
Well, when you put it that way—actually, when you put it that way it sounds like you don't think there's a difference between mind control and a hearing aid. Does this guy think the reason people find technological thought-control unsettling is that it involves a prosthetic?
But then, I guess cults have been brainwashing people for years without putting chips in their brains, so obviously it should be fine for the government to do basically the same thing with neuroelectronics now.