Input/Output and Big Think interview Jaron Lanier, a computer scientist, composer, visual artist, and author.
What’s the Big Idea?
We may not yet possess those cool transparent computers they have on CSI, but we live in a science fiction fantasy world of seamless information exchange, one in which even our telephones seem to possess magical powers. The less you know about technology, the more magical it seems, so the more sophisticated the tech becomes, the vaster the cultural gulf between the computer literate and the computer-challenged.
This is a dangerous situation, says Jaron Lanier, author of You Are Not a Gadget. Lanier, an early virtual reality pioneer and researcher for Microsoft Kinect, has been a vocal critic of what he sees as a dehumanizing trend in our relationship with technology. When we treat computers as godlike beings, he warns, we become dependent upon them to solve problems that require human ingenuity to address. And by denying the incremental, human-driven progress that, say, enables Siri to offer you exactly what you want, we also run the risk of creating an economy that doesn’t support human contributors — the legions of everyday people behind the “Great and Powerful Oz.”
One problem is economic, he says: If we pretend that the people don't exist, then people (like artists and musicians) don't get paid. Eventually we'll have self-driving cars and automated manufacturing processes... which means even less employment for humans. "If we don't learn to acknowledge that real people are actually creating the value online," says Lanier, "we're never going to create the information economy that can really create employment and self-determination for people when the machines get really good."
He argues that we need to remember to include the human as we move forward with innovation... because ultimately it is all created by us humans, and it's to benefit us humans. That isn't necessarily easy. "In a world created by hackers, those who can't hack are the underclass."
What’s the Significance?
Lanier argues that we’d be better served by broad-based computer literacy courses that teach the logic behind our IT. Languages change fast, he says – what people need to understand are the underlying capabilities and limitations of the hardware and software. "I wish there were general literacy in computer science," he says, "which is different from learning to program." Understanding that machines aren’t magical, and grasping at least the essentials of how they “think,” can enable individuals and businesses to make smarter IT decisions about which problems machines and software can solve independently and which require direct human intervention.
For example, a working grasp of search engine optimization can empower media companies to experiment intelligently with headlines and content rather than submitting blindly to external advisors. We can’t afford to live and do business in a world so specialized that every aspect of existence requires a paid consultant. What Lanier advocates is a simple yet profound shift of priorities away from technology’s demands and toward the needs of the humans who use it. In the long run, so Lanier’s thinking runs, the change of focus may prevent us from creating a robot dystopia in which humans are useless even as slave labor. In the near term, it will help us make smarter decisions and waste far less of our time and money.