In my recent client work on digital trends already well underway, from predictive analytics and Big Data to artificial intelligence and the Internet of Things (IoT), the question is not how to slow them down (no one can), but something deeper: is there any moral or ethical compass helping to guide them? Can companies and organizations frame and respond to them appropriately?
In January, the White House published a report outlining, like many reports we're seeing, some of the longer-term consequences this digital acceleration may have over the next fifteen to twenty years, including:
- 83% of jobs paying less than $20 per hour will be subject to automation or replacement,
- between 9% and 47% of jobs are in danger of being made irrelevant by technological change, with the worst threats falling on the less educated, and
- between 2.2 and 3.1 million car, bus and truck driving jobs in the U.S. will be eliminated by the advent of self-driving vehicles.
These are grim predictions for companies that are not preparing fast enough, but exciting ones for those who see big opportunities in this transformation.
But let's step back for a minute and look hard at today's unfolding digital landscape. First, writing algorithms for prediction or machine learning is not an objective art. Developers writing new code are guided by a company's leadership, whether explicitly or implicitly (and sometimes with very little guidance at all), and the intended and unintended consequences of all these predictive analytics under development, both for a single company and in combination with others, are yet to be determined. Cathy O'Neil's book, Weapons of Math Destruction, makes the case eloquently, drawing on her years of work and research at Harvard, MIT, Berkeley and Columbia. She makes two strong statements about algorithms and data science: algorithms are nothing more than opinions embedded in code, and there is no such thing as an objective algorithm, because, at the very least, the person building the algorithm defines success.
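O'Neil's point can be made concrete with a small sketch. The example below is entirely hypothetical (the applicants, fields, and scoring functions are invented for illustration, not drawn from her book): two loan-scoring functions rank the same applicants differently purely because the developer chose different definitions of "success."

```python
# Hypothetical illustration: the builder of an algorithm defines "success."
# Two scoring functions, applied to identical data, pick different "best"
# applicants -- each one is an opinion embedded in code.

applicants = [
    {"name": "A", "income": 30_000, "years_at_job": 10, "zip_risk": 0.2},
    {"name": "B", "income": 90_000, "years_at_job": 1,  "zip_risk": 0.6},
]

def score_by_income(a):
    # "Success" defined as short-term repayment capacity.
    return a["income"] / 100_000

def score_by_stability(a):
    # "Success" defined as job stability, penalized by a ZIP-code risk
    # proxy -- a design choice that can quietly encode bias.
    return a["years_at_job"] / 10 - a["zip_risk"]

best_by_income = max(applicants, key=score_by_income)["name"]
best_by_stability = max(applicants, key=score_by_stability)["name"]
print(best_by_income, best_by_stability)  # prints: B A
```

Neither function is "objective": each encodes a judgment about what matters, made long before any applicant's data arrives.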
Take this a step further and consider Wired magazine's recent review, published in March, titled "Tech Bigwigs Know How Addictive Their Products Are. Why Don't the Rest of Us?" about Adam Alter's new book, Irresistible (https://www.wired.com/2017/03/irresistible-the-rise-of-addictive-technology-and-the-business-of-keeping-us-hooked/?mbid=email_onsiteshare). In the article, Alter describes how the tech giants behind our current technology, Steve Jobs and Chris Anderson among others interviewed, imposed strict limits on their own children's use of technology. Why? Because it is addictive, and designed to be so. Tristan Harris, a "design ethicist" interviewed in the same piece, suggests that "the problem isn't that people lack willpower; it's that there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have."
So, what does this mean? There is currently little monitoring of, or help for, companies thinking through the consequences of such rapid digital acceleration on us, our children and our society. It's the wild, wild west again, but this time the entire globe is the territory being claimed by tech leaders and pioneers. Is the only answer for the rest of us to act like the tech titans and strictly enforce guidelines on our own use?
Remember Enron? The smartest guys in the room back in the late 1990s? The important point to ponder is not only the securities fraud they committed, but the energy futures trading they were inventing in those early days, well ahead of competitors and of the Securities and Exchange Commission's (SEC) understanding of how to regulate it. Energy futures have since transformed much of our energy trading and our sense of what's possible in the energy landscape. Today's tech titans and entrepreneurs in this digital acceleration are, I would argue, at the same exciting, early stage of development, with no reins in sight.
Is it too much to ask companies in the midst of their digital development plans to pose some important moral and ethical questions about the future they are designing? Does it benefit more than it hurts others and our planet? I remember asking a new pharma company's scientists years ago whether their innovations and patents, drawn from Amazonian plants, were developed sustainably, involved local people, and shared the benefits back with those locals. The scientists were stunned by the question and had no response.
In my recent digital strategy and scenario planning work with companies on self-driving cars, smart cities and infrastructure, and IoT for real estate properties, among other areas, I am helping leaders and teams ask many of these same questions. The good news: more are listening than ever before and are concerned enough to address these questions in their planning. The bad news: the digital acceleration train has already left the station, and it's picking up speed fast.