Every year in December I ask myself what the next year will bring. I often expect surprises. However, the future does not work this way. Rarely are any events a surprise. In most cases, “future” events are a natural result of what is happening “now”. And because perceiving “the now” is not that easy, we all like to think that the future needs to be “predicted”. But, come to think of it, we do not need “futurists” to tell us about the future. The future is already here; it is being created now.
In this article, I am not trying to predict how the world will change in 2019. Instead, I am listing nine trends that are already changing the way we live, work and think. I am confident these trends will continue to shape our world in 2019, even though we have not been paying enough attention to them just yet. Listing what is already happening is easier than trying to predict the future, and it might be even more accurate!
1. Algorithms go shopping
For the first time in history, we are allowing algorithms to make business decisions on our behalf. We have fridges that buy milk, dishwashers that stock up on detergent. Our homes decide when to sell and when to store solar power. Smart assistants ask for our commands and then interact with businesses. These new economic agents, algorithms, are forcing us to rethink business.
If you are a business, in 2019 reconsider how to serve your new customers: algorithms. Do you have the right channels? Should you partner with vendors of smart assistants to develop “skills”? Should you reconsider your pricing strategies, perhaps offering lower prices to algorithms?

Governments can lead in this space too: imagine food trucks being able to automatically request food preparation licenses as they enter a council area. Could my car automatically pay registration fees or insurance premiums every time it leaves my garage? We have one-stop shops for humans. Can we have a one-stop shop for algorithms acting on behalf of citizens?

Academics will have to understand this new world too. Can you reverse-engineer the behaviour of an algorithm? How do you advertise to a fridge? Will behavioural economists now also need to understand algorithms?
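To make the idea of an algorithm as a customer concrete, here is a minimal toy sketch of a fridge that decides when, and from whom, to buy milk. Everything here (the `FridgeAgent` class, the offer format, the cheapest-price rule) is my own illustrative assumption, not any real vendor's API:

```python
# A toy "algorithmic customer": a smart fridge that decides when to
# reorder milk and which vendor to buy from. All names and the offer
# format are hypothetical, for illustration only.

def choose_offer(offers):
    """Pick the cheapest offer: an algorithm comparing prices on our behalf."""
    return min(offers, key=lambda o: o["price"])

class FridgeAgent:
    def __init__(self, reorder_threshold=1.0):
        # Reorder once stock (in litres) drops to this level or below.
        self.reorder_threshold = reorder_threshold

    def decide(self, litres_left, offers):
        """Return the offer to buy from, or None if stock is sufficient."""
        if litres_left > self.reorder_threshold:
            return None
        return choose_offer(offers)

agent = FridgeAgent(reorder_threshold=1.0)
offers = [
    {"vendor": "ShopA", "price": 1.50},
    {"vendor": "ShopB", "price": 1.20},
]
print(agent.decide(litres_left=0.5, offers=offers))
```

Note how trivially this agent is swayed by price alone; a business that wants to win this customer competes on exactly the attributes the algorithm compares, which is why pricing strategies aimed at algorithms may differ from those aimed at humans.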
2. Full focus on the ART of digital
Remember how practically every digital giant was scrutinised in 2018 on their mismanagement of user data and other misbehaviours? With a growing understanding of the importance of trust, the focus will now shift to smaller organisations. Everyone will be scrutinised.
Businesses will need to increase their focus on the ART of digital, that is, on being Aspirational, Responsible, and Trusted, and on gaining and maintaining the social license to operate. How does your organisation ensure it is aspirational, responsible, and trusted? The public sector and NGOs will play an essential role in helping citizens understand what to expect from a business; the more tools like https://tosdr.org, the better. There is a task for researchers too: how do you build and measure the trust, responsibility, or aspiration of an organisation?
3. Walled gardens opening up
It is becoming evident that digital giants are using their technologies in monopoly-like ways. We saw the anti-trust case against Google in 2018, and now Amazon is being scrutinised by the same lawmakers. We can expect increased legislative pressure forcing platforms to open up and allow external players to participate on fair terms.
Before these platforms do open up, businesses should prepare strategies for entering them. Do you know which platforms are relevant to your business? Do you have a plan? Even if the global platforms open up, the challenge for governments will remain: while global platforms get much attention, local platforms might exhibit similar monopoly-like behaviours. It will now be up to national and local governments to identify those and act accordingly.
4. Conversational commerce taking off
Voice synthesis and recognition have been around for a while. However, 2018 saw breakthroughs in applications. Google’s Duplex demos have shown us the potential. It is not just mind-blowing; it seems to genuinely work. It does not come without challenges, though. Will businesses now be overwhelmed with robocalls from their clients?
Given that the technology is now becoming available to the first phone owners, you should expect to receive the first robocall from your customer in 2019. This technology, in the hands of individuals and organisations, gives unprecedented power to scale outreach. Should legislators force robocallers to identify themselves? Perhaps there should be limits on the number of robocalls per citizen?
5. Humans and machines will increase collaboration
This trend has been around for a while, and the automation of work is not exactly a new topic. However, expect even more to happen in this space as we shift from automation to augmentation: humans seamlessly co-working with machines.
Businesses should continue exploring what to automate and how. As many already do, look for opportunities to pair up humans and technology, not just to replace one with the other. Governments have a tough task ahead of them: to keep exploring which skills will matter for the future of citizens. The results of these explorations should be shared to help citizens prepare. And, hopefully, researchers will keep coming up with ways to automate tasks that have not been automated yet!
6. More employees will self-automate
Remember the case of the developer who automated his own job and then did nothing for years? Thanks to the variety of tools available to individuals, we can expect to see more and more people automating their own work. Business process optimisation and automation are no longer the domain of large organisations.
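Self-automation usually starts small. A toy example: turning a repetitive weekly chore, such as summarising sales numbers by hand, into a reusable script. The data format and figures below are invented purely for illustration:

```python
# A toy example of employee "self-automation": a repetitive weekly
# chore (totalling sales per region) captured once as a function.
# The CSV data is invented for illustration.

import csv
import io

WEEKLY_SALES_CSV = """region,sales
North,120
South,80
North,60
"""

def weekly_summary(csv_text):
    """Total sales per region: the kind of task people redo by hand each week."""
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] = totals.get(row["region"], 0) + int(row["sales"])
    return totals

print(weekly_summary(WEEKLY_SALES_CSV))  # {'North': 180, 'South': 80}
```

An hour spent writing something like this can save an hour every week thereafter, which is exactly why the employees who do it quietly are such an asset.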
Businesses can learn from their employees. The “automators” are a great asset: keep them close, and do not fire those who know how to automate themselves! A challenge for governments will be to scrutinise who benefits from such automation. It may be that those who automate themselves will not see the benefits of it, or may even end up worse off.
Scholars will need to redefine what it means “to work”. “Labour” may not be the right term anymore.
7. A shift from cybersecurity to digital security
Cybersecurity is essential in the digital age, no doubt about it. However, after many years of focusing on cybersecurity, we now see that the most significant challenges are human and managerial. It is not just “cyber”: digital security extends cybersecurity beyond technology assets alone.
Businesses and governments will increase their focus on all aspects of security — not just cyber. Trust is one of them.
As digital security is still a vague concept, academics will need to work on defining it. There is a need to explore the topics of trust and digital security more broadly; there are still more questions than answers in this space.
8. Our digital scepticism will grow
After years of being exposed to promises that do not materialise, we are becoming increasingly sceptical. Some believe that “powered by AI” has become the equivalent of the “organic” food label. The world in which a robot, Sophia, receives citizenship of a country despite being a “chatbot with a face” is about to end, I hope. We will no longer take overhyped promises about technology at face value.
Businesses need to tune their BS detectors. Make sure you hire people who can assess the viability of technology claims made by other companies.
Governments may need to consider certification to validate claims about AI, ML, and other overhyped technologies, protecting customers in the same way certification was developed to protect buyers of organic food from fraud. There is also a big task for schools and universities: to promote understanding of both the opportunities and the limitations of technology, and to foster curiosity rather than mere acceptance of what we hear.
9. We will fight for robot rights
With the increased role that algorithms and robots play in our personal and business lives, we are starting to ask fundamental questions about the rights of robots. Who owns data created by algorithms? Who is responsible if something goes wrong?
Many businesses currently monetise data created by the devices they sell, or generated while customers use their services. However, data property rights are a tricky area. Do you know who legally owns the data your devices produce? Who should own it? Who is liable if something goes wrong? Lawmakers might need to step in to clarify many of these questions; the European Union is currently leading in this space. Academics should explore the boundaries between individual, product, and corporate liability in the case of algorithms. To do so, they will need to understand the nature of algorithms: are they just tools, or economic agents that should be recognised as entities?