Google recently boasted that it has used its conversational AI - Google Duplex - to call thousands of doctors in the U.S. and ask whether they accept Medicaid plans. Instead of relying on data that doctors or patients share with Google of their own accord, Google put its AI to work actively gathering data from human sources. These doctors were, in effect, enrolled as human ‘sensors’ for Google’s ever-expanding database of real-world intelligence. Indeed, while the Matrix films predicted a future in which machines harvest energy from humans, we are far more likely to be used as sensors of the mega-machine.
Over the last decades, while using all sorts of (free) digital services, we have provided the data needed to improve these services and train their underlying algorithms. Our input included private pictures and conversations on social media, but also the information we added to Wikipedia, or the roads we drove using Google Maps. With a little help from algorithms that kept us glued to these services, we have voluntarily supplied these AIs with our data. There is, however, a limit to the amount - and kinds - of data that can be acquired in this way. More active forms of information gathering are time-consuming and costly, e.g. human drivers operating Google Street View cars. Google’s use of Duplex shows how intelligent systems can increasingly solicit our ‘help’ to gather data in an extremely efficient manner. By implication, this kind of active data harvesting means that humans become more and more like the eyes and ears of the digital Stack. Rather than autonomous users who provide input as part of a mutual transaction, we will be actively triggered to act, and to provide data, according to the machine’s ‘needs’. Beyond automated phone calls - or platforms begging for reviews - we can also imagine how, for instance, games could be used by machines to study human behavior, or how recommender systems will nudge users to try new restaurants, different kinds of content or uncharted roads.
Services sourcing data from humans is not necessarily a problem. It helps to make these services ‘better’ in many respects, and not all relevant data can be gathered from physical sensors. Yet, over time the balance of power may actually shift towards the machine. With ongoing technological progress and a further concentration of data and intelligence in a handful of AIs, the question arises whether any human actors, let alone democracies, will still have ‘meaningful control’ over these machines and their hunger for data.