One of the greatest challenges facing the modern world is the absorption of rapid technological change. Much of our behavior as humans is governed not by laws, written down and enforced by institutions, but by norms, unwritten guidelines of behavior within cultures. The rapidity of change outpaces the ability of norms to adapt.
Much technological change is in some sense agnostic to its potential uses. While television created a unifying force for communicating with the population, so important to an informed citizenry, it also cut into time formerly spent reading and conversing within families. While the automobile conquered the limitation of work to one's immediate neighborhood, its internal combustion engine threatened clean air. The list goes on.
The explosion of data available on every human activity is a hallmark of recent technological change. Almost all of these activities are made possible by the vast reduction in the cost of assembling digital data. Once software/hardware platforms are built to create data, they produce exabytes of data at near zero marginal cost. The data can be collected with very good intentions in mind. For example, they permit instant access to the cumulative knowledge of the research community; they make cars safer on our highways; they assist in the diagnosis of diseases and the design of treatment regimens.
Unfortunately, the same data that can be used for good can be used to harm people.
Some of these technological changes raise the specter of surveillance capitalism, linked to central governments and large private sector organizations, threatening the autonomy of individual residents. Facial recognition used in ways that could harm individuals, GPS-based mobile phone tracking by companies, wholesale trading of individuals' data among private sector entities to improve their performance, use of CCTV data to assign individuals social credit scores — all of these activities spur fears of harm to the person.
Data have no inherent value in themselves. Data gain their value in their use. Data possess no inherent values regarding their use. Users bring their values to their use of data.
Thus, the world faces the choice of how to maximize good uses and minimize harmful uses of data.
In this world of diminished trust in institutions, digital technologies and the data they produce are disproportionately held by large organizations, be they in the private, government, or nonprofit sectors. Hence, when data use by these organizations is discussed, this ubiquitous mistrust makes data abuse the focus of attention.
Of course, a fool-proof way of eliminating data abuse is to avoid creating data in the first place. Without data, neither our livelihoods, our freedom, nor our autonomy could be threatened by data abuse.
A world without data, however, would also strip us of a variety of conveniences that are essential to modern life.
Another track in discussions of data would be to create a new discipline of decision-making regarding new data resources. Whenever a new data resource is a focus of discussion, we would force a dialogue among the holders of the data, its potential users, and the persons whose data are in question. The discipline would pose the questions:
1) How can the new data resource serve a common good purpose, making the lives of large portions of the population better?
2) How can the new data resource be used to harm people?
3) Are there actions that should be taken to maximize 1) and minimize 2)?
Unfortunately, in many domains, the people answering question 1) are not in dialogue with those answering question 2), and that lack of dialogue leads to the data crises our world currently faces.
Shouldn't the discipline also pose the question of whether the person whose data are being collected and used should hold ownership rights over those data, that is, opt-in rights, rather than being robbed of those data by whoever collects them without explicit (rather than implicit) consent? Many people might still willingly give up their data in exchange for other benefits (the membership cards from Safeway and Giant are prime examples), but making collection a clear choice, with a default of no collection, would make a big difference.