Georgetown (through the leadership of Professors Nitin Vaidya and Maggie Little) was awarded an important innovation grant from the Mozilla Foundation. It will support collaboration between EthicsLab in the Kennedy Institute of Ethics and various computer science courses.
I’ve written earlier about the promise that Georgetown has in building a bridge between technology and larger societal issues. This new grant is another signal that Georgetown is recognized as having unique institutional strengths in this domain.
What’s it all about? The amplification of every thought and word made possible by the netting together of billions of people on internet platforms has no historical precedent. That amplification happens at speeds unmatched in any prior age of human interaction. Unfortunately, the good intentions of a garage startup are difficult to maintain when billions of people are affected by its activities.
Institutions touching myriad parts of societies throughout the world encounter unique challenges. We have already seen both deeply harmful effects and wondrous benefits of new technologies. Looking forward, it seems appropriate to rewrite Dickens a bit to note it can be the best of times; it can be the worst of times. Attention to the societal impact of new technology, guided by explicit values, is required for “the best of times.”
There has been much discussion of the slowness of regulatory behavior to ameliorate some of the harmful emergent behaviors on the internet. However, even before one begins to conceptualize rules and regulations, societies need norms (unwritten standards of behavior) to guide the rulemaking.
Some of the capabilities of internet platforms are so unprecedented that even norms cannot keep up with the changes. My behavior can affect the fate of others in ways for which no norms have been articulated. Rumors once shared in private side conversations can now be instantly known by all. Many can trace the location of others minute by minute. “Private” behaviors are routinely public.
Further, empirical analytic techniques can use vast volumes of data to predict events and outcomes of processes second by second. Machine learning algorithms can control devices and vehicles sufficiently well that human intervention is less frequently needed. Completely autonomous processes are imagined.
But all such algorithmic approaches require data to build the predictive models. Such data must be collected before it can be used. What data should be collected? Deep understanding of the underlying processes is required to collect data that yields good algorithms. Common examples of failure from flawed data abound. Algorithms that use arrest data to decide where police should be deployed exacerbate any racial or socioeconomic bias in police behavior. Using mortgage-default geo-data to build decision algorithms for loan applications reproduces the very biases that generated the geographical spread of defaults in the first place. Biased data sets used to train algorithms produce biased algorithms. Data use without explicit values can bring on “the worst of times.”
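The arrest-data feedback loop can be made concrete with a few lines of Python. This is a minimal sketch with entirely hypothetical numbers: two neighborhoods with identical underlying crime rates, an initially biased patrol allocation, and a naive algorithm that sends patrols wherever past arrests were highest.

```python
# Hypothetical illustration of the feedback loop: arrests reflect where
# police patrol, not just where crime occurs, so an algorithm trained on
# arrest counts inherits the initial deployment bias.

true_crime_rate = {"A": 0.05, "B": 0.05}   # identical underlying rates
patrol_share = {"A": 0.7, "B": 0.3}        # biased starting deployment

def simulate_round(patrols, population=10_000):
    """Recorded arrests scale with patrol presence as well as crime."""
    return {n: true_crime_rate[n] * population * patrols[n]
            for n in patrols}

def reallocate(arrests):
    """Naive algorithm: allocate patrols in proportion to past arrests."""
    total = sum(arrests.values())
    return {n: arrests[n] / total for n in arrests}

for _ in range(3):
    arrests = simulate_round(patrol_share)
    patrol_share = reallocate(arrests)

# Even though both neighborhoods have the same crime rate, the 70/30
# split persists round after round: the original bias never washes out.
print(patrol_share)
```

Even in this simplest version, the algorithm locks in the initial bias indefinitely; more realistic models, where concentrated patrols catch a disproportionate share of offenses, amplify it over time.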
So, circling back to the good news about an innovation grant to Georgetown – as we educate the next generation of leaders in technology, Georgetown must confront the central questions of how to use technology for good. New ethical puzzles confront us right now. Others will confront us with each new development.
Society is now facing the effects of too much technology developed without attention to the ethical implications of its uses. With its decades of contributions to applied ethics and its growing bench of talent in computer science, Georgetown can lead in designing technical curricula that raise questions of values and norms at the point of development.