Over the past few years, Georgetown faculty have greatly expanded their work studying the impact of technological change on the country. Some of this work, performed by the Center on Privacy and Technology, has illuminated potential biases in facial recognition algorithms that arise from restricted training data in machine learning. Others, at the Beeck Center for Social Impact and Innovation, have built networks of state and local government technical staff to improve the accessibility of benefits delivery and the assembly of evidence for policy-making. The Massive Data Institute, with its Federal Statistical Research Data Center, has supported faculty and students in accessing government data within a heavily secured computing environment. Ethics Lab has led design groups in tackling practical ethical issues, including clinical trials during pregnancy. The Center for Security and Emerging Technology has assembled unique international data that provide ongoing insights into R&D activities in artificial intelligence, producing scores of white papers that directly inform policy decision-makers. The Institute for Technology Law and Policy has successfully assisted many federal government units and built out the state Attorney General Technology Education Network to create a knowledge community supporting regulatory initiatives at the state level. Just recently, the Center for Digital Ethics was launched to add to the resources of the Tech and Society network. All of the units are staffed by research entrepreneurs, funding their work through external grants and gifts. Educational programs connected to the network ensure an integration of education and research.
These activities rest on an unusually strong base of talent, with a breadth of expertise that includes legal studies, computer science, philosophy, social sciences, and data science.
Recently, the group became an attractive home for a new, related activity. We announced yesterday a new unit – the Knight-Georgetown Institute (KGI), which will focus on the effects of technology on the information environment in which we live.
The mission of the Knight Foundation and the mission of Georgetown seemed closely aligned. The Foundation is built on the belief that a well-informed community can best determine its own true interests and is essential to a well-functioning, representative democracy. Georgetown seeks to tackle important problems in service to the common good, with special emphasis on underserved communities.
Knight over the past few years has invested nearly $100 million in a large set of university-based and nonprofit units that are now examining all aspects of the growing role of digital media in our democracy. The new institute, based on an initial $30 million investment, will serve as a nonpartisan convening hub for the Knight Research Network throughout the country and as a catalyst for increased impact. It will mount active translation of scholarly research into widely accessible language for decision-makers in the private, government, and nonprofit sectors. It will build an academic field that blends social science, legal studies, data science, practical ethics, and computer science. This last activity could yield curricula that could be shared across many universities.
The new KGI will be co-located with other units of the Georgetown Tech & Society Initiative, at 500 First Street NW, on Georgetown’s Capitol Campus. In the coming months, a staff will be built and activities begun. It is hoped that many of the activities will be joint with those of other units in Tech and Society, taking advantage of the existing talent at Georgetown but adding new investment and a focus on digital information.
It takes a village to conceptualize such an enterprise. All of us at Georgetown should convey our appreciation for the work of Paul Ohm, Laura DeNardis, Leticia Bode, Soyica Colbert, Alyscia Eisen, Irina Netessina, Emily Tavoulareas and our colleagues in the General Counsel’s office, the Office of Advancement, and the Office of the CFO.
This new institute is yet another resource for Georgetown to aid society in coping with the pace of technological change.
Author Archives: Robert Groves
The Value of Small
When our son graduated from college, the department of his major hosted a little gathering. We met in the hallway of the department, sat in a conference room, munched on Oreo cookies, and drank lemonade out of paper cups. We talked with faculty who had mentored our son and shared stories about him. The memory of their praise of our son is still strong years later, and I saw in some of the faculty the attractions that my son had seen in them. It was a small event but an important one.
Commencement ceremonies, however, are large enterprises. Some universities have graduations of the whole, presided over by the president or chancellor. In very large schools, these are now held in stadia or massive outdoor venues, suitable for seating 20,000-30,000 people. Georgetown, with its school-level ceremonies, has avoided that fate, but even then, its ceremonies are major affairs.
Paralleling these large gatherings, however, are smaller affairs, which seem to be proliferating. Some are based on the individual curricula pursued by the graduates. A program has a small event, not unlike our son’s. This is common among graduate programs, where graduates have shared a large portion of their courses.
The ROTC cadre has a commissioning ceremony, separate from graduation, where awards for academic, physical, and leadership performance are given. Students who share race/ethnicity identities or non-majority cultures gather to celebrate their degree completion, regardless of whether they pursue the same program. LGBTQ and ally graduates gather together to celebrate their time at Georgetown. This year, the Georgetown Cultural Initiative sponsored DISCO Grad (Disability Community Graduation Celebration), in honor of disability pride among graduates of that community. The Phi Beta Kappa induction ceremony gathers students with high GPAs. Athletic teams gather to wish the seniors well in their next stage.
These events have a very different feel than the large assemblies of thousands of students and many thousands of families and friends. First, the smaller size means that each graduate gets a little more air time. In many of these smaller ceremonies the graduates know each other quite well – and shout-outs are common among friends. Stoles and cords given to the graduates to wear with their graduation gowns symbolically seal their membership in the given community.
Second, many of these smaller events tend to have more visible organization by students. Indeed, some are entirely student-run. The absence of ritual that results is often more than compensated for by the sheer energy of the program. Whoops and hollers seem much more prevalent. Laughter is contagious.
Third, the smaller size also means that families get to meet friends and colleagues of their graduate, before or after the ceremony. They get to see the friends their student has been describing to them during the program. The small size also means that faculty present can interact with families and graduates in a more meaningful way.
These ceremonies seem to fulfill a quite different, but wonderful, purpose compared to the large graduation events. Social support networks can be recognized as key vehicles for the success of individual graduates; friend networks formed by shared identities can cement bonds through shared success; families can learn how their graduate both nurtured others and was in turn nurtured by them. Vivid memories that last a lifetime can be imprinted in these small ceremonies.
Explanation versus Prediction
There are increasing contrasts between most scientific and scholarly enterprises and much of data science, machine learning, and artificial intelligence. The former are often focused on discovering the mechanisms that produce observable outcomes. What causal mechanisms underlie the processes that produce metastatic cancers? What social forces produce widespread lack of trust in social institutions? How does the loss of biodiversity affect social inequalities?
These are often “why” or “how” questions versus “what” questions. Many of the tasks of machine learning and artificial intelligence systems center on prediction of an outcome. GPT-4 predicts the next word in a sequence of words based on learning from massive quantities of text. Simpler algorithms predict the likelihood of recidivism of an incarcerated person. Others predict a product we might be interested in buying given various expressions of interest in purchases.
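The core of next-word prediction can be illustrated with a toy sketch. The code below is a hypothetical bigram model (the corpus and function names are illustrative, not any actual LLM component): it simply counts which word most often follows each word in a training text. Real large language models learn the same kind of conditional prediction, but with billions of parameters instead of a count table.

```python
# A toy bigram "next word" predictor, purely for illustration.
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each word, which words follow it in the training text."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def predict_next(model, word):
    """Return the word most frequently observed after `word`, or None."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

# A hypothetical, tiny training corpus.
corpus = ("the cat sat on the mat . the cat chased the dog . "
          "the dog sat on the rug .")
model = train_bigram(corpus)
```

With this corpus, `predict_next(model, "sat")` returns `"on"`, because "on" is the only word ever observed after "sat". The model knows nothing about cats or rugs; it only exploits statistical regularities in the text, which is the essence of the prediction-without-explanation contrast drawn above.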
Early uses of the models made popular in data science took as their starting point an existing set of data. The perceived job of the data scientist was to use all the data possible to predict a given outcome of interest. But such a task of prediction does not require the analyst to seek understanding of the causal mechanisms behind the outcome. The goal of accurate prediction can often be attained merely through sophisticated analytic techniques. That is, prediction can succeed many times, until it doesn’t. Changed circumstances, or new populations subject to influences the training data did not reflect, can lead to model breakdown.
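Model breakdown of this kind can be demonstrated in a few lines. The sketch below uses entirely made-up data: a linear model is fit to a population where the true (here, quadratic) mechanism happens to look linear, and then the same model is applied to a new population drawn from a different range. The prediction error explodes, even though nothing about the fitting procedure was wrong.

```python
# Illustration (hypothetical data): a predictive model that ignores the
# underlying mechanism works on its training population, then fails badly
# on a new population.

def fit_line(xs, ys):
    """Closed-form least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def mean_abs_error(model, xs, ys):
    a, b = model
    return sum(abs((a * x + b) - y) for x, y in zip(xs, ys)) / len(xs)

# The true mechanism is quadratic, but over a narrow range it looks linear.
def mechanism(x):
    return 0.1 * x * x + x

train_x = [x / 10 for x in range(0, 50)]      # population seen in training
shift_x = [x / 10 for x in range(150, 200)]   # new, unseen population
model = fit_line(train_x, [mechanism(x) for x in train_x])

err_train = mean_abs_error(model, train_x, [mechanism(x) for x in train_x])
err_shift = mean_abs_error(model, shift_x, [mechanism(x) for x in shift_x])
# err_shift is vastly larger than err_train.
```

The training error here is small while the error on the shifted population is more than an order of magnitude larger: a concrete instance of a prediction model "working until it doesn’t."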
It is interesting that many of the social sciences that use data to discover causal mechanisms often do not attempt a prediction step. Instead, they are interested in the relative importance of alternative causal factors. The debate in these fields is not about the accuracy of prediction but about whether the scholars have missed important factors that may influence the outcome or whether they have misunderstood the manner in which some factor influences the outcome.
Of course, some academic efforts move from the identification of causal factors and estimation of their relative impact to prediction. This is common, for example, in simulation of policy options for interventions in some social outcome. How much would the child tax credit change the number of children in poverty? How much would increasing the number of police reduce the number of burglaries in a city?
Sometimes models similar to those that have been built to understand causal mechanisms are used in such predictions. But such models tend to be much more heavily scrutinized with regard to their applicability, because their designs are open and transparent.
The systems of models underlying many of the prediction tools arising in artificial intelligence are much more complex, because the sole goal of prediction accuracy does not require the scrutiny of causal assertions. The application of models with billions of parameters yields the result that even the inventors of a system cannot explain its performance on a particular outcome. Transparency is lost through complexity.
Prediction models formed with disregard for the causal mechanisms of an outcome are doomed to failure at some point. The real challenge is whether humanity is capable of building general purpose artificial intelligence systems that are aware of such causal mechanisms – in the extreme, the identification of causal mechanisms for all possible outcomes.
The Current Value of Long-term Staff
I had the honor to be part of the Staff/AAP service awards held on Monday, April 24. This event honors staff who have completed 20, 25, 30, 35, 40, and 45 years as a Georgetown employee. For some time, this event was routinely scheduled as an annual event. Unfortunately, the global pandemic disrupted this schedule, and the Monday event honored those staff who achieved their year thresholds in 2020, 2021, 2022, and 2023. In total, about 300 staff were honored for their years of employment. One of the highlights of the event was a special honor to Mr. John Matthews, who completed 55 years of service at Georgetown!
It is interesting that we use the term “service award,” because the word “service” is quite apt. It is too easy to think of the term “jobs” when discussing a work organization. At Georgetown, we aspire to be more than a collection of jobs. The animating spirit of the university is that we form a community. The mission of the community is service to larger goals. We seek collectively to use our individual talents to build a better world, especially with regard to populations who have not enjoyed the benefits that others easily enjoy.
Universities do this by forming the characters of young minds, introducing them to new knowledge, exposing them to alternative perspectives, and helping them identify their own way forward to improve the human condition. At Georgetown this formation process is the work of the whole community – faculty and staff, as well as fellow students.
The majority of the staff honored in the service award ceremony held jobs that maintain the environment within which students experience their education. These included grounds people, bus drivers, skilled trades people, but also administrative professionals working in human relations, academic units, and finance operations.
Watching the honorees mount the stage for their applause and pictures prompted reflections on the past three years and what has occurred at Georgetown. In March 2020, the pandemic disrupted our community. Almost all students left the campus. Online classes replaced in-person instruction. Most academic staff also transitioned quickly to full teleworking.
But not all staff could continue their work remotely. Buildings still needed care. The grounds needed attention. The campus still needed cooling, heating, and electricity. So many of our colleagues continued on-site work.
During the pandemic, employee turnover increased at Georgetown, similar to the vast majority of work organizations (i.e., the “great resignation”). These employees have been replaced over the months with new employees. Indeed, over 40% of the current Georgetown staff have been hired in the last three years or so. It is that fact that stimulated reflection during the award event.
Because of the large number of new employees, the longevity of the staff honored in the event has even more value to Georgetown. Such staff are the heart of the culture of caring that we aspire to achieve at Georgetown. We are privileged by their continued work, because they know the community values that are important to the mission of the university. They teach our students, new faculty, and new staff what it means to be part of the Georgetown family. They are the conduits of culture. At this time, when many staff are remote or hybrid, communicating cultural values is complicated. Some new staff do not uniformly enjoy day-to-day casual conversations with their new colleagues. Their individual challenge of learning the shared values of the community is larger. Since there are so many new employees, the challenge to the university is larger still.
Georgetown employees with long service are treasured even more at this moment in history because they carry such knowledge about the ways of proceeding at Georgetown. We should honor them both for their longevity at Georgetown and for socializing new colleagues into the values underlying the Georgetown family.
Georgetown Initiative on Pedagogical Uses of Artificial Intelligence (IPAI)
In recent posts here and here, this blog has been pondering the rapidly changing role of artificial intelligence in the day-to-day lives of humans. The posts noted that we are in the exponential-growth phase of these systems, and that their current capabilities may imply a near-term growth rate that is unprecedented, even in the world of Silicon Valley. We are also in the “hype” phase of the new technology, so “slow thinking” and reflection are a good idea for all of us.
As with all technologies, both wonderful and terrible outcomes are possible from artificial intelligence. Regardless of what outcomes will ensue, however, it seems highly likely that, throughout their futures, our students will have to navigate new ways of work, leisure, and interpersonal relations affected by artificial intelligence systems.
It seems myopic, therefore, for Georgetown to attempt to ignore these systems as they develop or attempt to eliminate them from our teaching and research activities. Such a posture ill-prepares our students to be the leaders we aspire them to be. Our students should acquire critical capacities to discern under what circumstances artificial intelligence systems might be useful to their work and private lives; and, on the other hand, how to identify and halt harmful uses of the systems.
This moment evokes memories of 2012, when the hype phase of educational software development spawned Massive Open Online Courses (MOOCs), and the edX and Coursera platforms emerged. The hype at that time argued that in-person, brick-and-mortar universities were a thing of the past.
Over the last ten years, we have learned much about online learning. Our collective knowledge was enhanced by faculty experimentation sponsored under the Initiative on Technology-Enhanced Learning (ITEL), which the provost office sponsored over a period of years. Of course, the COVID pandemic taught all of us very abruptly what works and what doesn’t, and for which kinds of students, using online tools.
Over the past few weeks, discussions with faculty and our teaching and learning specialists led to the idea that, once again, incentivizing trials, experiments, and pilots within our classrooms may make sense. Rather than banning artificial intelligence systems within our courses, we should collectively determine whether they might be used to enhance our learning goals.
For that reason, with deep gratitude for philanthropic support from donors who care about educational innovation, we can announce the Georgetown Initiative on Pedagogical Uses of Artificial Intelligence. The goal of the initiative is to support faculty trial uses of AI systems in a wide variety of class settings. The initiative will be open to all faculty, from the humanities, social sciences, and natural sciences; to all schools; to all levels of teaching, both undergraduate and graduate programs.
To design IPAI, we need faculty input. The provost office will appoint a task force of faculty and learning design specialists, led by Eddie Maloney of CNDLS and Randy Bass of the Red House.
The committee will:
1. Identify alternative faculty support mechanisms for trial uses of AI within courses they teach
2. Define a process of soliciting faculty proposals and identify the nature of the proposals
3. Propose an evaluation protocol for proposals received
4. Identify an evaluative process to judge whether each funded pilot achieved its goals
A competition for these design funds will be held in the fall of 2023. None of us now knows the future promise and threat of artificial intelligence systems. We do know, however, that our students should be prepared for a world in which such systems are ubiquitous. We need to help them thrive in that world by designing new learning experiences at Georgetown. There are no better people to do this than Georgetown faculty.
A Rights Statement for Artificial Intelligence
Over the past few weeks, every newspaper, news program, blog, and podcast seemed to contain commentary on generative artificial intelligence (AI) platforms based on large language models (LLMs). Some of the commentary paints a dystopian picture of AI eventually destroying humanity. Other commentaries propose an unprecedented productivity leap in scientific discovery.
My favorite example of the dystopian view is an AI system that is rewarded for producing paper clips and leads to a world where the AI uses every raw material (including the iron deposits in humans) to produce a globe filled with paper clips. A practical example of the optimistic view is the success in predicting how proteins fold into three-dimensional shapes, a problem that plagued biologists for decades; now only 29 of the 4,800 human proteins studied lack structural data.
Most of the newsworthy AI systems are trained on diverse and massive data sources obtained from the internet and digital resources from around the world. There appears to be one area of agreement among their developers: they themselves cannot explain exactly how a given output of the system was attained, given models that might be based on billions and billions of parameters.
In an event that didn’t generate long-lasting attention, in October, 2022, the White House Office of Science and Technology Policy issued a report labeled “Blueprint for an AI Bill of Rights: Making Automated Systems Work for the American People.” It generated a little press for a month or so, but could profit from more attention.
The report articulates five principles to guide the design and use of artificial intelligence systems, when they are used in ways that can affect humans:
First, one should not be subjected to “unsafe or ineffective systems.” The report argues that some of the protections arise at the design and testing stage – wide consultation with the communities that will deploy the systems, use of third-party technologists who attempt to “break” the system and use it for harmful purposes, checks on whether inappropriate data are used in the training phase of the system, and a commitment to transparency about any harms after release.
Second, the algorithms should not disproportionately benefit some people in ways that violate basic equal-protection notions. “Algorithmic discrimination” can arise because of poor training data coverage of populations or ignorance of the causal mechanisms of the given issue. There is the now-famous example of training data from a firm’s past employees being used to guide future hiring decisions, merely replicating the racial and gender biases of the firm. Another example illustrates a misunderstanding of the target phenomenon of the algorithm – using arrest data in an attempt to reduce crimes. A common weakness of current generative AI models is that they lack representation of persons, behaviors, and images that are not internet-accessible.
Third, a person’s privacy should be protected, and they should have “agency over how data about [them] is used.” One of the interesting current issues of large language models is that the training data were acquired in ways quite far removed from the humans who generated them, whether text or other types of data. The training data were harvested by scraping the internet and by buying or otherwise acquiring all the digital data the developers could.
Fourth, one should be alerted when an interaction they are having is being driven by an AI system. This should include some information about how it can affect one’s experience and outcomes in the interaction. Most people could perceive use of earlier primitive natural language processing platforms in telephone interactions. This is less and less true as the tools evolve.
Fifth, one should be able to avoid the services of an AI system if they wish. An opt-out option should be offered. This would be especially important when the interaction carries some personal risk – in criminal justice, education, and employment. Of course, such agency requires that the person be aware of the use of the AI system in the first place (see the fourth principle).
These principles are the beginning of a framework. Although not designed fully in a context of ethics, they resonate strongly with discussions of appropriate ethical behavior by designers and users of AI systems.
AI systems have the promise of improved well-being for the entire globe. They also pose threats of enormous harm to populations, especially those not sophisticated in software systems.
With the features of generative AI changing monthly, the search for some enduring principles of behavior is an important one. It’s not clear that this report has achieved such permanent value, but it certainly deserves more attention.
When Universities and the Federal Government Collaborate to Build the Future
A key strength of our country has been a highly trained set of researchers and technicians who work in scientific agencies of the government (e.g., National Institutes of Health, National Institute of Standards and Technology, the national laboratories, the Defense Advanced Research Projects Agency), performing both basic and translational research. These agencies form a key component of an ecosystem of advanced research – universities, private sector biopharmaceutical, biotechnology, and information firms, as well as nonprofit research organizations. Over the second half of the 20th century, this ecosystem created unparalleled discoveries that led to applications that transformed society. While the public is bombarded each day with the application of those discoveries (e.g., generative artificial intelligence, the latest mobile phone), it is not frequently reminded that those applications could not exist without the basic outputs of this ecosystem.
Over the past few years, a slow crisis has emerged in the scientific and technical infrastructure of the Federal government. The core scientific staff are growing older. Declining public trust in institutions has dampened the attraction of working in these agencies. Staff shortages are common. Support for their agenda, as manifested through budgets, has been threatened.
Georgetown, as an institution that matured as the nation matured, has consistently played a special role in supporting government institutions. Its graduates disproportionately devote their careers to missions aligned with these institutions, sometimes as leaders within them, sometimes as external allies. Its location in DC enhances its ability to help rebuild the staff of federal agencies, especially required at this moment in history.
This week witnessed a special advance in Georgetown’s service to the nation. A group of colleagues, led by Professor Peter Olmsted, was awarded a cooperative agreement with the National Institute of Standards and Technology (NIST), amounting to as much as $21 million over several years. The purpose of the award is to provide valuable laboratory experience and financial assistance to undergraduates, graduate students, postdoctoral fellows, and faculty. The program is intended to assure the continued growth and progress of a highly skilled science, technology, engineering, and math workforce in the United States. In that sense, NIST serves goals beyond its own staffing, enriching the full ecosystem of science.
Under the agreement, Georgetown collaborates with NIST in identifying new scientific talent and administering their employment as Georgetown staff while they work with and in NIST laboratories. These can include Georgetown undergraduate and graduate students working part-time at NIST, PhD students and postdoctoral fellows working on NIST/Georgetown projects, postdoctoral fellows working on NIST projects not connected to Georgetown, scientists with bachelor’s or master’s degrees working at NIST, and senior scientists working at NIST (e.g., sabbatical visitors). Thus, some, but not all, of these staff have synergies with research programs ongoing at Georgetown.
Through programs like this, Georgetown again serves the nation for the common good. Over the years, it will offer unique experiences to scientists, both those starting their careers and those at their prime. It will thereby strengthen the ecosystem of government and academic institutions, by giving those in academia experience inside a government scientific organization.
Congratulations to our Georgetown colleagues for taking the initiative to acquire this new resource for Georgetown!
Bridging Between the Speed of Tech and the Speed of Society
One of the central features of modern life is that some sectors of society move at very different speeds than others. The courts move on the time scale of months or years to process a case. The passage of new legislation implementing policy changes can take years at the Federal level. The curriculum of higher education majors moves gradually over years. Standards of behavior and certification processes for professional groups change only over years or decades.
Yet technological changes move more quickly. Moore’s law speculated that the number of transistors on a chip would double roughly every two years, and the speculation was largely borne out over the decades after Gordon Moore forwarded it. GPT-3, the generative artificial intelligence model behind ChatGPT, was followed just a few months later by GPT-4, reported to be roughly ten times more powerful than its predecessor. A roughly tenfold increase in capability in a matter of months! It is anticipated that when AI modules can be written by AI platforms themselves, the growth of capacity will advance even faster.
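The gap between these two speeds can be made concrete with a little arithmetic. The sketch below uses illustrative numbers, not figures tied to any particular technology: one quantity doubles every two years (the Moore’s-law pattern), while the other improves by a fixed amount per year (the pattern of courts, legislation, and curricula).

```python
# Exponential vs. linear change, with illustrative units of "capacity".

def doubling(start, years, doubling_period=2):
    """Capacity after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

def linear(start, years, step=1.0):
    """Capacity after `years`, growing by a fixed `step` per year."""
    return start + step * years

# Both sectors start at 1 unit of capacity.
for years in (2, 10, 20):
    print(years, round(doubling(1, years)), round(linear(1, years)))
# prints:
# 2 2 3
# 10 32 11
# 20 1024 21
```

After two years the two sectors are still comparable; after twenty, the exponentially growing one is roughly fifty times ahead. That widening gap is exactly the bridge-building problem the Tech and Society Initiative takes on.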
We all got a lesson in exponential growth during the COVID pandemic, as a small set of infections led over a very short time to large sets of persons infected. Exponential growth in that case was deadly. The pandemic forced change in decision-making speed. In some cases, the necessary speed was attained by human organizations. In other cases, large numbers of deaths resulted from slow speed of reaction.
Georgetown profits from a group of faculty and students attempting to bridge one of these gaps between the natural speed of different sectors. The Tech and Society Initiative is a network of existing units and their staffs from different parts of the university – the Beeck Center for Social Impact and Innovation, the Center for Digital Ethics, the School of Foreign Service’s Center for Security and Emerging Technology, the Law Center on Privacy and Technology, the Graduate School of Arts and Sciences’ Communication, Culture and Technology Program, the College of Arts & Sciences’ Computer Science Department, its Ethics Lab, the Institute for Technology Law and Policy, and the McCourt School of Public Policy’s Massive Data Institute.
This is a wonderful example of applying Georgetown’s values of people for others – to boldly tackle pressing world problems, to do so in novel interdisciplinary ways, to integrate research and education, to foster environments where students and faculty work together, subordinating their allegiance to their own discipline to focus on the problem.
This is a big week for the Tech and Society network – the second annual Tech and Society Week, a weeklong showcase of our colleagues’ work in forming a bridge between rapidly evolving technology and the societal behaviors, norms, regulations, and laws related to it. There are scores of events, from a focus on new techniques to enhance the privacy of data, to feminist cyberlaw, to intersectional biases in artificial intelligence systems, to interplanetary governance of the internet, and more. The last day of the week hosts the Fritz Family Fellowship conference, a presentation by student and postdoc Fellows describing the current status of their projects sponsored by two or more of the Tech and Society centers.
All of this is made possible by the diligent work of the faculty, students, and staff of the Tech and Society network. Try to take advantage of these events. Go here for the schedule for Thursday and Friday.
Social Cohesion in Work Organizations
Over the past two weeks, I found myself with faculty and administrators from multiple colleges and universities in the United States, as well as a few leaders of private sector firms. It was a wonderful opportunity to compare notes on current issues.
A common comment was that organizations have not yet stabilized post-COVID. With regard to students, faculty report issues of disengagement in classes, similar to those described in earlier posts. Some colleges, especially those that are less selective than others, reported that student learning losses were significant during the online experiences in high schools and colleges. This comports with recent results on reading and math scores for grade school students.
All the organizations, both universities and private sector firms, appeared to be uncertain about how their mix of in-person work and telework is affecting productivity and staff morale. Some universities appear to be struggling with faculty who come to campus only for their teaching obligations, remaining available to students solely through electronic means. Staff tend to be present, but the human density in units is lower than pre-COVID. The private sector organizations seemed to vary on whether younger or older employees preferred to work in the office. All admitted that they expected to change the mixes over time.
Most reported increased issues among people working together. As communities, we seem to be more impatient with the foibles of our co-workers. Sometimes, we misinterpret the intent of another and feel offended by our assumptions of their intent. We’re generating friction with one another by using electronic written communication when a more personal medium might better transmit the dialogue. There are more frequent perceived slights against one’s identity in the behavior of others. Some report colleagues struggling to acknowledge the perspective of others.
Of course, everyone was seeking to identify the causes of these changes. One line of argument put forward in my meetings was that a cost of teleworking and online education was that we spent much more time by ourselves, alone. Spending a lot of time alone carries the risk that we spend a little too much time thinking only about ourselves. The lack of face-to-face interactions gradually reduces thoughts of others’ day-to-day lives.
While Zoom could have been used to repair some of this, the technology tended to be used for the “business” of teaching, learning, and administration, stripping away the informal chit-chat that occurs in hallways and at coffee stations around face-to-face meetings. While we continued to see our colleagues and fellow students, the social cohesion built from multidimensional knowledge of them decayed over time.
It’s interesting that some have observed that many of the effects of COVID on all of us resemble those of Post-Traumatic Stress. People may report feeling “more on guard or unsafe,” and large groups spark fear of exposure. More careful checking of one’s physical environment is a cognitive load that induces stress. That stress may bubble up in circumstances that have nothing to do with COVID risk.
There may be “trauma triggers” from people talking about their COVID experiences. We may have dropped activities that we used to do but that are now quite safe, for example, shopping or exercising. That change of behavior may itself be stress-inducing. Some may feel an “increase in negative thoughts,” a feeling of lack of control over one’s future. We may be sleeping less soundly. All of these feelings might induce a sense of working too hard without positive rewards from work or study, a forgetting of the shared mission that produces social cohesion.
Of course, there are many hypotheses but very little data.
But there were some signs of hope in my discussions. On the education front, some feel that this semester is better than the last with regard to student engagement – that students are coming back into the flow of the university student role. Students in small seminars exhibited more engagement.
For work organizations in general, some were promoting the idea that anything that gets people together, to exercise the social interaction skills that may have become rusty, is worth the effort. In colleges, some are arguing that in-person faculty meetings are crucial to socializing new faculty members and re-establishing the bonds of existing faculty.
So the optimistic view is that we just need time together, to build up interpersonal skills that have atrophied, to be reminded of the humanity in all of us, in order to regain the sense of a common mission that bonds us together.
Navigating Different Perspectives inside the Classroom
The American Council on Education (in collaboration with PEN America) recently issued an interesting report, Making the Case for Academic Freedom and Institutional Autonomy in a Challenging Political Environment.
The motivation for the report was the several-year experience of state governments attempting to limit the activities on college campuses on topics such as race, American history, etc. Many of these involve direct intervention into the governance processes of universities. Some introduced punishments for violations of the state government directives, including loss of state funding or dismissal of staff. The interventions at the K-12 level in some states have been even more direct and harsh.
As yet, private institutions have not experienced similar incidents, but the issues that state legislatures have provoked for state institutions force reflection on the basic principles of the academy. For example, the report cites a 1940 statement of principles by the American Association of University Professors:
“Institutions of higher education are conducted for the common good and not to further the interest of either the individual teacher or the institution as a whole. The common good depends upon the free search for truth and its free expression.”
A 1957 Supreme Court case, Sweezy v. New Hampshire, is cited, in Frankfurter’s opinion: “the four essential freedoms” of a university – to determine for itself on academic grounds who may teach, what may be taught, how it shall be taught, and who may be admitted to study.
The report calls for redoubling efforts to support academic freedom, the right of the instructor to determine the content of their class. At the same time, students of all beliefs and opinions must feel free to express their beliefs as their own contribution to the search for truth. And students must be active participants in supporting an environment that fosters the interchange of alternative views.
Some of the state government actions are attempting to reduce the presentation of “divisive concepts” in the classroom. Yet a fundamental duty of faculty and universities writ large is to offer an environment in which alternative, conflicting perspectives on human understanding and valid knowledge are freely presented. It is a corollary of such environments that each of us experiences uncomfortable moments when our beliefs are challenged by different perspectives. Sometimes, these different perspectives, to our initial surprise, are closer to the truth than our original beliefs. This discomfort, leading to a deeper, richer insight is an essential ingredient in learning.
Learning almost always involves discomfort, because new knowledge must be assessed relative to what one previously understood. When there are conflicts between new and old within our own mind, significant cognitive effort is involved in synthesizing the two.
The ACE report notes: “College students are adults who should be exposed to all topics on campus, including controversial and contentious ideas presented in an intellectually rigorous way that encourages discourse.” It notes that this is essential to the mission of educating students in critical review of new ideas. Indeed, knowing the key arguments against one’s view of a topic is the best way to determine how to defend that view. It notes that under academic freedom, faculty are given the main decision-making responsibilities for shaping curricula.
While private universities do not face the same external attempts to limit what is taught in the classroom, some internal pressures might become equivalent to them. The free exchange of ideas is the unique responsibility of academia. It is our way of deepening understanding and insight. That is the only route to coming closer to the truth. Faculty must assist students in learning how to build skills in this exchange of ideas, and they must have the freedom to use their expertise to do this.