
Space: The Final Frontier


Universities are very dynamic environments when faculty are liberated to think about new educational and research possibilities. Space needs continuously arise.

Over the past few weeks, discussions of the so-called Master Plan of the university have begun. These discussions ask what new academic initiatives might occupy new space and how that space should be configured.

It is a time of great innovation in US higher education. Many of the state universities, the backbone of the world’s finest ecology of higher education, have received year-after-year budget cuts. They are under direct threat. The option of tuition increases has been eliminated either by state legislative intervention or by weak demand from families strapped by flat salaries. Small private liberal arts colleges with limited endowments are also threatened; some are closing. Existential threats are spurs to innovation, and it is occurring at a very rapid rate.

Adoption of flipped courses is changing what students do inside classrooms. Intensive project-based learning has led to more use of laboratory or studio-based space. Blended courses with online components have led to more use of ad hoc small-group space, with integrated technology infrastructures. Learning management systems are designed to be digital one-stop shops for all materials students need to pursue a course. Ubiquitous Wi-Fi permits students to work in spaces that also act as social meeting space. The boundary between class and non-class, work and non-work, is blurring on campuses in the same way that it is blurring outside of universities.

Electronic portfolios of class work managed by students create integrated documentation of the progress of the student’s education. Social media platforms surrounding classes are fora for the intellectual exchanges supplementing class exercises – the 21st century mode of reflection and synthesis. Students are voluntarily supplementing the organized class learning with Khan Academy and targeted educational videos. The Federal Government is seeking ways of certifying the learning accessible in coding academies and other focused learning centers.

On the research side, almost all major institutions that fund research are shifting to the big unsolved problems. Interdisciplinary teams increasingly spend time outside their faculty offices in shared space for these teams – a home for kindred souls who share common interests but different home disciplines. Sometimes the university teams are supplemented by researchers in partner organizations. New digital and computational approaches are arising in many fields.

Change is afoot.

In the midst of this we need to imagine the space needs of Georgetown 10 and 20 years hence. What space designs will meet the needs of our research teams working on key interdisciplinary problems? Should we plan for space occupied by Georgetown research partnering organizations? What space designs prompt the unplanned interactions of faculty with symbiotic scholarly interests? What space is best suited for the nurturing of interdisciplinary educational programs?

What is the needed mix of flexible group workspace and traditional classrooms with fixed desks and a lecture podium? What technology is needed for different group activities in the space they will occupy? What space will research-based learning need? How much time will students and faculty be off-campus, working together in experiential learning situations? What is the future of classes now taught in large lecture halls?

These are the kinds of questions we all need to address.

Several months ago, President DeGioia doubled down on the role of place in the future of Georgetown. The decision renewed the commitment to the deep formation of young minds that requires face-to-face interaction with faculty mentors and peers. This strength of Georgetown must be preserved in the future. But the space in which it is conducted must also reflect the new pedagogical designs and new learning technologies that Georgetown faculty require. Over the coming weeks we’ll seek input from larger groups of faculty and staff on “the space of the future” at Georgetown.

Low Trust in Institutions in a Churning Sea of Rapid Change


Recently, I found myself examining data on the American public’s loss of trust in major institutions. My search was partly motivated by asking whether the increased public discourse about the value of higher education was producing a loss of trust in higher education institutions. My search reminded me of the size of the decline in trust in major institutions (e.g., newspapers, the military, the church or organized religion, the Supreme Court, Congress) over recent years in the US. Further, US millennials report lower trust in other people than prior cohorts did at the same age.

Unknown, at least to me, is whether this loss of trust is connected with the large-scale disruption of several institutions by technology. Whole industries have been altered in major ways, with successful long-term organizations displaced by new startups. Is the lack of trust connected with the apparent slowness of the institutions to react to rapid social and technological changes?

The delays in getting government websites functional, in bringing modern software systems to agency processes, in reflecting multicultural perspectives — these and others occupy news stories frequently. Is trust eroding because of the contrast between the rate of change in our tech-based lives and in that part of our lives that has no such foundation?

Higher education in 2012 was heavily focused on whether fully-online education was a disruptive force and would create whole new organizations certifying degree-level learning. Here at Georgetown, we bet that the deep formative experiences possible with strong faculty mentoring a residential undergraduate population would persist. As I write, the external environment remains filled with rapid change, and continuous vigilance about the right way forward is needed. Can we change fast enough to survive but slow enough to do so wisely?

The rate of technological change breeding new organizations produces an interesting puzzle. While the external environment of many institutions is undergoing rapid change, most of these institutions were not designed for such external environments. Their deliberative natures are valuable in choosing the right way forward when the way forward lasts for decades. When improvement arises from speedy iterative changes, they sometimes seem too slow. When there is also little trust in such institutions, the slow, deliberative nature might breed beliefs about incompetence.

So, increasingly, we see venerable institutions distrusted by the very stakeholders who are arguing for rapid change. Ideally, such institutions need to develop ways to make quick decisions when necessary, even though they were not designed to do so. In the meantime, it seems clear that the way forward requires unusual transparency by leaders of those organizations, a deep devotion to dialogue, carefully designed gatherings of stakeholders with divergent perspectives, and repeated communication of the basic mission that animates traditional devotion to the institution.

Knowing More about Today But Less About Yesterday


In the 20th century, the quantitative social sciences blossomed. A key facilitator of that growth was the democratization of demographic and economic data through data archives. These data archives contained “anonymized” versions of original survey records of individuals, allowing a college sophomore to analyze the same data set as a full professor.

As the years went by, the archives permitted analyses of change over time in major phenomena. Indeed, they permitted quantitative history to evolve as a field. As we move into a new data world and surveys decline in frequency and importance, we need to think about whether data archives themselves will survive.

One business model of social science data archives seems to have two features: a) dues from a set of members that value access to the data, and b) contracts given to archives by data producers charged with dissemination of their data. The first feature draws on a set of users from institutions whose mission includes original research (e.g., universities or research institutes). By sharing the large costs of data curation and dissemination, they give their members access to vast data resources at relatively low cost. The second feature is a function of the fact that much of the 20th century data in archives was collected by agencies funded by taxpayer money. These agencies have legal obligations to democratize their collected data.

A second business model tends to serve a set of like organizations, each possessing data sets that describe their activities; for example, a set of enterprises in the same manufacturing sector. In order to assess their own performance, the organizations want to know how they compare to other organizations. Are they ahead of or behind their competition?

However, no organization wants their own data to be known by a competitor. In these circumstances, an honest broker organization (e.g., a trade organization) is sometimes charged with collecting all the data and providing each member a set of statistics describing all contributing organizations. Through the honest broker, the entire population of firms becomes more aware of the status of the sector, without threatening their competitive position.

The new data world is likely to be built on diverse sources of sensor, transaction, text, and movement data providing society with instant monitoring of human behavior. These are vast real-time data series. Their value stems from offering timely descriptors of what’s happening minute to minute.

Yet as we increasingly enter this world, it is not at all clear whether any institution with access to such data has the mission to curate and archive such data, permitting future analyses to use them. Who will protect the streaming data of the present for users of the future?

If the society merely uses the data to monitor the present, our understanding of the past might be increasingly threatened.

Supports for Faculty to Teach in More Expansive Ways


Now is a time of rapid innovation in higher education. It seems like every day we see new pedagogical techniques and teaching models being introduced to enrich learning.

Of course, individual faculty innovate all the time, taking on new approaches to courses or aspects of the major. At a higher level of aggregation, departments are inventing environmental supports for innovation and experimentation. Our Government department is one example of a unit that has moved on this opportunity, creating a new category, called GOVX courses. These new courses create a context for innovative teaching, experimental course design, and new methods of connecting campus with the wider world.

One example is a course led by Professor Marc Howard, which is a new and ideal follow-up to an existing course, “Prisons and Punishment.” After being exposed to the theories and research literature in criminal justice and prison policy, the students can enter into an intensive experiential learning activity closely mentored by a senior faculty member. The activity applies the newly-learned theories to formulate a real proposal for prison reform (which might address issues of policing, sentencing policy, prison conditions, etc.). The full vision is to have some class meetings within a local maximum-security prison, collaborating with prisoners who are also taking college courses. The students culminate their project work with a presentation at a public event. The course meetings are more like research team project meetings than lectures or discussions of readings.

Combining theory and practice is not new; many traditional courses have attempted to do so within the same class. Often, the “practical” side of learning uses materials that attempt to convey real-world problems, ripe for applying the theories learned in the earlier part of the course. However, the ability to simulate the real world inside a traditional course is sometimes limited. The Howard course structure, as with many community-based learning (CBL) courses, enriches the theoretical learning with a true, real-world experience, not a simulated one.

For its graduate programs, the Government department has started piloting innovative course modules, many bearing only one credit. These will also be offered as GOVX courses. Some are aimed at professional development, like a course in writing research grant proposals to support scholarship. Others might be research methods courses focusing on specific statistical techniques. Still others teach the techniques of close reading and critique required of the scholar in reviewing journal-article submissions.

Such courses combining theory and practice, or focused on skills, will not replace the crucial work that goes on in theory-focused courses or seminars focused on critical inquiry. They expand the diversity of approaches that we can offer students. As more and more Georgetown courses share these design features, we’re learning that students love the approaches, devote more time to the course, and report back deep appreciation for the courses as expanding the ways that Georgetown prepares them for life beyond graduation.

Faculty have been shifting their approach to teaching and learning in many departments in the University to take advantage of the opportunity to enrich our students’ experience. These are great examples of faculty leading the effort to innovate in their pedagogy, to the benefit of their students.

The Great Recession and the Humanities


At a recent workshop on building the research university of the 21st century, there was a wonderful discussion of the need for the insights and ways of thinking common to the humanities in addressing many of the key global challenges. The speaker noted that although the laboratories and equipment of scientists seem foreign to the intense cognition and reflection of the humanist, the work of the two groups of scholars is united by a common quest for meaning. Both sets of scholars attempt to glean understanding where there is ambiguity and clarity where there is chaos. It was a welcome and spirited argument supporting the role of the arts and humanities in the modern university.

That talk, several new books, and reports of scholarly groups are part of an increasing appreciation of the humanities. The evidence is accumulating that the humanities are uniquely valuable in generating the creative minds that lead unusually successful lives in modern society. Yet there does seem to be a mismatch between enrollment patterns at US universities and that argument.

I wandered into an interesting study that asked the question, “Do US students’ choices of undergraduate major follow the business cycle?” The study examined the experiences of college graduates from 1960 to 2011, a period that experienced several recessions. The findings are separated by the gender of the student.

In the face of periods of higher unemployment, undergraduate women tend to choose majors that are generally associated with male-dominated careers. For example, they move into business, finance, accounting, and computer-related fields. They also tend to choose nursing, a major with direct job targets. The majors that are disadvantaged during such periods of high unemployment are literature, languages, sociology, history, and education. Among males the story is similar, with periods of higher unemployment yielding more majors in engineering, accounting, business, and the natural sciences, and relatively fewer majors in history, literature, languages, sociology, and political science.

In a 1780 letter to his spouse, John Adams wrote, “I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce, and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry, and Porcelaine [sic].” It appears that times of economic hardship, as well as times of political strife, generate a refocusing on individual economic welfare.

What does this mean for the arts and humanities? I don’t pretend to know. However, one hopeful interpretation would be that the negative effects of the great recession of 2008-2009 on the selection of majors might dissipate over the coming months and years, as the US economy recovers from the shock. If history repeats itself, and macroeconomic performance dampens concerns over one’s short-run economic welfare, the percentage of undergraduates majoring in the arts and humanities should increase in future cohorts. This benefit might also be enhanced by increasing public discourse about the value of liberal education in driving innovation through creativity.

Georgetown possesses deep strength in the arts and humanities. Its devotion to liberal education, including the humanities, has contributed young minds who have become leaders in an innovation society. I’m hopeful that a growing economy will lead to wider societal support for the humanities; in the meantime, vigilance to preserve this strength for future generations of Hoyas is important.

Data Skepticism


The best data analysts don’t trust data. They approach them with suspicion and disbelief. They probe them, looking for weaknesses. Only after repeated failures to expose fatal weaknesses do the analysts feel safe attempting to extract information from the data. Even then, they are careful not to extract more information from the data than the data are capable of providing.

Increasingly, social scientists will be using data that they themselves did not create. The organic or “big data” will come from sources that did not foresee use by a social scientist to seek understanding of attitudes or behaviors of individuals or groups. In contrast, when social scientists themselves were the creators of data sets, they generally chose the target population to be one of interest substantively, designed the selection technique to identify which units were exposed to the measurement, carefully controlled the measurement, and induced features to reveal the validity of the measurements. The researchers themselves controlled the documentation of the data properties (so-called metadata).

In the new world in which we live, social scientists will increasingly harvest data created by processes far removed from their control (or even their access). So, in this new world in which social scientists are confronted with a data stream of unknown properties, how can they assess the quality of the data?

It’s likely that careful distinctions need to be made between observable and nonobservable errors. On the nonobservable side, no data set will reveal what is excluded from itself (e.g., what members of a target population can never be measured through the process). Insight into such biases comes only from comparing one data set to another.

Other weaknesses of data can be revealed through careful examination of the data set itself. Patterns of missing data for units that should have such items present can reveal that one or more items might be subject to weaknesses. Simple distributions of the values of quantitative measures can reveal unlikely outliers.

There are some multivariate checks that are possible when a data set has multiple observations on the same unit. Are logical relationships between multiple measures (e.g., age and date of birth) displayed in the data?
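As a purely illustrative sketch (the post names no tools or data sets), checks like these could be scripted in Python with the pandas library; the file and column names below (records.csv, age, date_of_birth, income) are assumptions for the example, not real data.

# Hypothetical sketch of observable-error checks using pandas.
# File name and columns (age, date_of_birth, income) are assumed for illustration.
import pandas as pd

df = pd.read_csv("records.csv", parse_dates=["date_of_birth"])

# 1. Patterns of missing data: which items are absent for units that should have them?
missing_by_item = df.isna().mean().sort_values(ascending=False)
print("Share of records missing each item:")
print(missing_by_item)

# 2. Simple distributions: flag implausible values in a quantitative measure.
q_low, q_high = df["income"].quantile([0.01, 0.99])
outliers = df[(df["income"] < q_low) | (df["income"] > q_high)]
print(len(outliers), "records fall outside the 1st-99th percentile range of income")

# 3. Logical relationships between measures: does reported age match date of birth?
reference_date = pd.Timestamp("2015-01-01")   # assumed date of observation
implied_age = (reference_date - df["date_of_birth"]).dt.days // 365
inconsistent = df[(implied_age - df["age"]).abs() > 1]
print(len(inconsistent), "records where reported age and date of birth disagree")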

Sometimes unreported disruptions in the measurement can be detected when there are time stamps on observations: does the pattern of characteristics over time reveal large discontinuities? Are the correlations between attributes over time varying in an erratic fashion? Sometimes graphical displays of multiple variables reveal anomalies that are likely to be errors of observation. Sometimes creating arithmetic combinations of multiple variables reveals unlikely patterns.
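When observations carry time stamps, such checks can also be scripted. The sketch below, again using pandas with assumed file and column names (events.csv, timestamp, value, attribute), looks for large level shifts between adjacent days and for erratic month-to-month correlations between two measures.

# Hypothetical sketch: screening time-stamped observations for measurement disruptions.
import pandas as pd

df = pd.read_csv("events.csv", parse_dates=["timestamp"])  # assumed columns: timestamp, value, attribute

# Large jumps between adjacent periods may signal unreported disruptions in the measurement process.
daily = df.set_index("timestamp").resample("D")["value"].mean()
jumps = daily.diff().abs()
suspect_days = jumps[jumps > 4 * jumps.std()]
print("Days with unusually large level shifts:")
print(suspect_days)

# Erratic changes over time in the correlation between two attributes are another warning sign.
monthly = df.assign(month=df["timestamp"].dt.to_period("M"))
corr_by_month = monthly.groupby("month").apply(lambda g: g["value"].corr(g["attribute"]))
print("Month-to-month correlation between the two measures:")
print(corr_by_month)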

“Big data” are likely often to be undocumented data. The first step of a wise data analyst of such data is not extracting information from them, but challenging them to justify their worth for any purpose. Even in the best of cases, however, such scrutiny will merely offer insights into observable errors, not those arising from failure to contain information on important units in a target population.

Making Multiple Data Sets Work in Harmony


As the observational sciences increasingly turn toward use of so-called “big data” organic sources, much attention is being paid to how to analyze combinations of multiple data sets. A common goal is conjoining survey data with multiple other data sets. Many of the approaches to this problem attempt to identify common measurements in different data sets. For example, if data set 1 contains measurements on the spatial location of units measured, the analyst asks whether data set 2 contains similar measures. When that is indeed the case, statistical models taking advantage of this “shared covariate” can be estimated.
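As a minimal sketch of this “shared covariate” idea, assume a survey extract and a retail scanner file that both carry a county identifier (all file and column names here are hypothetical): the organic source can be aggregated to the shared spatial unit and attached to the survey records for later modeling.

# Hypothetical sketch: linking a survey extract and an organic data source
# through a shared spatial covariate (a county code). All names are illustrative.
import pandas as pd

survey = pd.read_csv("survey_extract.csv")    # assumed columns: county_fips, employment_status, weight
scanner = pd.read_csv("scanner_data.csv")     # assumed columns: county_fips, weekly_sales

# Aggregate the organic source to the shared spatial unit...
county_sales = scanner.groupby("county_fips", as_index=False)["weekly_sales"].sum()

# ...and attach it to the survey records as a shared covariate available to statistical models.
combined = survey.merge(county_sales, on="county_fips", how="left")
print(combined.head())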

Thinking about this effort, it’s relevant to observe the common uses of observational data on large populations (e.g., people, businesses). Typical analytic goals are comparisons over five different dimensions: 1) time, 2) space (e.g., political subdivisions), 3) population groups (e.g., gender, race, age groups), 4) levels of aggregation (e.g., persons, families, households, neighborhoods), and 5) measurement complexity (e.g., individual attributes, indexes). Statistical agencies routinely present their estimates organized by these different dimensions.

One characteristic of our new data world is that much of the organic “big data” are not designed to contain a large set of consistent measures. Tweets are totally in the control of the subscriber; search terms are unstructured. Retail scanner data are consistent (time, retail outlet, product description, quantity, and price) but quite limited in number of variables or attributes.

There are implications of these data attributes for the future of combining multiple data sets to produce statistical estimates.

Perhaps the most common measurement in the “big data” world is the time at which the data were created (e.g., when the tweet was sent, when the search term was entered, when the product was purchased). The spatial location of the measurement might be the next most frequently occurring measurement.

This implies that mixing survey data and these new organic sources will most likely enrich statistical estimates of small geographical areas (exploiting shared spatial identifiers) and estimates of greater temporal granularity (exploiting the time stamps on observations). It also implies that the new data world is less likely to offer enriched estimates of small subpopulations because the organic data tend not to have such measures. (The exception might be social networks.)
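A sketch of that implication, again with hypothetical names, would aggregate the time-stamped organic data into area-by-month cells and join them to annual survey estimates for the same areas, gaining temporal granularity where the survey alone offers only an annual figure.

# Hypothetical sketch: building area-by-month cells from time-stamped organic data
# and joining them to annual survey estimates for the same areas.
import pandas as pd

purchases = pd.read_csv("purchases.csv", parse_dates=["purchase_time"])   # assumed: county_fips, purchase_time, price
survey = pd.read_csv("county_survey_estimates.csv")                       # assumed: county_fips, year, unemployment_rate

purchases["month"] = purchases["purchase_time"].dt.to_period("M")
cells = (purchases
         .groupby(["county_fips", "month"])["price"]
         .agg(["sum", "count"])
         .rename(columns={"sum": "total_spend", "count": "n_purchases"})
         .reset_index())

# Each county-month cell of organic data sits alongside the coarser survey estimate for that county.
enriched = cells.merge(survey, on="county_fips", how="left")
print(enriched.head())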

In studies of human populations, surveys have excelled in providing insights into demographic subgroups. They provide statistical contrasts among racial and ethnic subgroups, and among groups differing on socioeconomic status, immigration status, educational attainment, and a host of other important social attributes. It seems unlikely that the new data world will offer enhanced contrasts on such groups, since the organic data tend not to have measures of such attributes.

The implication of this seems clear. Estimates in the new data world might offer enhanced temporal and spatial granularity, but little improvement for population subgroup contrasts. Survey data will carry the burden of measuring such contrasts, and added organic data will improve them only marginally.

Striving to Support Strong Centers and Institutes


For many decades, universities have conducted periodic peer-led reviews of academic departments or units. The reviews (on many campuses, at 5-7 year intervals) generally start with introspection and self-assessment by members within the unit. They often involve a campus visit by peer scholars outside the university. The reviewers meet with faculty, administrators, students, and staff connected to the unit. They discuss and share evaluations and provide a written report to the unit’s members and to administrators responsible for the units.

In some sense, this process mirrors the peer review protocols for vetting scholarly output of individual faculty and new research proposals. Indeed, peer review is a key accepted quality control mechanism throughout academia.

However, such reviews rarely incorporate evaluation of research centers and other semi-permanent collections of faculty who have research or outreach missions.

Georgetown has many such centers. Indeed, depending on what definition is used, there are more than 70 centers, institutes, programs, or initiatives that have quasi-permanent status on the Main Campus alone. In this blog, we’ll use the term “center” as a convenient title for all such units.

In order to evaluate and support those units in more effective ways, the deans of the Main Campus and the provost’s office have developed an evaluative protocol for such units. Like the protocol for the review of academic programs, the review of centers will be periodic, on a schedule that we will construct over the coming months.

Unlike the review of academic programs, there will be three types of center reviews. All three will begin with a set of questions about the status of the center posed by the provost’s and deans’ offices. These questions will concern the outputs of the center, commentary on the impacts of the center’s work, and the financial status of the unit.

After a review of answers to those questions, the School or College Dean, in consultation with the Office of the Provost, will take one of three actions: 1) move forward with action items immediately, 2) have a subsequent on-site team visit and review, or 3) have an expedited review requiring no on-site visit but including a review of relevant documents by an outside review team.

The review teams will be composed of two to three external reviewers and one Georgetown University full-time faculty member.

The goals of the center review process are not dissimilar to the goals of the review of academic units — to gain insight into strengths and weaknesses, to assess possible measures to improve the performance of the unit, and to alert us to activities of similar groups on other campuses.

Our hope is that, after such reviews, we can all make better judgments about the way forward for Georgetown’s centers, institutes, initiatives, programs, and like units.

Social Justice and Interdisciplinary Research


At a recent retreat, I saw a video made as part of a worldwide congress of Jesuit colleges and universities. It reviewed various activities of the university through the lens of social justice, using the values of the Society as motivators.

To my delight, the commentary treated both the teaching mission of the university and the research mission. Indeed, it identified the role of research in discovering new knowledge as key to the social justice mission of Jesuit universities. This is completely consistent with work on the “frontiers”; in this case, the “frontiers” of human understanding and knowledge. Unsolved problems in society sometimes need new insights, new knowledge. Research is the domain that produces such knowledge.

Even more delightful was the deduction that issues involving social justice appear to require interdisciplinary research activities. The knotty problems of poverty; of equitable distribution of food, energy, and water resources; of unrestricted access to education; and of unimpeded pathways to social mobility—all of these require combinations of diverse sets of knowledge. When universities were organized, they created clusters of faculty members who shared a set of motivating questions for their scholarship—they were colleagues in pursuing deeper understanding of a specific subset of knowledge. They were not necessarily motivated by unsolved problems within the society but rather by the passionate pursuit of knowledge. So, the basic issues surrounding what we call “social justice” are not coextensive with the department, discipline, and school structures that exist today.

What the video noted, however, was that energizing the pursuit of social justice within universities might be greatly advanced by interdisciplinary approaches. We can note that almost all social problems affect the poor most severely.

This is completely consistent with the work of many colleagues at Georgetown (e.g., the environment initiative, the health disparities initiative, the Beeck Center for Social Impact and Innovation, the social entrepreneurship activities, and a host of others).

Further, the Graduate School is identifying sets of faculty throughout the university who want to tackle important social problems (e.g., infectious diseases). When we can identify career paths associated with these problems, we’ll mount new educational programs linked to research. In almost all cases these programs are interdisciplinary in their focus.

Interdisciplinary research is difficult. It requires unusual humility among practitioners, as they learn the language and the perspective of other disciplines. It requires continuous attempts at synthesis, bringing together disparate ideas into a seamless whole. It requires agility, to reject quickly what doesn’t seem to work and to try another approach. It’s notable that more and more research funding agencies have recognized that progress on the thorny issues requires teams of researchers working collaboratively.

The video linking social justice and interdisciplinary approaches to research, when added to similar moves by research funding agencies throughout the world, offers a consistent way forward for universities seeking to tackle the world’s ills.

The Need for a New Model of Data Access


Decisions in the late 1950s to create anonymized data extractions from the decennial census records democratized data access with deep respect for the privacy of individuals. The later creation of archives of anonymized Federally-funded social science data sets (often sample survey data) led directly to great advances in the empirical social sciences in the United States, securing the country’s academic leadership in that domain. At this point, most social science undergraduates access these data as part of learning the trade of social science analysis.

Over the same decades a vibrant proprietary survey industry also emerged, primarily built around customer satisfaction measurement as a feedback loop to manufacturers, retailers, and service providers. Similarly, political campaigns and marketing departments have created proprietary survey data for guidance on strategic moves potentially affecting important outcomes. Some companies emerged to sell statistical information for private sector purposes (e.g., Nielsen television ratings; Arbitron radio ratings).

The two data worlds, the democratized access of anonymized microdata vs. closely held proprietary data, have existed in parallel, largely nonintersecting realities over the past 50 years. This caused little loss to the advance of human knowledge, in my opinion, because the proprietary data often had very limited use, focused directly on the actions of a single company.

The 21st century data world consists of data produced by private sector enterprises as auxiliary to their product or service delivery, as well as the traditional survey data common to the 20th century. Since survey data are increasingly costly per record (falling participation rates lead to increased efforts to gain cooperation), the “big” or organic data will predominate in the future.

The 20th century information model (e.g., monthly monitoring of major economic statuses; annual measurement of health conditions, criminal victimizations, research and development, and educational outcomes) achieved common-good objectives of the US society. It is not hyperbolic to state that such indicators were the foundation of the “informed citizenry” so important to democracies.

Can we discover a new model to serve those common good purposes in the 21st century?

The answer may lie in a model that is neither dependent on pure altruistic actions serving the common good nor on a direct profit-making motive. This model would need to address two key issues: a) how can it offer financial or information benefits to the private sector data holders and the individuals described by the data, and b) how can it offer full respect for individual privacy?

Can we learn from the new social entrepreneurship movement, which seeks sustainable revenue models for social good purposes? Is there a model that does serve the common good but is self-sustaining because it also serves other interests?

How will such a model protect the data of one entity from its competitors? How would the confidentiality promises of the data holders be respected?

In short, for the 21st century, can we build a data world that has the legitimacy and credibility key to informing the society about itself, but one that doesn’t depend solely on organizations whose sole mission is serving the common good?
