Last week I spent a couple of days at the 16th annual Reacting to the Past institute at Barnard. Reacting was first developed by Mark Carnes (who has now written an excellent Harvard University Press book about this pedagogy), and it has since grown enormously; it is used at hundreds of schools in many different disciplines. While I was a fan before, after playing I understand much more deeply both how these “games” work and what the benefits are.
Hearing both from students who have played and from research studies about outcomes, it is clear that these experiences lead to better academic learning, more engagement, more motivation and confidence, and (perhaps most interesting for the first-year seminar) better relationships and wider friendship patterns.
It is hard to describe, but every student is given a substantial book that includes historical essays, primary texts, and other background information that students must read. Each student is assigned a role with an accompanying “role sheet” of several pages that includes biographical information, victory objectives to achieve in the game, and specific strategy advice. The first class or two might be background on the period; then introductions are made in roles and the game begins.
The game I played was Modernism vs. Traditionalism: Art in Paris, 1888-89 (find out more from the author Gretchen McKay, who discusses the genesis of the game here). This game, based in the discipline of art history, includes students playing artists, critics, and dealers. We started by learning to talk about art with formal, visual analysis, and subsequently gave presentations trying to persuade others why and how some art was better than other art. Learning to discuss the merits of art is the key to this game, and it is achieved through lots of debates and organizing of shows (and deciding who would and should show together). The culmination of the game is the final show at the 1889 World Exposition, where buyers (new players) showed up within a crowd. We all had to persuade others to buy “our art” using the language of art we had just learned. We alternated between faction meetings, which afforded time to make connections, and more formal debate periods, which included speeches and presentations followed by audience questions (in roles). It was indeed a rapid version of “see one, do one, teach one.”
The benefits of this approach are far reaching.
Better Academic Learning
We know that the human brain remembers more when we read with purpose. I always suggest that faculty put more than “read chapter 2 for Tuesday” on the syllabus, because just a little added purpose improves retention: read the chapter and (for example) find an argument you hate, an insight that broadens your perspective, or a relative who has the problem described. Reading any text with a purpose improves learning, and Reacting immediately puts students in that frame of mind. I was reading to find arguments for or against something.
In Reacting, students then make presentations (everyone has to make at least one speech and write papers) supporting or attacking positions. Writing to persuade is, of course, one of the core skills we teach in college and something that students often find difficult. Reacting is all about persuasion and persuasive communication.
In the game I played, set in Paris in 1888, I learned a lot about the art of the period. I sat in a darkened room through multiple presentations about paintings—much like I would have in any art history class. These, however, were presentations from other students, and all of the content I expected to learn was there. Yes, some presentations were better than others, and they were not presented “fairly” or with an academic perspective. But we were all learning to see together and because we all had a bias, due to our assigned roles, we were again, looking with purpose. Of course, the student who presented each painting learned his or her works in more depth.
In short, the reading, discussion and papers are the standard currency of academic learning, but they were placed in a different context that made the entire process more engaging.
Engagement, Motivation and Confidence
Reacting gives students license to act out in class. This is fun, but also intense. For many students, the potential to “win” provides motivation, but there is also a potent energy of learning in this environment. People are laughing and yelling and interrupting. Even I was surprised at how freeing it was to be in a role, and how it changed my classroom persona and participation.
Another interesting benefit is that each character is given unique information. So you know (in your role) something others do not. This is power, of course. But it FEELS great the first time you say something in class and others get excited.
So the cycle starts with the motivation not to be embarrassed in front of your peers, but then you immediately feel the joy of success and eventually mastery (very much the micro-rewards that video games use). How often will students give a research paper to loud cheers from other students? In the guise of a role, however, I presented research (original for me) that bolstered the position of my faction, and the (partisan) audience roared. This was a good feeling, and it reinforced the idea that when I read carefully and prepare, there is a reward.
One of the most interesting outcomes I learned about is that first-year students who play RTTP games end up with much broader friendship patterns in their second semester. Games are organized into factions (in the Athens game, for example, these include the Oligarchs, who generally don’t care much about Socrates; the Socratic faction, who are his devotees and want him to be either martyred or spared; the Radical Democrats, who find him a threat; and the Moderate Democrats, who, while they really want everyone to get along, can’t abide Socrates’ anti-democratic ideas), and students then have to work together.
I was initially skeptical when I heard that students often have trouble disengaging from the role (and sometimes need to be told they can’t talk about the game on weekends!), but after playing even an abbreviated game, I still knew most of my colleagues by their roles: “Hi Monet, so what is your real name?” I am sure that for years to come, I will remember our Renoir, Seurat, Van Gogh, and Gauguin in their roles.
Reacting is about relationships. On the surface, this means helping students build relationships in roles, but there is an extra benefit: these students now genuinely know each other. There are multiple benefits to getting first-year students into working groups with this level of energy.
Leadership, writing, oral communication, working with others to solve a common goal, persuasion and responding to changing conditions are all, of course, widely desired job skills. Some students may need you to make this connection, but students themselves articulated that a common interview question is “can you describe a situation when you worked in a diverse group and took a leadership role to solve a problem?” Employers may not be expecting an answer about saving Socrates, but this may indeed be the only answer a student can muster from college.
But don’t take my word for it: the research about the benefits of playing Reacting games is already quite compelling. The most complete evaluation of Reacting came from a FIPSE grant allowing a psychologist, Steven Stroessner, to compare Reacting and non-Reacting sections within and across half a dozen colleges. Students who played Reacting games demonstrated:
• Elevated self-esteem. Reacting students showed higher self-esteem than students in non-Reacting sections, and higher self-esteem at the end of the semester than at the beginning.
• An increase in empathy – compared to a decrease (!) for students in the control sections
• A more external locus of control, i.e., a greater belief that outcomes are often influenced by forces external to the self.
• Greater endorsement of the belief that human beings are malleable, contributing to a belief in the possibility of incremental change, that people can change over time and across contexts.
• Enhanced verbal and rhetorical skills – Reacting students demonstrated a greater ability to make an oral argument.
Steven J. Stroessner, Laurie Susser Beckerman, and Alexis Whittaker, “All the World’s a Stage? Consequences of a Role-Playing Pedagogy on Psychological Factors and Writing and Rhetorical Skill in College Undergraduates,” Journal of Educational Psychology (2009).
You can find more articles about the efficacy of Reacting here:
Faculty who try these games overwhelmingly find that they work. In 2013, a survey of 100 faculty found that they overwhelmingly agreed that Reacting games help students achieve the AAC&U LEAP Essential Learning Outcomes. Percentage of faculty who agreed that Reacting develops each outcome:
97% – Providing academic challenge
96% – Engaging with “Big” questions
96% – Connecting knowledge with choices and actions
91% – Developing students’ ability to apply learning to complex problems
91% – Teaching the art of inquiry
85% – Fostering civic learning
82% – Fostering Intercultural Learning
79% – Teaching the art of innovation
78% – Fostering Ethical Learning
Many faculty also talk about how these games have rejuvenated their teaching. A couple of sample comments:
• Surprisingly, incredibly effective at seducing students into deep meditation on, and creative explication of, primary sources.
• It’s the most rewarding teaching you can do, because students will take ownership of their learning.
• Reacting has completely transformed my approach to teaching. I find that it forces me as an instructor to be much more invested in my students… I have rethought my role as a teacher: I no longer try to cover “everything” in lectures, but rather I see myself as a coach in helping students navigate through the exciting avalanche of information that is available.
• For me as an instructor, it’s made teaching fun again. I’ve begun to revise all my courses around either Reacting games or simplified versions. …Students regularly tell me that they learn more preparing for the simulations than they would sitting through traditional lectures.
• These RTTP games bring a level of engagement and learning to my classroom that has helped take me back to the level of excitement about entering the classroom that I felt when I was just starting to teach 20 years ago.
Like many folks, I expected Reacting largely to be about history. Indeed, the games take place with real historical figures in specific moments in time, but the skill sets are broad and the subject matter ranges from science to art. This is a pedagogy that will revitalize your classroom, change you and your students—whatever you teach. Take a look at RTTP:
As Andrew Delbanco, director of American studies at Columbia University, notes, most of the media discussion today tells us very little about “what a good college ought to be” and what we do (or should do) for students.
There is, for example, the potentially distracting conversation about the economic return of individual majors. In many ways, this is an entirely reasonable concern. College is a tremendously important investment of time and money, and thinking about the practical return should form some part of the decision. First-generation and immigrant students know this well from their families. My first degree was in chemistry because my parents and I were given the deeply practical advice: If there is anything else you can do other than music and be happy, do it. My mother was immensely relieved by this advice, and I was not given the opportunity to try a higher-risk major.
I’d like to think that the choice of major could balance economics, aptitude, and passion. I tell students that their major now matters less because so much of what they will need to learn for their career, they will have to learn later anyway. One new study confirms that there is no difference in the critical-thinking skills of graduates from different majors. That is good news, and the authors also find that college increases critical thinking. They also find, however, that graduates’ critical-thinking skills have declined over the last 48 years, despite their increasing importance and a new emphasis on colleges teaching students these skills.
We clearly need to do better at improving critical thinking, but what other skills should a modern curriculum address?
Jobs that require both thinking and social skills are growing, and the combination of math and social skills seems a great hedge against being replaced by technology. (Would it be bad to include social skills as a part of general education?) There is clearly a social and public good in our graduates getting jobs. Here at Goucher, we want to prepare students for the future, and that surely includes both the intangible qualities of a well-rounded life and the ability to be financially self-sufficient.
At Goucher, we are thinking about how to combine all of these ideas. We have already started a new required three-year writing program for all students and hope to follow this with a similar requirement in data analytics. We are also focused on relationships, resilience, and reflection (what we call our 3Rs). We want to improve the relationships and mentoring that are so critical for later success and workplace satisfaction.
All of these things intersect. As a parent, what I really want for my child is happiness, and that comes from both workplace satisfaction and some (paid!) work. Can we design a curriculum that combines both the abstract and the practical, prepares both the head and the soul, and ultimately delivers both happiness and a job? Is all of that possible? Is all of that our responsibility? U.S. colleges have always been aspirational, so we are going to keep trying to deliver on all of those promises.
The Internet is fundamentally disaggregated. There is more and more information on our phones every day, but Siri is not getting any smarter because she remains disaggregated: More and more pieces of information just mean more bits and bytes. Ultimately, content only becomes knowledge when it is combined with wisdom. Content has to be integrated within people and thinking minds, and this happens best in a community. College and our general-education curriculum have to be about more than content. Our real products are integrated and happy people who are voracious, self-regulated learners.
Most liberal arts colleges still have some general-education requirements, although many only require some distribution of academic areas. We are still hesitant to define a core of knowledge, and I find that healthy (despite my personal enthusiasm for Plato and his audacious attempt in The Republic to prove that doing good is really good for us; perhaps I just find this a core question for any happy life).
What are we actually trying to accomplish, and how might we rethink general education in a liberal arts context?
U.S. writer and academic Louis Menand suggests that the liberal arts have their roots in knowledge we pursue with “disinterest.” As Menand writes: “Garbage is garbage, but the history of garbage is scholarship.” (This idea has its roots in some of the academic history discussed in part one of this blog series. Academic freedom and the move to distinguish scholarship as “value-neutral” were a way of balancing the religious roots of most U.S. universities. John Dewey and Arthur Lovejoy founded the American Association of University Professors in 1915 partly to make sure religious or political views were not the basis for hiring, but it was also part of the professionalization of faculty.) Is scholarship really disinterested and devoid of personal values? Should it be?
One side of the coin is the benefit of analyzing all sides of an issue from the relative safety of the blackboard (or the Ivory Tower). Americans tend to use “academic” as a pejorative synonym for “theoretical,” “abstract,” or even “useless,” but theory and “disinterest” are useful precisely because they provide us with an abstract space to play with alternatives without having to make up our minds. Planning for contingencies that have not and might not happen is the essence of practical strategy—for games, business, or life.
The ability to always see the other side of a debate or issue can, of course, also be debilitating (and incredibly irritating to your children). While we mistakenly draw a clear bright line between theory and practice, we do, in fact, often need action, which requires a decision. Art works the same way: There may be lots of equally good or interesting ways to play Hamlet or paint a tree, but picking one at a time and doing it with conviction is essential for any good performance.
“Disinterest” is also seen as a pillar of science. We tend to privilege the “scientific method” as being separate from politics or bias, but even science is guided by the interests and priorities of scientists and the government. We can’t ever be entirely disinterested or rational, but disciplinary training provides a framework.
This is sometimes used as a justification for the humanities: Science may be peering intently at the real world, but it is using a lens, and the humanities is the study of that lens. Similarly, the “academic disinterest” of majors like classics or art is defended precisely because it is abstract and removed from the practical.
In his book Average Is Over: Powering America Beyond the Age of the Great Stagnation, economist Tyler Cowen argues that we should emphasize the economic power of the humanities in business. He has a point: Creativity and an understanding of the human condition are surely useful in any enterprise. And these skills easily and often translate into employment. Still, I hesitate to make the value of general or liberal education purely economic.
As a musician and historian, I recognize that I am odd. I am comfortable with Kant’s definition of aesthetics as only the “useless” bit. (And while I think Kant’s task is impossible, I LOVE that he is trying to separate judgments of taste from judgments of beauty.) So I find it a bit odd when folks try to defend something (the humanities, or the pursuit of knowledge for its own sake, for example) that I don’t feel needs defending. (No one feels the need to defend ice cream or chocolate.) But I am odd.
Academics call the units or departments of our world “disciplines.” These are internally consistent and self-governing systems of value. As the word “disciplines” implies, they provide discipline—a structure for organizing and verifying knowledge. They can, equally, be confining.
Another common function of general education, therefore, is to introduce students to the different disciplines (or systems of thought). More recently, some general-education programs are trying to create a space for a renewed desire for interdisciplinarity. (Remember that the question of whether to teach general education inside or outside the disciplines was an important difference between the distribution and core systems of general education.)
For all of this history and suspicion of the practical, the liberal arts are not truly disinterested. Many disciplines and institutions (like Goucher) are also deeply invested in the world, real problems, the character of our students, and especially our local community. Many liberal arts skills are also manifestly practical. Writing skills, for example, are at once a prime vehicle for thinking through abstract complex problems and the world’s most important job skill.
The Georgetown Center for Education and the Workforce has demonstrated that a college degree does, indeed, have a lifetime economic benefit: A bachelor’s degree is on average worth $1.3M more over a lifetime of earnings. That is great news, but I hope it is worth even more than that.
Goucher has joined the small group of schools using the Gallup-Purdue Index to measure how specific experiences in college (like having a professor believe in your future success) have a profound impact on future workplace happiness.
General education has then been conceived as a way:
1. to stimulate learning for its own sake;
2. to connect students to the real world;
3. to give students a common cultural or intellectual vocabulary; and
4. to introduce students to a variety of ways of thinking.
Some of these seem mutually exclusive, and clarifying which of these matters for each program is an important first step in helping students understand how the pieces of a distribution system come together.
Content was never the primary goal of a liberal arts education, but with the increased pace of new knowledge and our easy access to more content on every device, thinking and analysis have become even more important. This makes integration ever more important in the design of our new general-education systems. The whole now truly has to be more than the sum of the pieces.
LAST PART: Happiness, values, and integration
In part two of our discussion about general education, we saw how making the bachelor’s degree a prerequisite for professional school created the U.S. liberal arts curriculum, but simultaneously separated undergraduate education from the “real world.”
General education requirements were introduced to try and bridge this gap.
We can see early examples of this in the core courses universities introduced during the First World War. Columbia’s famous contemporary civilization course was initially called “War Aims.” Stanford and Dartmouth followed suit with courses on “Problems of Citizenship.” Williams called its version “American National Problems.” Eventually these became the common “Western Civ” requirements.
These core courses had a social motive: to give students a common understanding of society, shared value judgments, universal traits and outlooks, and a collective experience that would bind society together. Columbia’s other famous core course, “Literature Humanities,” was organized initially by English professor John Erskine, who was worried that new immigrants, and especially Jewish students, would not share in the common culture of the “great Anglo-Saxon writers.” In 1934, Jacques Barzun and Lionel Trilling (a student of Erskine’s) revived this as “The Colloquium in Important Books.”
Harvard’s core originated in a 1945 report “General Education in a Free Society,” which became known as “The Redbook” (because of its crimson binding). Harvard President James Bryant Conant (who served from 1933 to 1953) discovered that the elective system had, indeed, created more courses, but he also hoped to create a meritocracy and began using the new SAT test for admissions. Conant thought the elective system was too easy to game and not integrated enough. But “The Redbook” also had a clear social motive to give students “a common … understanding of the society which they will possess in common” as Americans at the beginning of the Cold War. General education was, in other words, driven by fears of increasing social mobility and declining moral authority in a time of national crisis.
To make a long story very short, as both the canon and society were opened up in the 1960s, curricular cores had to change; they became about method and learning how to learn. Brown introduced “Modes of Thought” courses in 1969, and Harvard created a core in 1970 requiring students to take courses in 7 of 11 areas (still extra-departmental). In 1974, Michigan introduced “Approaches to Knowledge.”
Some of this also represents a crisis of confidence in what the “core” knowledge or context might be for all students, which was also part of the 1960s revolution. As the college population and the faculty began to diversify, scholars began to study new and more diverse traditions. Faculty and students specifically rejected many of the common “value judgments” of the old core. The same books assigned to bind us together could also alienate.
In the same way that Barzun and Trilling at Columbia taught books that were “important” (and not because of great truths they contained), the renamed “Western Civ” programs justified their core texts simply as a common heritage. I taught in a Great Works first-year course at Stanford in the 1980s. While it did include great works by women and colonized people, it was still largely the Western intellectual tradition. I liked that the title made clear it was a value judgment, but I am still of two minds about this issue.
As a musician, I want to teach work I think is great, but I also feel compelled to teach work I think is important (even when it is not to my taste). A work can be “great” for a variety of reasons and within a tremendous variety of aesthetic, cultural, and intellectual systems. (All judgments are institutional or cultural in a way.) As a teacher, I try to help students understand why someone thinks these works are important, and having a wider variety of texts and cultures is helpful in that. (It demonstrates, for example, that there are many ways to evaluate greatness.) But I also think understanding the Western intellectual tradition is important—and useful!—for living in the West, in the same way that being able to write clear standard English is useful.
Teaching in England taught me that I write like an American: One of my British department chairs took the time to point out that my style was “far too breezy, direct, and concise.” I was encouraged to be “less Hemingway and more scholarly”—even in email! My point is that I learned to recognize the value of different styles without having to prioritize one as best. I still speak to my jazz friends in a very different language than I use in class or in my academic prose. Is there a way to teach both great works and important skills without privileging one tradition? Is there a way to keep one foot in the real world while still fostering that love of learning for its own sake?
Suppose the point of general education (and perhaps even the graduation standard) were the ability to hold two opposing ideas in your mind at once? Previous editions of general education tried both to bridge the gap between the liberal arts and the real world and to prepare students to live in that same real world. Imagine the real-world implications for our nation if opening minds were the outcome of general education.
PART FOUR: Ways of Knowing
For most institutions, this core liberal arts mission happens in the general education curriculum, a unique feature of U.S. higher education.
Today, we have two broad types of general education: the distribution model and the core model. Most colleges use the distribution model, where students get “breadth” by taking a mandated variety of courses within different departments. This gives both faculty and students great flexibility. A core system (like Columbia’s) works differently: Here all students take a set of extra-departmental courses specially designed for this purpose. Many colleges (like Goucher) have a mix of the two.
These two models represent two very different conceptions of the liberal arts. The distribution model says the liberal arts can’t be reduced to any specific body of knowledge. Basically, any and every course in general education will provide students with a way of thinking that is more important than specific content. (I like this idea, but I suspect we don’t connect things as well as we could.) The core model is the reverse: Specific ideas, values, and texts matter—not just any literature course will do. (I like this idea, too, but mostly because of the idea of a common intellectual experience.)
To understand the roots of general education, we need to start in the 19th century, when a bachelor’s degree consisted only of required courses. Before this, there was no general education and there were no majors. In the 1860s, college enrollments were declining, in part because students could choose between a bachelor’s degree and an easier, more professional degree in law, medicine, or science.
When Charles Eliot arrived as the president of Harvard in 1869, half of the law students and almost three-quarters of the medical students had not previously been to college and did not hold undergraduate degrees. At the time, Harvard Law School had no admissions requirements beyond evidence of “good character” and the ability to pay the hundred-dollar tuition. Eliot had the brilliant idea that the bachelor’s degree could be a prerequisite for professional or graduate school (instead of being a separate, equal path). This raised standards and made the professional schools more selective. Eliot had returned from Europe influenced by the German model, where universities of greater disciplinary specificity had become a national economic engine through research, and he introduced both ideas to U.S. higher education. This also led him to create separate professional (graduate) doctoral programs in 1872.
One final Eliot innovation was the introduction of the elective system. If students were going to get professional training later, then the undergraduate degree could be, as he wrote in The Atlantic, “the enthusiastic study of subjects for the love of them without any ulterior objects.” (Eliot’s critics also pointed out that electives brought a loss of coherence, depth, AND breadth.)
All of these innovations—the bachelor’s degree as preparation for graduate school, the creation of autonomous departmental fields of study, the elective system, and an emphasis on research—were widely imitated. They were all expensive, but Harvard could afford it. Eliot understood this, but he bargained that faculty would support it because they could teach what they wanted. By allowing students to choose professors—Eliot also created the modern course with a name, number, and professor listed—he also hoped teaching would improve. Most of the schools that copied this Harvard model, of course, did not have the same resources, but this collection of ideas has defined U.S. higher education ever since.
This kind of curricular freedom radically changed U.S. higher education. The absolute separation of undergraduate education from the vocational professional schools, and the separation of knowledge into individual disciplines, triumphed, created the U.S. liberal arts ideal, and opened the door for a massive expansion of bachelor’s degrees. However, it also created a stubborn resistance to connecting undergraduate learning across disciplines and with the “real world.” General education was invented to try and bridge the gap between both the disciplines inside the academy and the world beyond its walls.
PART THREE: Education and the “real world”
At Goucher College, we are rethinking our curriculum. Specifically, we’re asking: What is the core of a liberal arts degree? What is general education in that context? And what is the relationship between a major and general education? Perhaps students should only declare a mission and not a major? We are doing so amidst a daily tidal wave of criticism and defense of the liberal arts.
A lot of money is being spent trying to convince people that the liberal arts are good for students and for the country. (Read the Council of Independent Colleges’ argument and the student blogs about the benefits of a liberal arts education that the Mellon Foundation is supporting.)
I hope these kinds of efforts help get the message out about the value of the liberal arts, but I suspect it’s a bit like the government encouraging you to eat more vegetables: expensive and noble, but unlikely to work by itself.
There are also plenty of fairly radical suggestions for reinventing higher education, and plenty of recent books on the topic, including Andrew Delbanco’s College: What It Was, Is and Should Be; Clayton Christensen’s The Innovative University: Changing the DNA of Higher Education From the Inside Out; and Louis Menand’s The Marketplace of Ideas: Reform and Resistance in the American University, to name just a few.
To me, the most attractive prospect (and the approach we are taking at Goucher) is to try to reframe the liberal arts as the degree of the future. Technology and the increased pace of knowledge creation have changed our world, but they only underscore the need to focus on creating graduates who are critical thinkers who understand that new information is not just stockpiled: It changes you.
Critical thinking, discernment, and analysis have long been at the core of a liberal arts education. The change from a world of limited but relatively reliable sources (encyclopedias and books), to one with many more sources of less reliable information (webpages, postings, tweets, and memes) has fundamentally changed our relationship with knowledge and learning. While our phones all have access to more information than any classroom, calling this device a “smart” phone has confused quantity of information with knowledge. Smart people are not the ones who know the most; they are the people who know how to change their minds.
The jobs of the future belong to those who are more than just critical thinkers. Most of the information students will need in the future, we cannot teach them because it has not yet been discovered. More than ever, students will need to continue learning after graduation. So while content remains critical (both for its own sake, but also as a basis for learning and thinking), our real mission is to create voracious, self-regulated learners.
Being self-regulated means you can manage your own assumptions and cognitive processes. It is related to metacognition, which is the conscious control over your own thinking, including what inputs you value; how you interrogate information; and your own resistance to new inputs, or your “immunity to change” (Kegan & Lahey, 2009). Context and integration are key to change. Employers often ask for flexible and creative problem solvers, but this is really more a way of being than a set of skills. At its core, creative thinking is about an openness to new possibilities, the ability to attend to our own intellectual accent, the integration of conflicting ideas, and the willingness to have new knowledge change how we think about the world. A growth mindset is the ultimate job skill.
At Goucher we have crystallized this as a new “3Rs”—relationships, resilience, and reflection. These core ideas came from both the latest cognitive neuroscience research on learning and the Gallup/Purdue survey on lifelong workplace satisfaction: Happy workers and happy people had college faculty who believed in their ability to grow and cared about their learning and future success. When failure and growth are established as a pattern in college, those students continue to be self-regulated learners.
Next semester, we will have an entire themed semester on mindfulness as we look at everything regarding our curriculum and residential experience.
This is the last segment of the three-part blog series, mentioned in a recent video, that is meant to get students and parents to relax about locking in a college major. In the previous blog post, I said college is all about perspective and learning how to learn, and not just the “fun” stuff like art and music. In this post, I want to add this advice:
Students, a little STEM won’t kill you. Your aptitude and happiness matter, and you need to know yourself and what makes you happy. But a job will also increase your happiness. After teaching arts entrepreneurship for years, I know that many music, art, dance, and theater majors hated my class. They hated thinking about taxes and contracts, and they thought having an elevator pitch was stupid. But years later, I continue to get emails and cards telling me that because of their website, they now have a thriving dance studio, or that they came up with a new way to price seats and saved their theater company, or that Yo-Yo Ma liked their elevator pitch so much that he made time for a free lesson. Few people get to follow their passion entirely (that is why most of us have hobbies).
The world needs more artists and humanists, but being able to make some technology, manipulate a spreadsheet, and analyze numerical data is a good investment. Employers want to hire broadly educated graduates. Both colleges and students need to ask the really hard question: Are you really broadly educated if you took only one science course or avoided every course with statistics? A little pain here can provide a real gain.
It is possible to be successful regardless of major, but all of life involves tradeoffs and different levels of risks. Think now about what you really want out of college and make choices accordingly.
Your best hedge against uncertainty is to make the most of the divergent opportunities in college. Go to the best college you can afford: Yes, the more money you borrow, the more you have to pay back, and the more your major will matter. Take the best professors—no matter how early in the morning they teach or how scared you are of the subject. Discover what you love, but ALSO pursue more than just your passions.
I would tell Goucher students—and all students: Become broadly educated. For initial salary and employment, your major matters, but you can alter those statistics with some well-chosen electives. Take some things for fun, but not everything. This is your best opportunity to change what you are good at. Become a professional learner.
Goucher, like all colleges, requires math and science, but do more than the minimum. Learn enough STEM, especially quantitative analysis and coding (http://cew.georgetown.edu/recovery2020), so you can learn more later. (When your children need braces, your priorities will change.) An early internship or a good professional mentor can help you find some useful electives.
This is not an appeal for every philosophy major to double major in business! Do not major in something you hate! But death and taxes are unavoidable, and you should not avoid every business, science, or math course just because it seems boring or it is not your current strength. The same applies to the science majors: Ultimately you will want to understand what your research means to people (the humanities) and be able to explain this (writing and communication). All hard science and no fuzzy discussion will limit your career, and your happiness. Expanding your interests will pay enormous dividends later in life.
I teach music: jazz history. That might seem like a “fun” elective, but think about what music means. Nothing, actually. And yet we think it does. Music (without words) has no concrete meaning—it is abstract—it means only what you or your culture make it mean. That makes music the MOST political art form and the most political cultural artifact you will ever encounter, because all of its meanings are culturally specific.
All of the music you hate (or don’t know) sounds alike to you; so if someone says all country or hip-hop music sounds the same, or all trumpet players sound the same, they don’t know much about those forms of music.
In learning about music or art, you will also learn HOW cultures make meaning: not only how different cultures produce it, but how meaning itself is created. That is a very useful thing if you want to understand what your patients or clients are really trying to tell you. So the arts are more important than you might think, but not because they are “culture.”
The arts are one of the best ways to experience other worlds and ways of creating meaning. They offer an entrance to hundreds of other ways of knowing, and your future will require that you understand a variety of perspectives. No wonder STEM students are being advised to think about STEAM (STEM + Arts).
One thing is clear: A college degree is more valuable than ever. But at the same time, it is only a baseline. More and more jobs will require a graduate degree AND continued learning. Learn to be more than half a thinker: you need to be both hard and fuzzy. A true liberal arts education like we offer here at Goucher means studying both humanities and science in depth.
This is the second segment of a three-part blog series about why students (and their parents) shouldn’t stress about choosing a major. It complements a video I also recently posted. In the previous blog post, I talked about my own college experience and how it taught me to think in complex ways, and from different disciplinary angles. In this installment, I present recent research to question my own hypothesis (that a college major doesn’t matter), as well as to argue for the values of the liberal arts in the long run.
As a humanist, I am distressed at the recent research that demonstrates both that your major matters (for starting salary and unemployment) and that the humanities do relatively poorly by this measure. Anthony Carnevale at the Georgetown Center on Education and the Workforce found that the four majors that provided the least economic rewards were (from the bottom up) psychology and social work, education, arts, and humanities, with engineers and computer scientists making significantly more money right away and even after graduate school. Carnevale and his colleagues have repeatedly discovered that your choice of major substantially affects employment prospects and earnings. Majors that are linked to occupations generally provide better employment, and being able to make technology is better for finding employment than just being able to use it. STEM graduates have lower unemployment than do arts and humanities grads.
Humanists have decided to do their own studies, and the AAC&U found (http://www.aacu.org/leap/documents/nchems.pdf) that, over time, humanities and social science majors actually out-earn those in professional and pre-professional majors during the peak earning years (ages 56-60). This was widely reported as “Liberal Arts Grads Win Long-Term,” but even in these articles, the data are explicit that STEM grads make more money—even when you average in all of those graduate degrees. With no graduate degree, humanities majors (on average) close the gap, but they never match the lower unemployment rates of professional, pre-professional, and STEM majors.
As a scientist, I look at the data and question my hypothesis. It is fine to argue for the values of the liberal arts (even in Forbes), but we also need to look at the evidence.
The most recent study (this time from three Yale economists) tells us that the difference in starting pay gets extended in a recession. They argue first that poor labor markets disrupt early careers by reducing wages and employment. They also found that while your chances of full-time employment eventually even out over seven years, the negative wage effect persists—forever. These effects are differential across majors, and the initial wage gaps across majors widen and persist in a recession. Apparently, these effects roughly doubled in the recent Great Recession.
We should not ignore this evidence. When it comes to employment and wages, engineering is a better bet than history. If you are an English major seeking a higher salary, law school is a good choice. While I’d like everyone to be able to follow their passion, that is surely easier to do if your parents can afford it. If you need to pay for college, you cannot ignore the data on majors. (By the way, the reason I was a chemistry major? My mother insisted.)
As a humanist, however, I ask what all this means. Employers continue to say that they want critical thinkers and lifelong learners, which Goucher students are taught to be by the time they graduate. Jobs are changing faster than they ever have. Future employers will reward those who can adapt and continue to learn. There are plenty of opportunities to learn technical skills (https://developers.google.com/university/) for free on the Internet. A technical or professional major will probably continue to provide an initial wage and employment boost, but unless you keep learning, those skills will eventually be supplanted by others. College is about perspective and learning how to learn, which we emphasize a lot here at Goucher.
You probably learn somewhat differently in the humanities and the sciences. While the humanities are not merely qualitative, and the sciences are much more than quantitative analysis, thinking about proficiency in both is useful. You will need to continue learning in both ways to have the life you want.
Read Part III here
I recently sent out a video message to the members of Goucher’s Class of 2018 and their families. In it, I try to allay the academic anxiety that paralyzes students and parents alike. In this three-part blog series, I will explain why there’s no reason to worry and why a student’s major takes a back seat to his or her ability to think critically, using a variety of perspectives that are enriched by a true liberal arts education. Here’s the first part:
I had nine majors in college. I tried Japanese, physics, comp lit, and more, before graduating in chemistry (and one course shy of a double major in ancient history). I still felt “uneducated,” so I did an M.A. in “humanities” (what we jokingly called the “Evelyn Wood School of Western Culture”). The humanities and the sciences BOTH remain critical to my thinking.
In the sciences, I learned to look at data and crunch numbers. I learned to ask better questions and to be suspicious of theories (and results that adhered too closely to theories). Most importantly, if the data did not support the hypothesis, I learned I needed to change my mind and really look at what the data were telling me. Science taught me how to have an open mind.
The humanities taught me that analysis is always influenced by perspective and that everything has meaning to someone. I learned to ask better questions, be suspicious of theories (and results that adhered too closely to theories) and most importantly, that if the data did not support the hypothesis, I needed to change my mind and look for what someone else might see in the data. The humanities also taught me how to have an open mind.
Data are awesome, but (almost) all of them are relative, and meaning can be found (almost) anywhere for someone. Both scientists and humanists make the really big discoveries when they look at a problem from a new perspective and ask a new question. In a way, these lessons can (and should) be learned in any major, but I find that without an education in both humanities and sciences, you are only half a thinker.
My major didn’t really matter. College taught me how to think in complex ways, and it did that by forcing me to look at problems from different disciplinary angles. We call the different majors “disciplines,” and disciplines imply focus. Students often think that the different departments study different types of things, but it is really that each asks different types of questions. A discipline constricts the sorts of questions you ask of data. That focus is essential for deep understanding, but we (in academia) also recognize that change of perspective is also useful, and so we are constantly creating “interdisciplinary” or “multidisciplinary” programs.
Most colleges, Goucher included, also try to balance these needs by having both a broad interdisciplinary core and majors that allow for more depth in one discipline. It sounds good, but general education requirements such as Goucher’s liberal education curriculum are often seen as a chore to be finished before students get on to what they really want to do. Because colleges are organized by disciplines, it is also natural that our loyalty and attention tend to be on “our” students in “our” majors.
The jobs of the future (and indeed happiness in your future) will probably require both qualitative and quantitative work. Most colleges will let you major in one or the other (humanities or sciences in very broad terms), but your thinking and your potential will be advanced by depth in both areas.