‘Kenough’: Is Barbie more revolutionary for men than women?

By Clay Bonnyman Evans | March 7, 2025

CU Boulder PhD student’s paper argues that the hit film exemplifies ‘masculinity without patriarchy’ in media


M.G. Lord, author of Forever Barbie: The Unauthorized Biography of a Real Doll and co-host of the podcast LA Made: The Barbie Tapes, describes Greta Gerwig’s Oscar-winning, box-office behemoth Barbie as “incredibly feminist” and widely perceived as “anti-male.”

Meanwhile, conservative critics rail that the movie is “anti-man” and full of “beta males” in need of a testosterone booster. Conservative British commentator Piers Morgan called it “an assault on not just Ken, but on all men.”

 

CU Boulder PhD student Julie Estlick argues that Greta Gerwig's award-winning film Barbie is "a really good film for Ken."

But CU Boulder women and gender studies doctoral student Julie Estlick sees things differently. In a recent paper published in Feminist Theory, she argues that the movie is “a really good film for Ken.”

On first viewing, Estlick noticed a woman nearby having a “very visceral, emotional response” to the now-iconic monologue by actor America Ferrera, which begins, “It is literally impossible to be a woman.”

Estlick herself wasn’t particularly moved by the speech, and walking out of the theater, she realized she didn’t see the movie as a clear-cut icon of feminism.

“I really questioned whether the film was actually about Barbie, and by extension, women, at least in the way people were claiming,” she says.

Once Barbie was available for streaming, Estlick took a closer look and arrived at a heterodox conclusion:

“Barbie is not anti-man; it is pro-man and is not necessarily a revolutionary film for women, at least not as much as it is for men,” she writes in the paper’s abstract. “This is because Barbie espouses non-hegemonic masculinity through cultural critiques that are rare to see in popular media.”

Hegemonic vs. toxic masculinity

For Estlick, “hegemonic masculinity” is a kind of stand-in for the “toxic masculinity” so often featured in media: superheroes, gangsters, vigilantes, killing machines who are also “lady killers.” Always strong, rarely emotional, such men are absurdly impermeable to harm, and sport chiseled features and perfectly sculpted abs, she says. Yet many are also “man children” whose “ultimate prize” is to have sex with a woman.

“That kind of media comes at the expense of women, works against women, and often oppresses women by sexualizing and objectifying them,” Estlick says.

 

In the film Barbie, the patriarchy ultimately doesn't serve the Kens any more than it does the Barbies, argues CU Boulder PhD student Julie Estlick. (Photo: Warner Bros. Pictures)

Non-hegemonic masculinity is strong without being oppressive, and supportive and protective of women without regard to any quid pro quo. It allows for men to openly express emotions and vulnerability and to seek help for their mental-health struggles and emotional needs without shame, while retaining their strength, vitality and masculinity.

“It does the opposite of hegemonic masculinity,” Estlick says. “It works alongside women and doesn’t harm them in any way.”

The Kens are first represented in the movie as clueless accessories to the ruling Barbies of Barbie Land. But after Beach Ken (Ryan Gosling) and Stereotypical Barbie (Margot Robbie) find a portal to our world, Beach Ken returns and establishes a patriarchal society in which women become mindless accessories to hyper-competitive men in the thrall of hegemonic masculinity.

But ultimately, the patriarchy doesn’t serve the Kens any more than the Barbies.

“As people always say, men’s worst enemy under patriarchy isn’t women. It’s other men and their expectations, who are constantly stuffing men into boxes,” Estlick says.

Which isn’t to say that women don’t also enforce strictures of hegemonic masculinity.

“When little boys are taught to suppress emotions, little girls are watching. They are watching their fathers, and fathers onscreen, acting in certain ways,” Estlick says. “Girls internalize toxic ideologies the same ways boys do.”

Allan the exception

In Barbie, there is just one male who stands apart from Kendom: Allan, played by Michael Cera.

“Allan is positioned as queer in the film in that he is othered but not less masculine in the traditional understanding of the word,” Estlick writes. He “deviates from the conventional canon of masculinity” and “uses his masculinity for feminism and to liberate women while also protesting patriarchy.” 

Allan doesn’t fit into Kendom, with or without patriarchy. As the narrator (voiced by Helen Mirren) notes, “There are no multiples of Allan; he’s just Allan.”

The character is based on a discontinued Mattel doll released in 1964, intended to be a friend to Ken. Fearing the friendship might be perceived as gay, the company swiftly removed Allan from store shelves, later replacing him with a “family pack” featuring Barbie’s best friend Midge as his wife, and a backstory that the couple had twins.

In the film, non-toxic Allan is immune to patriarchal brainwashing and sides with the Barbies in re-taking Barbie Land.

 

“(T)he film can be understood as a vital framework for masculinity that allows for vulnerability, emotion and heterosexual intimacy among men,” says researcher Julie Estlick. (Photo: Warner Bros. Pictures)

“Right off the bat we see (Allan) as queered from the rest of the Kens and Barbies,” Estlick says.

But Beach Ken, too, eventually senses that he’s not happy in the patriarchal society he has created. In one of the movie’s final scenes, a tearfully confused Beach Ken converses with Stereotypical Barbie from a literal ledge:

“You have to figure out who you are without me,” Barbie tells him kindly. “You’re not your girlfriend. You’re not your house, you’re not your mink … You’re not even beach. Maybe all the things that you thought made you aren’t … really you. Maybe it’s Barbie and … it’s Ken.”

In other words, Barbie is rooting for Ken to claim his individuality.

“Beach Ken’s house, clothes, job and girlfriend all represent boxes that society expects men to tick, but this scene illustrates that it is okay to deviate from normative behaviors of masculinity and that manhood is not solely defined through heteronormative bonds and behaviors,” Estlick writes. And “it is acceptable for men to admit to a woman that they need help.”

Barbie is pure, candy-colored fantasy. But in our world, Estlick believes it points the way toward more non-toxic media representations of masculinity that could ultimately contribute to better mental health for men trapped in a “man box” — as well as for women who have borne the burden of men’s self- and societally imposed strictures on their own humanity.

“(T)he film can be understood as a vital framework for masculinity that allows for vulnerability, emotion and heterosexual intimacy among men,” she concludes. It “(opens) the door to the creation of more media that subverts societal expectations of toxic masculinity.” 


Did ChatGPT write this? No, but how would you know?

By Collette Mace | March 3, 2025

In her Writing in the Age of AI course, CU Boulder’s Teresa Nugent helps students think critically about new technology


One of the most contentious subjects in academia now is the use of AI in writing. Many educators fear that students use it as a substitute for their own writing. And while students fear that they’re going to be accused of using it instead of doing their own critical thinking, some still use it anyway.

Some students, like their instructors, fear what AI is capable of, and they are highly uncomfortable with the risks associated with its use.

 

Teresa Nugent, a CU Boulder teaching associate professor of English, invites students in the Writing in the Age of AI course to experiment with AI as part of their writing process and critically reflect on how these tools influence their ideas.

Teresa Nugent, a CU Boulder teaching associate professor of English, has seen all these perspectives. When she first read a 2023 essay by Columbia University undergraduate Owen Kichizo Terry, she knew that it was time for educators and students to better understand AI use in writing, even though it was scary.

Two years later, she is in her second semester of teaching ENGL 3016, Writing in the Age of AI. In this course, Nugent invites students to experiment with AI as part of their writing process and critically reflect on how these tools influence their ideas. Her students have conversations with chatbots about topics that they know well and evaluate whether the bots actually know what they’re talking about.

Nugent says she hopes that taking a class in which they are encouraged to talk about AI use allows students to explore possibilities, play with these tools, test their capabilities and determine how best to use them. By teaching students how to use AI as a tool to help develop their critical thinking skills instead of just avoiding that hard work, Nugent aims to prompt students to think about the wider implications of AI, and where it can ethically fit into an academic curriculum.

“We as educators have an obligation to help our students develop the skills that they’re going to need in the world that is developing around all of us,” Nugent says. “If we try to pretend AI isn’t here, we are doing students a disservice. We need to find ways to inspire students to want to learn; we need to spark their curiosity and motivate them to find meaningful connections between course content and the world.”

Mixed feelings about AI

Not all students are enthusiastic about AI. Nugent explains that, since the class fulfills an upper-level writing requirement, she has students of all different majors and experience levels. Many students, she notes, come in with a great deal of apprehension about using AI, something the class discusses openly on day one.

Nugent asks her students to think of a story they’ve been told—often by a parent or grandparent—about what life was like before some commonplace technology—like cell phones or the internet—was invented.

 

“If we try to pretend AI isn’t here, we are doing students a disservice,” says Teresa Nugent, CU Boulder teaching associate professor of English.

Someday, she reminds her students, they'll tell stories about what the world was like before generative AI. New technology is always emerging, and the best way to adapt to the changing world is to keep learning about it, she says.

Nugent also acknowledges the real risks that come with AI use. She offers students a plethora of readings expressing a range of perspectives on the subject—including concerns about the unintended consequences of technological innovations and Mustafa Suleyman’s warning about the need to contain AI in his book The Coming Wave. Students read writings about how current educators have grappled with the release of AI chatbots, as well as science fiction depictions of AI, including the film Her and the dystopian series Black Mirror.

Students also read texts about the harmful effects of AI on the environment, the issues of class and social justice that are entangled with AI use and psychological studies concerning AI.

Overall, Nugent says she wants students to leave the class with an informed understanding of AI. For their final project, students are required to research an aspect of AI in which they are particularly interested.

She says this leads to a wide array of research topics, often based on students’ majors; for example, an environmental studies major might research how to use renewable energy sources to power data centers. After writing academic papers, students reframe their research into a “blog” format that a general audience would find easily understandable.

“Knowledge is power,” Nugent says. “Being well informed about something always gives one more of a sense of agency than not being informed.” Ultimately, Nugent says she hopes that students will leave the class feeling confident and prepared to offer their knowledge about AI to society and keep themselves and others informed about this moment in technological history.


Schmooze-a-Palooza to celebrate community, song and Hebrew

By Rachel Sauer | February 28, 2025

CU Boulder event, now in its 11th year, will schmooze it up on March 12


For the past decade, Hebrew classes at CU Boulder have hosted a novel event described as a rite of passage: the Schmooze-a-Palooza—part concert, part community building and part celebration of Hebrew and song.

The 11th-annual Schmooze-a-Palooza will be held at 6:30 p.m. Wednesday, March 12, in UMC Room 235. Anyone with an interest in Hebrew is invited.

 

  What: 11th-annual Schmooze-a-Palooza

  When: 6:30 p.m. March 12

  Where: UMC Room 235

  Who: Anyone with an interest in Hebrew is invited.

Led by Eyal Rivlin, a teaching professor of Hebrew language in the Program in Jewish Studies and a professional musician, students in each class prepare a well-loved song in Hebrew—memorizing it, dressing up, creating a dance and performing it in front of their peers. 

Having taught in different capacities for more than 30 years, Rivlin wanted to extend learning beyond the classroom, help the different classes connect and inspire lifelong friendships.

“When we show up in creative and expressive manners, with permission to embody our inner rockstar, a vulnerability is tapped which sets the groundwork for connecting at deeper levels,” says Rivlin. “It is clear to me that in 20 years from now, many of my students will remember singing with their friends, taking a risk and showing up together and having fun in the context of learning a language.”

Through the years, the concert has expanded and now, in addition to class performances, Jewish Studies faculty offer a song from the stage, some students volunteer to perform solos and duets of their favorite Hebrew songs and members of the local Hebrew-speaking community prepare a song as well. This year there is even talk about a flash-mob dance, Rivlin says. 

Students have said that the event is a highlight of their 鶹Ѱjourney. Songs are a great way to expand vocabulary and memorize sentences and expressions. They also offer the community a taste of different cultural themes and musical styles.

This annual live concert is free and an opportunity to meet new friends, learn some Hebrew expressions and cheer fellow Buffs, Rivlin says.


Working with Data for Social Change symposium set for March 14

By Rachel Sauer | February 28, 2025

The all-day event will bring together local and national scholars engaged in digital public humanities projects to advocate for social change


The Data Advocacy for All project on the CU Boulder campus is sponsoring a one-day Working with Data for Social Change symposium on March 14.

This all-day event brings together local and national scholars who are engaged in digital public humanities projects advocating for social change and who have worked to strengthen ethical data humanities education in higher education, said Laurie Gries, associate professor of English and director of the Program for Writing and Rhetoric, who is spearheading the symposium.

 

  What: Working with Data for Social Change symposium

  When: March 14

  Where: In person at CASE KOBL 140 and online

  Who: All faculty, staff and students who want to learn more about the data humanities are invited.

The symposium aims not only to demonstrate and underscore the value of data advocacy research for the humanities at large, but also to generate collective ideas as to how data advocacy education can be enhanced across the disciplines in higher education, according to Gries.

She said she believes the symposium will be of interest to faculty, staff and students who want to learn more about the data humanities and, more particularly, about data advocacy as a focus of research and/or pedagogy. Those interested in attending in person or via Zoom can register.

The symposium will feature scholars and activists from around the country, including Melissa Borja, Nasreen Abd Elal and Sylvia Fernández Quintanilla, each of whom has advocated with data for social change through their respective projects. Additionally, Gries will talk about her own data-driven project, which was recently profiled in Colorado Arts and Sciences Magazine.

Gries said the symposium also will feature scholars who have worked intently to build data humanities education within and beyond the CU Boulder campus. For instance, in addition to a talk by featured speaker Melanie Walsh, David Glimp, Nathan Pieplow and other CU Boulder and CU Denver professors will speak about their efforts to train students how to engage data through critical, humanistic frameworks and how to use data effectively to address matters of significance to them and their communities.

Speaking of Gries’ efforts to spearhead the symposium, Glimp said, “Laurie has assembled a terrific team of collaborators to develop her vision of not only cultivating data literacy among our students but also equipping students with the tools to argue with data. By ‘arguing with data,’ I mean both being able to identify and assess all the ways data-backed arguments can mislead or go wrong, and being able to craft effective, responsible arguments with data about matters of the greatest urgency for our world.”

The Data Advocacy for All project was the recipient of a $300,000 CU Next Award in May 2022.


It’s a bird! It’s a plane! It’s another superhero film!

By Doug McPherson | February 19, 2025

Following a blockbuster opening weekend for Captain America: Brave New World, CU Boulder’s Benjamin Robertson reflects on the appeal of superhero franchises and why they dominate studio release schedules


Captain America continues to conquer obstacles and crush villains. Not bad for a man approaching age 85.

The comic book hero made his debut in print in December 1940, then on TV in 1966, and hit the silver screen in 2011, gaining massive momentum along the way. This past Presidents Day weekend, the fourth installment of the superhero series, “Captain America: Brave New World,” hit the top spot at the box office in the United States.

 

Benjamin Robertson, a CU Boulder assistant professor of English, notes that superhero franchises are comforting in their repetitiveness.

It’s the fourth-best Presidents Day launch on record, behind three other superhero movies: Black Panther, Deadpool and Ant-Man and the Wasp: Quantumania.

What’s going on here? What’s giving Captain America his muscle? And why do folks keep going back to these same stories, characters and worlds over and over?

Benjamin Robertson, a CU Boulder assistant professor of English who specializes in popular culture, film and digital media, says there are two answers. “One, the genre is comforting in its repetitiveness. This is the least interesting answer, however,” he says.

The second answer appears a little more sinister. Robertson says viewers return to these stories because creators make “story worlds that solicit consumers’ attention and that must always grow and that turn increasingly inward.”

He says the first Iron Man film is about America intervening in the Middle East following Sept. 11, but later MCU (Marvel Cinematic Universe, the franchise behind many superhero movies) films seem less and less about real or historical matters and more about the MCU itself.

“As a colleague once put it, every MCU film is simply the trailer for the next MCU film, the result of a strategy that seeks to create a fandom that can’t escape from the tangled narrative that the franchise tells,” he explains.

In short, Robertson says if consumers want to know the full narrative—the full world that these films and series describe—they have to go to the theater. “As this world becomes about itself rather than about external history or real-world events, a certain ‘lock in’ manifests, making it harder and harder to not see these films if one wants to understand the world they create.”

‘Flatter American identities’

 

Actor Anthony Mackie plays the titular Captain America in Captain America: Brave New World. (Photo: Marvel Studios)

Another trick is that MCU films tend to “flatter American identities” by celebrating militarism, focusing on charismatic heroes who try to do the right thing unconstrained by historical necessity and suggesting that everything will work out in the end, Robertson says.

“I can see the more comforting aspects of these films having appeal to many consumers. Don’t fear climate change, fear Thanos [a supervillain] and other embodiments of badness,” he says.

As to the question of whether franchises are just growing their worlds and the characters in them, or retelling the same story because it makes money, Robertson says each MCU film is a piece of intellectual property, but an individual film is far less valuable than a world.

“A film might spawn a sequel or sequels, but without developing the world, the sequels will likely be of lesser quality and, eventually, no longer be profitable or not profitable enough to warrant further investment,” Robertson says. “But if producers develop the world into a complex environment that contains numerous characters with distinct and yet intersecting story arcs, well, then you have the foundation for potentially unlimited storytelling and profit in the future.”

He adds that in that context, Captain America has obvious value as an individual character, but he has far more value as part of a world that can develop around him and allow for new actors to play him as he evolves with the world.

So, as the world grows as an intellectual property and in narrative development, "so does the potential for profit, although we may now be seeing the limits of this dynamic as some MCU films have not been doing as well at the box office over the past five years, although there are likely several factors that contribute to this decline.”


How ardently we admire and love 'Pride and Prejudice'

By Collette Mace | February 14, 2025

Are Elizabeth Bennet and Mr. Darcy the greatest love story? CU Boulder’s Grace Rexroth weighs in


What is the greatest love story of all time?

This is a question many like to consider, discuss and debate, especially around Valentine’s Day. Whether you’re more of a romantic at heart or a casual softie, you’ve more than likely heard or expressed the opinion that there is no love story quite like Elizabeth Bennet and Mr. Darcy in Jane Austen’s Pride and Prejudice.

Despite being more than 200 years old, something about this classic novel transcends centuries and social changes to remain a text with which many people connect, whether on the screen, stage or in the pages of the novel.

 

Grace Rexroth, a CU Boulder teaching assistant professor of English, notes that Pride and Prejudice has captivated audiences for more than two centuries in part because it appeals to what people—specifically women—have wanted and fantasized about through different eras following its publication.

What makes this love story so memorable and so beloved? Is it truly the greatest love story of all time, or is there something else about it that draws readers in again and again?

According to Grace Rexroth, a teaching assistant professor in the CU Boulder Department of English who is currently teaching a global women’s literature course focused on writing about love, the historical context in which Jane Austen wrote Pride and Prejudice is crucial to understanding the novel's inner workings.

The Regency Era was a period of intense revolution and change. There still were very strict social norms surrounding marriage and status, which are evident in the novel, but it’s also important to consider that proto-feminist ideals, such as those expressed by Mary Wollstonecraft, were influencing conversations about the position of women in society, Rexroth notes.

Even at the time of publication, Pride and Prejudice was perceived differently by opposing political groups—more conservative thinkers saw it as a story that still rewarded conservative values, such as humility, beauty (always beauty) and a reserved disposition. Other, more progressive readers saw it as standing up to the status quo.

To this day, readers and scholars often debate whether Austen was writing to criticize or praise Regency Era ideas about women’s autonomy. In The Making of Jane Austen, author Devoney Looser observes, “It sounds impossible, but Jane Austen has been and remains a figure at the vanguard of reinforcing tradition and promoting social change.”

Nuance helps it endure

The fact that Pride and Prejudice lends itself to different interpretations is part of the reason why it’s lived such a long life in the spotlight, Rexroth says. It has managed to appeal to what people—specifically women—have wanted and fantasized about through different eras following its publication.

According to Looser, both film and stage adaptations have highlighted different aspects of the text for different reasons. During its first stage adaptations, for instance, the emphasis was often placed on Elizabeth’s character development. In fact, the most tense and climactic scene in these early performances was often her final confrontation with Lady Catherine De Bourgh, when Elizabeth asserts that she’s going to do what’s best for herself instead of cowering under Lady Catherine’s anger at her engagement to her nephew, Mr. Darcy.

Such scenes emphasize Elizabeth’s assertiveness and self-possession in the face of social pressure. Featuring this scene as the climax of the story is quite different from interpretations that focus on the suppressed erotic tension between Elizabeth and Darcy.

This doesn’t mean that adaptations prioritizing the romantic union didn’t soon follow. In 1935, Helen Jerome flipped the narrative on what Pride and Prejudice meant to a modern audience by casting a young, conventionally attractive man to play Mr. Darcy. Looser refers to this change as the beginning of “the rise of sexy Darcy,” a phenomenon that has continued in the nearly 100 years following this first casting choice.

In many ways, the intentional decision to make Mr. Darcy physically desirable on stage coincided with the rising popularity of the “romantic marriage”—a union founded on love and attraction rather than on status and societal expectations. Before this, Mr. Darcy’s being handsome was just a nice perk to Elizabeth, not a clear driving force for her feelings towards him.

 

Matthew Macfadyen (left) as Mr. Darcy in the 2005 film Pride and Prejudice. Some critics argue that the film over-dramatized the first proposal scene. (Photo: StudioCanal)

From loathing to love

This is not to say there’s no implication of attraction in the original novel, though. There’s something magnetic about Darcy and Elizabeth’s relationship from the very beginning, when they profess their distaste for each other as the reigning sentiment between them (though readers can see that Elizabeth really doesn’t seem to mind being insulted by Mr. Darcy until later in the novel). It’s a quintessential “enemies to lovers” narrative, Rexroth says.

In that way, the novel offers a hint of the unruly desires driving many creative decisions in most modern film adaptations—from the famous “wet shirt” scene in the 1995 BBC adaptation with Colin Firth and Jennifer Ehle, to what some critics argue is a highly over-dramatized first proposal scene staged in the rain in the 2005 Keira Knightley version. That sense of tension between Elizabeth and Darcy, unsaid but palpable, is a draw that has reeled in modern audiences to the point of obsession.

Rexroth suggests that part of the novel’s appeal hinges on what can and cannot be expressed in the text: “Because discussions of sex and desire are fairly repressed in the novel, emotional discourse has more free rein, which is often appealing to modern readers who experience a reverse set of tensions in modern life. Modern discourse, while often privileging a more open discussion of sex, often places tension on how and why we express emotion—especially in romantic relationships.”

Modern sexual liberation, especially through the eyes of women, has been an integral part of feminist movements. However, feminism also offers reminders that when the world still is governed by misogynistic ideas about sex—including women as the object and men as more emotionally unattached sexual partners—key aspects of what sex can mean from an anti-misogynist viewpoint are lost.

This, perhaps, is one reason that Pride and Prejudice is so appealing to women battling standards of sexuality centered around patriarchy, and who find themselves longing for something more—a “love ethic,” as author bell hooks called it.

However, is Pride and Prejudice really a perfect example of a "love ethic”? Rexroth also asks her classes to consider the pitfalls of how readers continue to fantasize about Pride and Prejudice, potentially seeing it as a model for modern romantic relationships.

Questions of true autonomy

While Elizabeth exercises her autonomy and free choice by rejecting not one but two men, standing up to Lady Catherine and overall just being a clever and witty heroine, she is still living within a larger society that privileges the status of her husband over her own and sees her value primarily in relation to the ways she circulates on the marriage market.

 

Jennifer Ehle (in wedding dress) and Colin Firth as Elizabeth Bennet and Mr. Darcy in the 1995 BBC adaptation of Pride and Prejudice. For many fans, the "perfect ending" with the "perfect man" is part of the story's longstanding appeal. (Photo: BBC)

For that reason, women are never really autonomous, Rexroth says. How can they be, when Elizabeth’s decision to reject a man could potentially ruin her life and the lives of her sisters? Or when her sister Lydia’s decision to run away with Mr. Wickham nearly sends the entire family into ruin? What happens to Elizabeth in a world without Darcy?

This, according to Rexroth, is the danger of looking at Pride and Prejudice uncritically. Though readers and scholars may never know if Austen meant it to be a critical piece about the wider societal implications of the marriage market—although it can be inferred pretty strongly that she did mean it that way, Rexroth says—it does have startling implications for the modern relationships we tend to find ourselves in.

“Modern discussions of love often focus on the individual, psychological aspects of relationships rather than the larger social networks that structure them,” Rexroth explains. “My students sometimes think that if they just work on themselves, go to the gym and find the right partner, everything will be okay—they’re not always thinking about how our larger social or political context might play a role in their love lives.”

The fantasy of Pride and Prejudice tends to reinforce this idea, she adds. It’s not that the world needs to change—the fantasy is that finding the right man will “change my world.” Such fantasies tend to treat patriarchy as a game women can win if they just play it the right way, Rexroth says. If a woman finds the right man or the right partner, that man will somehow provide the forms of social, economic or political autonomy that might otherwise be lacking in a woman’s life.

Such fantasies sidestep the question of what produces true autonomy—and therefore the capacity to fully participate in a romantic union, she adds.

So, is Pride and Prejudice the ultimate love story? Ardent fans might argue yes—a “perfect ending” with a “perfect man” is the quintessential love story, and who can blame readers for wanting those things? Happy endings are lovely. 

Others, however, might still wish that Mr. Darcy had behaved in a more gentlemanlike manner.


Where is today’s Cool Hand Luke?

By Rachel Sauer | January 24, 2025

In honor of what would have been Paul Newman’s 100th birthday, CU Boulder film historian Clark Farmer considers whether there still are movie stars


Movies did not invent stars—there were stars of theater, opera and vaudeville well before moving pictures—but movies made them bigger and more brilliant; in some cases, edging close to the incandescence of a supernova.

Consider a star like Paul Newman, who would have turned 100 Jan. 26. Despite being an Oscar winner for The Color of Money in 1987 and a nine-time acting Oscar nominee, he was known perhaps even more for the radiance of his stardom—the ineffable cool, the certain reserve, the style, the beauty, the transcendent charisma that dared viewers to look away.

 

“There are still actors we like and want to go see, so I’d say there still are movie stars, but the idea of them has changed,” says CU Boulder film historian Clark Farmer, a teaching assistant professor of cinema studies and moving image arts.

Even now, 17 years after his death in 2008 at age 83, fans still sigh, “They just don’t make stars like that anymore.”

In fact, if you believe the click-bait headlines that show up in newsfeeds every couple of months, the age of the movie star is over. In an interview with Allure magazine, movie star Jennifer Aniston opined, “There are no more movie stars.” And Vanity Fair’s 2023 Hollywood issue included the observation, “The concept of a movie star is someone untouchable you only see onscreen. That mystery is gone.”

Are there really no more movie stars?

“There are still actors we like and want to go see, so I’d say there still are movie stars, but the idea of them has changed,” says CU Boulder film historian Clark Farmer, a teaching assistant professor of cinema studies and moving image arts. “I think that sense of larger-than-life glamor is gone, that sense of amazement at seeing these people on the screen.

“When we think of what could be called the golden age of movie stars, they had this aristocratic sheen to them. They carried themselves so well, they were well-dressed, they were larger than life, the channels where we could see them and learn about them were a lot more limited. Today, we see stars a lot more and they’re maybe a little less shiny and not as special in that way.”

Stars are born

In the earliest days of film, around the turn of the 20th century, there weren’t enough regular film performers to be widely recognized by viewers, Farmer says. People were drawn to the movie theater by the novelty of moving pictures rather than to see particular actors. However, around 1908 and with the advent of nickelodeons, film started taking off as a big business and actors started signing longer-term contracts. This meant that audiences started seeing the same faces over and over again.

By 1909, exhibitors were reporting that audiences would ask for the names of actors and would also write to the nascent film companies asking for photographs. “Back then you didn’t have credits, you only had the title of the film and the name of the production company, so people started attaching names to these stars—for example, Maurice Costello was called Dimples.”

As the movie business grew into an industry, and as actors were named in a film’s credits, movie stars were born. In 1915, Charlie Chaplin conflagrated across screens not just in the United States, but internationally, Farmer says.

 

Rock Hudson and Elizabeth Taylor, seen here in a publicity photo for Giant, were two of Hollywood's biggest stars during the studio period. (Photo: Warner Bros.)

“You could say that what was produced in Hollywood was movies, but studios were also actively trying to produce stars—stars were as much a product as the movies,” Farmer says. “There was always this question of could they take someone who had some talent or some looks or skills like dancing or singing, and would they only rise to the level of extra, would they play secondary characters, or would they become stars? Would people see their name and want to come see the movies they were in?

“Stars have this ineffable quality, and studios would have hundreds of people whose job it was just to make stars; there was a whole machinery in place.”

During Hollywood’s studio period, actors would sign contracts with a studio and the studio’s star machinery would get to work: choosing names for the would-be stars, creating fake biographies, planting stories in fan magazines, arranging for dental work and wardrobes and homes and sometimes even relationships.

For as long as it has existed, the creation and existence of movie stars has drawn criticism from those who argue that being a good star is not the same as being a good actor, and that stars who are bigger than the films in which they appear overshadow all the elements of artistry that align in cinema—from screenwriting to cinematography to acting and directing.

“There’s always been a mixture of people who consider film primarily a business and those who consider it primarily art,” Farmer explains. “Film has always been a place for a lot of really creative individuals who weren’t necessarily thinking of the bottom line and wanted to do something more artistic, but they depended on those who thought about it as a business. Those are the people asking, ‘How do you bring people in to see a movie?’ Part of that can be a recognizable genre, it could be a recognizable property—like a familiar book—but then stars are one more hook for an audience member to say, ‘I like Katharine Hepburn, I like her as an actress and as a person, and she’s in this movie so I’ll give it a try.'

“One of the biggest questions in the film industry is, ‘How can we guarantee people will come see our movie?’ And the gamble has been that stardom is part of that equation.”

Evolving stardom

As for the argument that movie stars cheapen the integrity of cinema, “I don’t think they’re bad for film as an art form,” Farmer says. “Audiences have this idea of who this person is as a star or as a performer, which can make storytelling a lot easier. You have this sense of, ‘I know who Humphrey Bogart is and the roles he plays,’ so a lot of the work of creating the character has already been done. You can have a director saying, ‘I want this person in the role because people’s understanding of who this person is will help create the film.’ You can have Frank Capra cast Jimmy Stewart and the work of establishing the character as a lovable nice guy is already done.”

 

"Faye Dunaway wears a beret in Bonnie and Clyde and beret sales go off the charts. People went to the movies, and they recognized and admired these stars," says 鶹ѰBoulder film historian Clark Farmer. (Photo: Warner Bros.)

As the movie industry evolved away from the studio system, the role of the movie star—and what audiences wanted and expected from stars—also began changing, Farmer says. While there was still room for stars who were good at doing the thing for which they were known—the John Waynes who were excellent at playing the John Wayne character—there also were “chameleon” stars who disappeared into roles and wanted to be known for their talent rather than their hair and makeup.

As film evolved, so did technology and culture, Farmer says. With each year, there were more channels, more outlets, more media to dilute what had been a monoculture of film.

“Before everyone had cable and streaming services and social media, movies were much more of a cultural touchpoint,” Farmer says. “People wanted to dress like Humphrey Bogart or Audrey Hepburn. Faye Dunaway wears a beret in Bonnie and Clyde and beret sales go off the charts. People went to the movies, and they recognized and admired these stars.

“One of the markers of stardom is can an individual actor carry a mediocre film to financial success? Another would be, are there people who have an almost obsessive interest in these stars, to the point of modeling themselves after a star? Stars tap into a sort of zeitgeist.”

However, the growth and fragmentation of media have meant that viewers have more avenues to see films and more ways to access stars. Even when A-listers’ social media are clearly curated by an army of publicists and stylists, fans can access them at any time and feel like they know them, Farmer says.

“Movies are just less central to people’s lives than they used to be,” Farmer says. “There are other forms of media that people spend their time on, to the point that younger audiences are as likely to know someone who starred in a movie as someone who’s a social media influencer. But that’s just a different kind of stardom.

“I think the film industry really wants movie stars, but I’m not sure viewers necessarily care all that much. Again, it’s always the question of, if you’re spending millions and millions of dollars on a product and you want a return on that, how can you achieve that without making another superhero movie or another horror movie? The industry wants movie stars and audiences just want to be entertained.”


That can of beer tastes and lasts better than you think

By Doug McPherson | January 24, 2025

Beer historian and CU Boulder Assistant Professor Travis Rupp explains why canned beer, celebrating its 90th anniversary today, has been ‘immensely impactful’ for the industry


“It's Saturday, y'all, here's a plan
I'm gonna throw back a couple …
Until the point where I can't stand
No, nothing picks me up like a beer can.”

  • From “Beer Can” by Luke Combs

 

"Cans are the best containers for beer," says beer archaeologist and historian Travis Rupp, a 鶹ѰBoulder teaching assistant professor of classics. (Photo: Travis Rupp)

On Jan. 24, 1935, some shoppers in Virginia were likely scratching their heads and gawking at something they hadn’t seen before: beer in cans, specifically Krueger’s Cream Ale and Krueger’s Finest Beer from the Gottfried Krueger Brewing Company. Up until then, beer drinkers had enjoyed their suds in bottles.

Today, canned beer is commonplace, but according to beer archaeologist and historian Travis Rupp, a CU Boulder teaching assistant professor of classics, even though canning would prove to be “immensely impactful” for the industry, neither brewers nor consumers cared much for cans initially.

“There were false claims made about metal flavor leaching into canned beverages because the beer was coming in contact with the aluminum,” Rupp says. “Where this may have been the case with early steel or aluminum cans, it wasn’t true for most of the container's history.”

Rupp adds that even as late as 2015, glass bottles were viewed as better containers for beer, given that they were “nicer” for presentation.

Yet today, cans have emerged as the clear winner in the beer game. A Colorado example: MillerCoors Rocky Mountain Metal Container, based near the Coors campus in Golden, now churns out cans by the millions.

“Cans are the best containers for beer. They don’t let in sunlight or oxygen, which are both detrimental to beer,” says Rupp. “Bottles let in sunlight. Even brown or amber bottles allow a small percentage of ultraviolet rays through, which can skunk or spoil the beer. Bottles also can leach in oxygen through the cap over time as the seal breaks down. Bottles still have a place for cellaring or aging high gravity barrel-aged beers or sours, but if you want your beer to stay and taste fresh the longest, you opt for cans.”

The case for cans

Over the decades, cans have also helped brewers’ bottom lines: “Cans are far cheaper because they’re much lighter to ship,” Rupp explains. “Freight shipping costs are mostly dictated by weight. This ultimately can result in higher profits for breweries and lower costs for consumers. They’re also far, far cheaper to store, since they require far less space than glass bottles and cartons.”

 

The first canned beers were Krueger's Cream Ale and Krueger's Finest Beer. (Photo: Brewery Collectibles Club of America)

Long before cans made their debut, Rupp says some breweries tried replacing wooden casks with metal kegs throughout the 19th century, but no protective liner existed to prevent metallic leaching in these containers. “And given the long duration that beer would sit in the metal casks before serving, the flavor would become quite awful. It wasn’t until the 1960s that stainless steel kegs hit the market.”

About that metallic-flavor-leaching debate, Rupp says aluminum can producers now apply a patented protective liner to the inside of their cans to prevent leaching. “If you cut open a can produced by the Ball Corporation [the global packaging giant], you’ll find … a dull grayish-white crosshatched pattern in the can. This is the protective liner, and I assure you no metal flavor is leaching into your beer.”

But for Rupp, perhaps the most impressive technology comes in what’s called the seaming process on cans. The end (or top) of the can is produced separately. Once the cans are filled, the end is placed on top and goes through a series of rollers and chucks to seam the top of the can.

“This bond is so tight that the sides of the can will fail before the seam does. It’s a really cool advancement in canning technology, as are canning machines in general that work hard to ensure no oxygen ends up in the beer before the cans are sealed. We’ve come a long way from church keys and pull tabs on beer cans.”


Historian Henry Lovejoy wins $60,000 NEH fellowship

By Rachel Sauer | January 15, 2025

NEH funding also was awarded for two other humanities projects at CU Boulder


CU Boulder Department of History Associate Professor Henry Lovejoy has won a $60,000 fellowship from the National Endowment for the Humanities (NEH) to allow him to research and write a book about involuntary African indentured labor between 1800 and 1914.

Lovejoy’s research focuses on the political, economic and cultural history of Africa and the African Diaspora. He also has special expertise in digital humanities and is director of the Digital Slavery Research Lab, which focuses on developing, linking and archiving open-source data and multi-media related to the global phenomenon of slavery and human trafficking.

 

CU Boulder Department of History Associate Professor Henry Lovejoy has won a $60,000 NEH fellowship to research and write a book about involuntary African indentured labor between 1800 and 1914.

Additionally, Lovejoy spearheaded the creation and updating of the Liberated Africans website, a living memorial to the more than 700,000 men, women and children who were “liberated” but not immediately freed in the British-led campaign to abolish African slave trafficking.

The term “Liberated Africans” coincides with a now-little-remembered part of history following the passage of the Slave Trade Act of 1807 by the United Kingdom’s Parliament, which prohibited the slave trade within the British Empire (although it did not abolish the practice of slavery until 1834).

Around the same time, other countries—including the United States, Portugal, Spain and the Netherlands—passed their own trafficking laws and operated squadrons of ships in the Atlantic and Indian oceans to interdict the slave trade.

However, in a cruel twist of fate, most of those “liberated” people weren’t actually freed—but were instead condemned as property, declared free under anti-slave trade legislation and then subjected to indentures lasting several years.

Lovejoy said the NEH fellowship is allowing him to take leave from work to write his book, focused on lax enforcement of anti-slavery laws, migratory patterns of African laborers, their enslavement and subsequent use as indentured laborers around the world from 1800 to 1914.

“I’m deeply grateful for being awarded this opportunity, as the NEH plays such a vital role in supporting the humanities by funding projects that foster our cultural understanding, historical awareness, and intellectual inquiry,” he said.

Meanwhile, Lovejoy said he is also writing a biography about Sarah Forbes Bonetta, a “liberated African” who was apprenticed by Queen Victoria, after conducting research in royal, national and local archives in England, Sierra Leone and Nigeria. Lovejoy also wrote a biography of an enslaved African who rose through the ranks of Spain’s colonial military and eventually led a socio-religious institution at the root of an African-Cuban religion, commonly known as Santería.

 

CU Boulder Professor Patrick Greaney (left) won a $60,000 NEH fellowship to research and write a book about German manufacturer Braun; Wilma Doris Loayza (right), teaching assistant professor in the Latin American and Latinx Studies Center, along with co-project directors Joe Bryan, Leila Gomez and Ambrocio Gutierrez Lorenzo, won a two-year, $149,925 grant to develop course modules and educational resources about Quechua and Zapotec language and culture.

Lovejoy’s NEH fellowship was one of three NEH awards to CU Boulder faculty. Other awards granted were:

Germanic and Slavic Languages and Literatures Professor Patrick Greaney won a $60,000 fellowship to research and write a book about German manufacturer Braun, National Socialism and the creation of West German culture between 1933 and 1975, focusing on Braun from the beginning of the Nazi regime through the 1970s in the Federal Republic of Germany. Greaney’s research focuses on literature, design and modern and contemporary art.

Wilma Doris Loayza, teaching assistant professor at the Latin American and Latinx Studies Center, and affiliated faculty of the Center for Native American and Indigenous Studies, along with co-project directors Joe Bryan, Leila Gomez and Ambrocio Gutierrez Lorenzo, won a two-year, $149,925 grant to develop course modules and educational resources about Quechua and Zapotec language and culture as part of efforts to expand and strengthen the Latin American Indigenous Languages and Cultures program.

The awards to CU Boulder faculty were part of $22.6 million in grants the NEH provided to 219 humanities projects across the country. The awards were announced Tuesday.

“It is my pleasure to announce NEH grant awards to support 219 exemplary projects that will foster discovery, education, and innovative research in the humanities,” said NEH Chair Shelly C. Lowe.

“This funding will strengthen our ability to preserve and share important stories from the past with future generations, and expand opportunities in communities, classrooms, and institutions to engage with the history, ideas, languages, and cultures that shape our world.”


Historian still making a strong case for Black Majority

By Bradley Worrell | January 6, 2025

CU Adjunct Professor Peter H. Wood’s seminal 1974 book on race, rice and rebellion in Colonial America recently celebrated its 50th anniversary with an updated version


If Peter H. Wood wants to stump some University of Colorado history majors about early American history, he’ll ask them which of the original 13 colonies was the wealthiest before the American Revolution and also had an African American majority at the time.

“Often, they will see it as a trick question. Some might guess New Jersey or New York or Connecticut, so most people have no idea of the correct answer, which is South Carolina,” says Wood, a former Rhodes Scholar and a Duke University emeritus professor. He came to the CU Boulder Department of History as an adjunct professor in 2012, when his wife, Distinguished Professor Emerita Elizabeth Fenn, joined the department.

 

Peter H. Wood has been an adjunct professor at CU Boulder for more than a dozen years, following a lengthy career teaching American history at Duke University.

South Carolina colonial history is a topic with which Wood is intimately familiar, having written the book Black Majority, which was first published in 1974 and has been described as seminal. W. W. Norton published a 50th anniversary edition of the book in 2024.

Recently, Wood spoke with Colorado Arts and Sciences Magazine about how he first brought the story of colonial South Carolina to light, reflecting on how the book was received at the time and why this part of history remains relevant today. His responses have been lightly edited for style and condensed for clarity.

Question: How did you become aware of this story of colonial South Carolina, which was unfamiliar to many Americans in 1974 and perhaps still is today?

Wood: I knew when I was an undergraduate that I wanted to study early American history. After a two-year stint at Oxford in the mid-1960s, I came back to Harvard for graduate school.

At that time, the Civil Rights Movement was going on. I’d been very interested in those events, as most of my generation was, and I wanted to see how I could put together my interest in interracial problems with my interest in early American history.

What I found was that early American history was very New England-oriented in those days. Ivy League schools were cranking out people writing about the Puritans, and when they wrote about the South, they would mainly write about Virginia. They talked about Jefferson and Washington. South Carolina had hardly been explored at all. There are only 13 British mainland colonies, after all, so to find that one of them had scarcely been studied was exciting.

Specifically, I was motivated by the Detroit riot, watching it unfold on television in the summer of 1967. Roger Mudd, the old CBS reporter, was flying over Detroit in a helicopter the way he’d been flying over Vietnam. He was saying, ‘I don’t know what’s going on down there.’ I realized that he was supposed to be explaining it to us, but he didn’t really have a very good feel for it himself. No white reporters did.

And the very next morning I went into Widener Library at Harvard and started looking at colonial history books to see if any of them covered Black history in the very early period … and South Carolina was completely blank. So, that was what set me going.

Question: If there wasn’t any significant scholarship about South Carolina prior to the American Revolution, particularly about African Americans living there, how did you conduct research for your book?

Wood: I went to the South Carolina State Archives in Columbia, not knowing what I would be able to find. I understood that if I did find materials, they would be written by the white colonists … because enslaved African Americans were not allowed to read and write. There wasn’t going to be anybody who was African American keeping a diary.

But what I did find was that the records were abundant. That’s partly because these enslaved people were being treated as property; they had a financial value. So, when I would open a book, there would be nothing in the index under ‘Negroes’ (that was the word used in those days). But I would look through the book itself and there were all kinds of references to them. They just hadn’t been indexed, because they weren’t considered important.

At every turn, there was more material than I expected, and often dealing with significant issues. …

And when you’re researching early African American history, you learn to read those documents critically. The silver lining of that sort of difficult research is that it forces you to be interdisciplinary and to use any approach you can.

 

Black Majority by 鶹ѰAdjunct Professor Peter H. Wood was updated for its 50th anniversary in 2024. First published in 1974, the book broke new ground in showing how important enslaved people were to the South Carolina economy in Colonial times.

So, I ended up using some linguistics and some medical history (about malaria) and especially some agricultural history. Most people back then—and most Americans still today—don’t realize that the key product in South Carolina was rice. I argued in this book, successfully and for the first time, that rice cultivation there seemed to have originated with the enslaved Africans. The gist of the book is that these people were not unskilled labor; they were skilled and knowledgeable labor, and it was a West African product (rice) that made South Carolina the richest of the 13 colonies.

Question: With regard to Black Majority, you made the statement, ‘Demography matters.’ What do you mean by that?

Wood: I realized early on that demography was a very radical tool in the sense that it obliges you, or allows you, to treat everybody equally. In other words, to be a good demographer, you have to count everybody: men, women and children, Black and white, gay and straight—everybody counts equally. To a born egalitarian, that was appealing, especially in a period when there were lots of radical ideas bouncing around that I was a little leery of.

But demography seems very straightforward, as in: All I have to do is count people. So, the very title of the book, Black Majority, is a demographic statement. It’s not saying, ‘These people are good or bad’ or anything else. It’s just saying, ‘Here they are.’ It becomes what I call a Rorschach test, meaning it’s up to the reader as to what they want to make out of these basic facts. …

The book—especially in those days—was particularly exciting for young African Americans, because they’d been told they didn’t have any history, or that it was inaccessible.

Remember, this was even before Alex Haley had published Roots. I actually met Alex while he was working on his book, because I was one of the only people he could find who was interested in slavery before the American Revolution. Most of the people who were studying Black history—which was only a very small, emerging field in those days—were either studying modern-day Civil Rights activities and Jim Crow activities, or maybe the Civil War and antebellum cotton plantations.

Question: You initially undertook your research on this topic to write your PhD dissertation. At what point in the process did you think your findings could make for a good, informative book?

Wood: Very early on, I thought I wanted to write a book. I mean, I wanted to be able to publish something and I wanted to start at the beginning. … If I could go all the way back to 1670, when this colony began, and find records, and tell the story moving forward—instead of going backwards from the Civil Rights movement—I wanted to do that.

If I could write a book about that, then it would show lots of other people that they could write a book about Blacks in 18th-century Georgia or 19th-century Alabama, for example. All of those topics had seemed off limits at the time.

So, I was going to start at the beginning and move forward and see how far I had to go to get a book. I thought, ‘I’ll probably have to go up to 1820,’ but by the time I got to 1740, by the time I got through the Stono Rebellion—which was the largest rebellion in Colonial North America, in 1739, and was unknown to people—I had enough for a book.

I had enough (material) for a dissertation so I could get my degree, but I also had enough for a book. And, luckily for me, it was just at the time when there was a lot of pressure on universities to create Black Studies programs, in the late 1960s and early 1970s.

That put a lot of pressure on New York publishers to find books about Black history. And so, Alfred Knopf in New York took the book and gave me a contract within two weeks. I was very lucky in that regard: That was a moment where it was just dawning on everybody that, ‘My goodness! There’s a huge area here where we have not shone a searchlight.’ …

I'll tell you a funny story. At Knopf, they said, ‘You should go talk to our publicity director,’ because they were excited about this book. I walked into her office, and she was this burly, blonde advertising woman. Her face just dropped. She said, ‘Oh, Dr. Wood, I thought you were Black!’ And then she brightened up. ‘That’s all right,’ she said. ‘I'll get you on the radio.’ (laughs)

 

Peter H. Wood, here exploring chimney remains, is revising his book Strange New Land: Africans in Colonial America, which will be published in an expanded edition this year.

So, that just illustrates, if I’d been Black, it would have been even better, but at that point, anything was grist for the mill, especially if it was opening up new territory in American history.

Question: That actually raises a question: Did you face any criticism as a white author writing about Black history, like author William Styron did?

Wood: That was the controversy about William Styron’s 1967 book, The Confessions of Nat Turner. Styron was a white Connecticut author, and quite well-informed and well-intended. He had been raised in Virginia himself, so he’d grown up with versions of this story.

He was not a historian. Still, he wanted to try to write about Nat Turner’s rebellion from Turner’s perspective. So, he had the freedom of a novelist, of trying to put himself inside Nat Turner’s head. That effort was troublesome to a lot of folks.

It bothered some Black folks because it was a white author trying to do that and showing a complicated version of things. It was also upsetting to some white folks. If they knew about Nat Turner at all, it was that he was some crazy madman who killed people, so the idea that you should try to get inside his head, that was upsetting to them.

But, in answer to your question, I was lucky in that … the critique that white people shouldn’t do Black history had not really taken hold. At that time (1974), very little was being written about African Americans in Colonial times … and so there was a desire for anything that could shine some light on the subject.

Question: Why do you think Black Majority has maintained its staying power over the years? And what changes were made for the 50th-anniversary edition that W. W. Norton published?

Wood: As I’ve said, it came along at the right time. Along with other works, it opened up a whole new area, and so early African American history is now a very active field.

When I did the revisions for this 50th-anniversary edition, I didn’t change it drastically, because it is a product of the early 1970s, of 50 years ago. I think the points I made then have held up pretty well. That’s why I’d say it has been influential in the academic community, but for the general public, not so much.

Question: Why do you think that is?

Wood: It’s very hard to change the mainstream narrative, especially in regard to our childhood education about early American history. From elementary school on, we hear about Jamestown and about the Puritans; we learn that colonists grew tobacco in Virginia, but almost nothing beyond that. …

I think that’s part of our failing over the last 50 years. The idea of having a national story that everyone can agree upon has fallen apart, and I wish we could knit it back together. It may be too little, too late. But if we can ever manage to knit it back together in a more thorough, honest way, African Americans in Colonial times will be one of the early chapters.

Twenty years ago, I worked on a very successful U.S. history textbook called Created Equal, where I wrote the first six chapters. Even then, our team was trying to tie all of American history together in a new and inclusive way—one that everyone could understand and share and discuss. … I hope that book, and Black Majority, are more relevant than ever.


Top image: Remnants of rice fields along the Combahee River in South Carolina. (Photo: David Soliday/National Museum of African American History and Culture)