Several groups of reptiles persisted in Jurassic Africa even as volcanism ruined their habitat
In southern Africa, dinosaurs and synapsids, a group of animals that includes mammals and their closest fossil relatives, survived in a “land of fire” at the start of an Early Jurassic mass extinction, according to a study published January 29, 2020 in the open-access journal PLOS ONE by Emese M. Bordy of the University of Cape Town and colleagues.
The Karoo Basin of southern Africa is well-known for its massive deposits of igneous rocks left behind by extensive basaltic lava flows during the Early Jurassic. At this time, intense volcanic activity is thought to have had dramatic impacts on the local environment and global atmosphere, coincident with a worldwide mass extinction recorded in the fossil record. The fossils of the Karoo Basin thus have a lot to tell about how ecosystems responded to these environmental stresses.
In this study, Bordy and colleagues describe and identify footprints preserved in a sandstone layer deposited between lava flows, dated to 183 million years ago. They report five trackways containing a total of 25 footprints, representing three types of animals: 1) potentially small synapsids; 2) large, bipedal, likely carnivorous dinosaurs; and 3) small, quadrupedal, likely herbivorous dinosaurs represented by a new ichnospecies (trace fossils like footprints receive their own taxonomic designations, known as ichnospecies).
These fossils represent some of the very last animals known to have inhabited the main Karoo Basin before it was overwhelmed by lava. Since the sandstone preserving these footprints was deposited between lava flows, this indicates that a variety of animals survived in the area even after volcanic activity had begun and the region was transformed into a “land of fire.” The authors suggest that further research to uncover more fossils and refine the dating of local rock layers has the potential to provide invaluable data on how local ecosystems responded to intense environmental stress at the onset of a global mass extinction.
Bordy adds: “The fossil footprints were discovered within a thick pile of ancient basaltic lava flows that are ~183 million years old. The fossil tracks tell a story from our deep past on how continental ecosystems could co-exist with truly giant volcanic events that can only be studied from the geological record, because they have no modern equivalents, although they could occur again in Earth’s future.”
UNLV history professor Elizabeth Nelson separates facts about the effects of marketing, consumerism, and social media on the holiday’s evolution from fiction about love’s golden age.
Pets, spouses, co-workers, friends, classmates: They’re all in line to be on the receiving end of another record year for Valentine’s Day spending, says a new survey by the National Retail Federation.
But as Americans strive to return to the good old days of romance, one UNLV history professor says they never actually existed.
“People love the idea that there were these wonderful eras before our own time when people celebrated Valentine’s Day in the most authentic way,” says Elizabeth Nelson, a 19th-century pop culture expert, who began researching Valentine’s Day three decades ago and literally wrote the book on marketing the holiday. “But there was always this long and complicated history about Valentine’s Day and people actually thought that it was too commercial and insincere from the very beginning.”
We sat down with Nelson to get a handle on the history behind the holiday and the ways advertising, consumerism, and social media have changed the way we celebrate.
Who is St. Valentine and why does he have a holiday?
Popular lore says that in the 5th century A.D., there was a St. Valentine who was imprisoned for some transgression. The myth says the jailer’s daughter took pity on him, brought him food, and tried to save him. The incarcerated man sent her a note of thanks, signing it: “From your Valentine.”
The story falls apart on multiple historical levels — it seems unlikely that the jailer’s daughter would have been literate or that Valentine could’ve gotten paper and pen in a jail cell. But historians argue that — like Christmas, Easter, and many other modern holidays — Christians in the past tended to link saint holidays with pagan celebrations to help solidify conversion because people didn’t want to give up the ways in which they lived their lives. Blending these holidays allowed revelers to keep observing rituals from centuries ago. Over time, the original intent was forgotten.
In this case, there was also a Roman festival called Lupercalia, celebrating fertility, that might have influenced the celebration of Valentine’s Day. While we now celebrate Valentine’s Day in February, in the Middle Ages, Chaucer, in “The Canterbury Tales,” describes the holiday as occurring in May with imagery of springtime, birds, and budding flowers — which makes sense if linked to a Roman holiday centered on fertility.
What’s more, there are several saints throughout history named Valentine. But none of them are patron saints of love.
Who celebrates Valentine’s Day and why?
Valentine’s Day is mostly only celebrated in the United States and Britain. Before the 18th century, it was about exchanging gifts — gloves and spoons were traditional — and being someone’s valentine for a whole year. It sometimes served as a precursor to betrothal.
There are some interesting stories circulating about why it’s not as popular overseas.
Legend has it that in France, women who were rejected by their desired valentine would burn those men in effigy in a bonfire, causing a riotous ruckus — so allegedly, the government outlawed Valentine’s Day in the early 19th century.
In England, there was a practice called “valentining,” where kids would go door to door asking for treats, similar to Halloween. However, over time, these public celebrations got out of hand and sometimes devolved into violence and mob action. So the proper, genteel middle class opted instead to change the focus from human interaction to the less dangerous exchange of cards.
When did the commercialization of Valentine’s Day begin?
In the 1840s, Valentine’s Day took off in the U.S. as increased paper production and printing presses lowered costs and expanded the supply of pre-printed cards that people could exchange, featuring fancy lace, pictures, and other decorations. And sometimes celebrants copied pre-written poems out of books called “valentine writers” that featured bawdy sexual innuendo. My favorite metaphor: grating someone’s nutmeg.
One of the earliest American valentine businesses was run by Esther Howland in Worcester, Mass. She was the daughter of an insurance agent who ran a stationery store. She asked her father to import fancy paper, lace, and other decor from England to make valentines to sell. She employed female friends of the family, and asked her brothers to share sample valentines during their work trips as traveling salesmen. Esther received many orders and created a successful business during the 1850s and 1860s. Her story is quite amazing because we don’t think of women as running businesses in the 19th century.
Hallmark was founded in 1910, and technology made it possible to produce valentines in color and with various textures even more inexpensively than before. So, it’s really at the beginning of the 20th century that Valentine’s Day becomes part of a general movement to turn holidays into opportunities for selling things, from candy to flowers to magazine advertisements. Valentine’s Day began to center more on children than before. People began exchanging valentines in school. Hallmark played a big role in marketing it to elementary students, shifting the focus to the competitive collecting of the most valentines rather than a single sincere one.
Has romance always been at the center of Valentine’s Day?
Initially, it was about having one valentine throughout the year and possibly becoming betrothed. But it evolved in the 19th century, sparking questions about the sincerity of exchanging pre-printed cards and the sanity of spending exorbitant amounts of money on them.
Valentine’s Day and the exchange of valentines were a way that people in the emerging middle class in the 19th century negotiated the complicated relationship between romantic love and the economic reality of marriage. You could marry someone for love, but you still had to marry someone who could support you, because most middle-class women didn’t work. So, it was dangerous just to fall in love with people without knowing anything about them. The celebration of Valentine’s Day became a way for people to test the uncomfortable juxtaposition of what love and marriage should be and the reality of what was actually possible. So, not so different from today.
How has social media shifted the celebration of Valentine’s Day?
One of the things that’s nice about Valentine’s Day today is that there are a variety of ways to celebrate. There are Galentine’s and Singles Awareness Day celebrations, you can give your pet a gift, or you can even celebrate alone. You don’t have to wait for the candy or the flowers to come. People still do those things, but there’s less pressure to conform to a public declaration or celebration of it. And that’s the thing about Valentine’s Day: It’s about what other people see you doing or getting. How do you perform the idea of love rather than actually express or engage in the act of love? It’s the representation of the commercial items — getting flowers delivered to your office, going to a fancy restaurant, or getting an expensive piece of jewelry. It’s what other people think of your couplehood rather than what you think about it.
It is likely that Facebook and other social media have made Valentine’s Day more viral and more toxic, but the framework was already there. It’s not so much that social media changed the scrutiny that was already at the core of Valentine’s Day; it just created a whole new possibility for performing the act of Valentine’s Day. Because social media sites are all about performing your imagined best self, the level of scrutiny on how you celebrate Valentine’s Day or what you got for Valentine’s Day is ratcheted up exponentially on Facebook. It is not just about the people in your office or in your neighborhood; everybody in your world sees whether your sweetie did right by you or not, or vice versa.
Research on the teeth of fossilized dinosaur embryos indicates that the eggs of non-avian dinosaurs took a long time to hatch–between about three and six months. The study, led by scientists at Florida State University, the American Museum of Natural History, and the University of Calgary, was published today in the Proceedings of the National Academy of Sciences and finds that, contrary to previous assumptions, dinosaur incubation was more similar to that of typical reptiles than of birds. The work suggests that prolonged incubation may have affected dinosaurs’ ability to compete with faster-reproducing populations of birds, reptiles, and mammals following the mass extinction event that occurred 65 million years ago.
“We know very little about dinosaur embryology, yet it relates to so many aspects of development, life history, and evolution,” said study co-author Mark Norell, Macaulay Curator of Paleontology at the American Museum of Natural History. “But with the help of advanced tools like CT scanners and high-resolution microscopy, we’re making discoveries that we couldn’t have imagined 20 years ago. This work is a great example of how new technology and new ideas can be brought to old problems.”
Because birds are living dinosaurs, scientists have long assumed that the duration of dinosaur incubation was similar to birds, whose eggs hatch within 11 to 85 days. The research team tested this theory by looking at the fossilized teeth of two extremely well-preserved ornithischian dinosaur embryos at opposite ends of the size spectrum: Protoceratops–a pig-sized dinosaur found by Norell and colleagues in the Mongolian Gobi Desert, whose eggs were quite small at 194 grams, or a little less than half of a pound–and Hypacrosaurus, a very large duck-billed dinosaur found in Alberta, Canada, with eggs weighing more than 4 kilograms, or nearly 9 pounds. First, the researchers scanned the embryonic jaws of the two dinosaurs with computed tomography (CT) at the Museum’s Microscopy and Imaging Facility to visualize the forming dentitions. Then they used an advanced microscope to look for and analyze the pattern of “von Ebner” lines–growth lines that are present in the teeth of all animals, humans included. This study marks the first time that these growth lines have been identified in dinosaur embryos.
“These are the lines that are laid down when any animal’s teeth develop,” said lead author and Florida State University professor Gregory Erickson. “They’re kind of like tree rings, but they’re put down daily. And so we could literally count them to see how long each dinosaur had been developing.”
Using this method, the scientists determined that the Protoceratops embryos were about three months old when they died and the Hypacrosaurus embryos were about six months old. This places non-avian dinosaur incubation more in line with that of their reptilian cousins, whose eggs typically take twice as long as bird eggs to hatch–weeks to many months. The work implies that birds likely evolved more rapid incubation rates after they branched off from the rest of the dinosaurs. The authors note that the results might be quite different if they were able to analyze a more “bird-like” dinosaur, like Velociraptor. But unfortunately, very few fossilized dinosaur embryos have been discovered.
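The age arithmetic Erickson describes is simple enough to sketch. The line counts below are illustrative, chosen to land near the roughly three- and six-month figures reported; the only assumption is the one stated in the study, that one von Ebner line is deposited per day.

```python
# A minimal sketch of the age estimate described above, assuming one
# von Ebner growth line is deposited per day (line counts are illustrative).

def incubation_months(von_ebner_lines: int, days_per_month: float = 30.44) -> float:
    """Convert a count of daily growth lines into an approximate age in months."""
    return von_ebner_lines / days_per_month

print(round(incubation_months(83), 1))   # a Protoceratops-like count: ~3 months
print(round(incubation_months(171), 1))  # a Hypacrosaurus-like count: ~6 months
```

Counting the lines thus reads out developmental time directly, which is why the method works even on animals with no living descendants to calibrate against.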
“A lot is known about growth in dinosaurs in their juvenile to adult years,” said co-author Darla Zelenitsky, from the University of Calgary. “Time within the egg is a crucial part of development with major biological ramifications, but is poorly understood because dinosaur embryos are rare.”
The study also has implications for dinosaur extinction. Prolonged incubation exposed non-avian dinosaur eggs and attending parents to predators, starvation, and environmental disruptions such as flooding. In addition, slower embryonic development might have put them at a disadvantage compared to other animals that survived the Cretaceous-Paleogene extinction event.
Florida State University graduate student David Kay also is an author on this paper.
This work was funded, in part, by the U.S. National Science Foundation, grant # EAR 0959029, the Macaulay Family, and the Natural Sciences and Engineering Research Council of Canada, grant # 327513-09.
AMERICAN MUSEUM OF NATURAL HISTORY (AMNH.ORG)
The American Museum of Natural History, founded in 1869, is one of the world’s preeminent scientific, educational, and cultural institutions. The Museum encompasses 45 permanent exhibition halls, including the Rose Center for Earth and Space and the Hayden Planetarium, as well as galleries for temporary exhibitions. It is home to the Theodore Roosevelt Memorial, New York State’s official memorial to its 33rd governor and the nation’s 26th president, and a tribute to Roosevelt’s enduring legacy of conservation. The Museum’s five active research divisions and three cross-disciplinary centers support approximately 200 scientists, whose work draws on a world-class permanent collection of more than 33 million specimens and artifacts, as well as specialized collections for frozen tissue and genomic and astrophysical data, and one of the largest natural history libraries in the world. Through its Richard Gilder Graduate School, it is the only American museum authorized to grant the Ph.D. degree and the Master of Arts in Teaching degree. Annual attendance has grown to approximately 5 million, and the Museum’s exhibitions and Space Shows can be seen in venues on five continents. The Museum’s website and collection of apps for mobile devices extend its collections, exhibitions, and educational programs to millions more beyond its walls. Visit amnh.org for more information.
Study puts the ‘Carib’ in ‘Caribbean,’ boosting credibility of Columbus’ cannibal claims
Christopher Columbus’ accounts of the Caribbean include harrowing descriptions of fierce raiders who abducted women and cannibalized men – stories long dismissed as myths.
But a new study suggests Columbus may have been telling the truth.
Using the equivalent of facial recognition technology, researchers analyzed the skulls of early Caribbean inhabitants, uncovering relationships between people groups and upending longstanding hypotheses about how the islands were first colonized.
One surprising finding was that the Caribs, marauders from South America and rumored cannibals, invaded Jamaica, Hispaniola and the Bahamas, overturning half a century of assumptions that they never made it farther north than Guadeloupe.
“I’ve spent years trying to prove Columbus wrong when he was right: There were Caribs in the northern Caribbean when he arrived,” said William Keegan, Florida Museum of Natural History curator of Caribbean archaeology. “We’re going to have to reinterpret everything we thought we knew.”
Columbus had recounted how peaceful Arawaks in modern-day Bahamas were terrorized by pillagers he mistakenly described as “Caniba,” the Asiatic subjects of the Grand Khan. His Spanish successors corrected the name to “Caribe” a few decades later, but the similar-sounding names led most archaeologists to chalk up the references to a mix-up: How could Caribs have been in the Bahamas when their closest outpost was nearly 1,000 miles to the south?
But skulls reveal the Carib presence in the Caribbean was far more prominent than previously thought, giving credence to Columbus’ claims.
Face to face with the Caribbean’s earliest inhabitants
Previous studies relied on artifacts such as tools and pottery to trace the geographical origin and movement of people through the Caribbean over time. Adding a biological component brings the region’s history into sharper focus, said Ann Ross, a professor of biological sciences at North Carolina State University and the study’s lead author.
Ross used 3D facial “landmarks,” such as the size of an eye socket or length of a nose, to analyze more than 100 skulls dating from about A.D. 800 to 1542. These landmarks can act as a genetic proxy for determining how closely people are related to one another.
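As a rough illustration of how landmark coordinates can quantify skull similarity, here is a minimal sketch with hypothetical 3D landmarks. Real geometric-morphometric studies first align specimens (for example, via Procrustes superimposition) to remove differences in size, position, and orientation before comparing shape; this sketch skips that step.

```python
import numpy as np

# Hypothetical 3D landmark coordinates for two skulls (one row per landmark,
# e.g. an eye-socket corner or nasal point; units are arbitrary).
skull_a = np.array([[0.0, 0.0, 0.0], [3.1, 0.2, 0.5], [1.5, 2.8, 0.4]])
skull_b = np.array([[0.0, 0.0, 0.0], [3.0, 0.1, 0.6], [1.4, 2.9, 0.3]])

# A crude morphological distance: the mean Euclidean distance between
# corresponding landmarks. Smaller values suggest more similar skulls.
distance = np.linalg.norm(skull_a - skull_b, axis=1).mean()
print(round(distance, 3))
```

Pairwise distances like this, computed across many specimens, are what allow related individuals to cluster together in the kind of analysis Ross describes.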
The analysis not only revealed three distinct Caribbean people groups, but also their migration routes, which was “really stunning,” Ross said.
Looking at ancient faces shows the Caribbean’s earliest settlers came from the Yucatan, moving into Cuba and the Northern Antilles, which supports a previous hypothesis based on similarities in stone tools. Arawak speakers from coastal Colombia and Venezuela migrated to Puerto Rico between 800 and 200 B.C., a journey also documented in pottery.
The earliest inhabitants of the Bahamas and Hispaniola, however, were not from Cuba as commonly thought, but from the Northwest Amazon – the Caribs. Around A.D. 800, they pushed north into Hispaniola and Jamaica and then into the Bahamas, where they were well established by the time Columbus arrived.
“I had been stumped for years because I didn’t have this Bahamian component,” Ross said. “Those remains were so key. This will change the perspective on the people and peopling of the Caribbean.”
For Keegan, the discovery lays to rest a puzzle that pestered him for years: why a type of pottery known as Meillacoid appears in Hispaniola by A.D. 800, Jamaica around 900 and the Bahamas around 1000.
“Why was this pottery so different from everything else we see? That had bothered me,” he said. “It makes sense that Meillacoid pottery is associated with the Carib expansion.”
The sudden appearance of Meillacoid pottery also corresponds with a general reshuffling of people in the Caribbean after a 1,000-year period of tranquility, further evidence that “Carib invaders were on the move,” Keegan said.
Raiders of the lost Arawaks
So, was there any substance to the tales of cannibalism?
Possibly, Keegan said.
Arawaks and Caribs were enemies, but they often lived side by side with occasional intermarriage before blood feuds erupted, he said.
“It’s almost a ‘Hatfields and McCoys’ kind of situation,” Keegan said. “Maybe there was some cannibalism involved. If you need to frighten your enemies, that’s a really good way to do it.”
Whether or not it was accurate, the European perception that Caribs were cannibals had a tremendous impact on the region’s history, he said. The Spanish monarchy initially insisted that indigenous people be paid for work and treated with respect, but reversed its position after receiving reports that they refused to convert to Christianity and ate human flesh.
“The crown said, ‘Well, if they’re going to behave that way, they can be enslaved,'” Keegan said. “All of a sudden, every native person in the entire Caribbean became a Carib as far as the colonists were concerned.”
Michael Pateman of the Turks and Caicos National Museum and Colleen Young of the University of Missouri also co-authored the study.
Without a doubt, Tyrannosaurus rex is the most famous dinosaur in the world. The 40-foot-long predator, with bone-crushing teeth inside a five-foot-long head, is the stuff of legend. Now, a look within the bones of two mid-sized, immature T. rex allows scientists to learn about the tyrant king’s terrible teens as well.
In the early 2000s, the fossil skeletons of two comparatively small T. rex were collected from Carter County, Montana, by Burpee Museum of Natural History in Rockford, Illinois. Nicknamed “Jane” and “Petey,” the tyrannosaurs would have been slightly taller than a draft horse and twice as long.
The team led by Holly Woodward, Ph.D., from Oklahoma State University Center for Health Sciences studied Jane and Petey to better understand T. rex life history.
The study “Growing up Tyrannosaurus rex: histology refutes pygmy ‘Nanotyrannus’ and supports ontogenetic niche partitioning in juvenile Tyrannosaurus” appears in the peer-reviewed journal Science Advances.
Co-authors include Jack Horner, presidential fellow at Chapman University; Nathan Myhrvold, founder and CEO of Intellectual Ventures; Katie Tremaine, graduate student at Montana State University; Scott Williams, paleontology lab and field specialist at Museum of the Rockies; and Lindsay Zanno, division head of paleontology at the North Carolina Museum of Natural Sciences. Supplemental histological work was conducted at the Diane Gabriel Histology Labs at Museum of the Rockies/Montana State University.
“Historically, many museums would collect the biggest, most impressive fossils of a dinosaur species for display and ignore the others,” said Woodward. “The problem is that those smaller fossils may be from younger animals. So, for a long while we’ve had large gaps in our understanding of how dinosaurs grew up, and T. rex is no exception.”
The smaller size of Jane and Petey is what makes them so incredibly important. Not only can scientists now study how the bones and proportions changed as T. rex matured, but they can also utilize paleohistology, the study of fossil bone microstructure, to learn about juvenile growth rates and ages. Woodward and her team removed thin slices from the leg bones of Jane and Petey and examined them at high magnification.
“To me, it’s always amazing to find that if you have something like a huge fossilized dinosaur bone, it’s fossilized on the microscopic level as well,” Woodward said. “And by comparing these fossilized microstructures to similar features found in modern bone, we know they provide clues to metabolism, growth rate, and age.”
The team determined that the small T. rex were growing as fast as modern-day warm-blooded animals such as mammals and birds. By counting the annual rings within the bone, much like counting tree rings, Woodward and her colleagues also found that Jane and Petey were teenage T. rex when they died: 13 and 15 years old, respectively.
There had been speculation that the two small skeletons weren’t T. rex at all, but a smaller pygmy relative Nanotyrannus. Study of the bones using histology led the researchers to the conclusion that the skeletons were juvenile T. rex and not a new pygmy species.
Instead, Woodward points out, because it took T. rex up to twenty years to reach adult size, the tyrant king probably underwent drastic changes as it matured. Juveniles such as Jane and Petey were fast and fleet-footed, with knife-like teeth for cutting, whereas adults were lumbering bone crushers. Not only that, but Woodward’s team discovered that a growing T. rex could do a neat trick: if its food source was scarce during a particular year, it just didn’t grow as much. And if food was plentiful, it grew a lot.
“The spacing between annual growth rings records how much an individual grows from one year to the next. The spacing between the rings within Jane, Petey, and even older individuals is inconsistent – some years the spacing is close together, and other years it’s spread apart,” said Woodward.
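The reading Woodward describes, wide gaps for good years and narrow gaps for lean ones, amounts to simple differencing of ring positions. The ring radii below are hypothetical, used only to show the arithmetic.

```python
# Hypothetical annual growth-ring radii (mm from the center of the bone
# cross-section), one entry per year of life.
ring_radii_mm = [4.0, 9.5, 12.0, 18.5, 19.0, 26.0]

# Each year's growth is the spacing between consecutive rings:
# large differences mark plentiful years, small ones mark lean years.
yearly_growth = [b - a for a, b in zip(ring_radii_mm, ring_radii_mm[1:])]
print(yearly_growth)  # [5.5, 2.5, 6.5, 0.5, 7.0]
```

In this made-up series, the fourth year (0.5 mm of growth) would read as a lean year and the fifth (7.0 mm) as a plentiful one.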
The research by Woodward and her team writes a new chapter in the early years of the world’s most famous dinosaur, providing evidence that it assumed the crown of tyrant king long before it reached adult size.
About Oklahoma State University Center for Health Sciences
Oklahoma State University Center for Health Sciences educates osteopathic physicians, scientists, allied health professionals and health care administrators for Oklahoma with an emphasis on serving rural and underserved Oklahoma. OSU-CHS offers graduate and professional degrees with over 1,000 students enrolled in academic programs in the College of Osteopathic Medicine, the School of Allied Health, the School of Health Care Administration, the School of Biomedical Sciences, and the School of Forensic Sciences. OSU Medicine operates a network of clinics in the Tulsa area offering a multitude of specialty services including addiction medicine, cardiology, family medicine, internal medicine, pediatrics, psychiatry and women’s health. Learn more at https://health.okstate.edu.
In the aftermath of Qasem Soleimani’s killing, President Trump on Twitter threatened to attack 52 Iranian sites that are important to “the Iranian culture,” a threat that has drawn criticism and condemnation as “cultural cleansing” and an action in violation of international law.
Seema Golestaneh, professor of near Eastern studies at Cornell University, studies the anthropology of Islam and culture of Iran. She says threatening to attack cultural sites shows a lack of understanding of the Iranian peoples’ day-to-day lives.
“The threat to attack Iranian cultural sites is akin to threatening to bomb Notre Dame or the Sistine Chapel. And to make such claims so cavalierly, without any regard for the deep emotional ties that people have with these sites, seems especially cruel.
“Some of these sites are not just tourist destinations but are still in heavy use and are woven into the fabric of their respective cities. For example, the bazaar of the Imam’s Square and the Khaju Bridge of Isfahan, which were built nearly four hundred years ago, are used by hundreds of thousands of people every day.
“The term ‘cultural heritage sites’ in a way seems to fall short to describe these places and things and their role in the popular imagination. They are ways of life, ways of understanding the self. The United States is a young country and perhaps it is hard to understand this deep affection. But outside of the loss of life, for Iranians, nothing could be more painful.”
Cultural heritage expert available to discuss threats against Iranian cultural sites
Ted Grevstad-Nordbrock, assistant professor of community and regional planning at Iowa State University, is available to comment on threats against Iranian cultural sites. He is an expert on cultural heritage and historic preservation.
Grevstad-Nordbrock has drawn parallels between the threats against Iranian cultural sites and the “Baedeker Raids” carried out by Nazi Germany in 1942. During these raids, German war planners used popular European travel guides (“Baedekers”) to identify cultural sites in UK cities for aerial bombardment. This was intended to shock and demoralize the British population. It was also a reprisal for the British bombing of historic cities in Germany’s north.
Grevstad-Nordbrock has conducted research on historic sites during times of armed conflict, in particular exploring how Allied governments protected historic sites in Europe from destruction during World War II, focusing on immoveable cultural heritage (historic buildings, archaeological sites) as opposed to moveable art objects (paintings, sculptures, etc.).
He has a Ph.D. in geography from Michigan State University, a master’s degree in historic preservation planning from Cornell University, a master’s degree in art history from the University of Wisconsin-Madison and a bachelor’s degree in psychology from UW-Madison. He had 25 years of professional experience in historic preservation before coming to Iowa State.
For interviews, please contact Chelsea Davis at firstname.lastname@example.org or 515-294-4778.
Erica Armstrong Dunbar enlists students’ help to tell untold stories of the “bravest woman that ever lived”
In an iconic image, Harriet Tubman stands calmly wrapped in a shawl. But the picture that most people associate with Tubman doesn’t scratch the surface of the strength and determination it took to lead 60 to 70 enslaved people to safety through the Underground Railroad.
With the release of the film Harriet, Rutgers scholar Erica Armstrong Dunbar said it’s a good time to shed light on Tubman’s life not only as the famed Underground Railroad conductor, but as a sister, a daughter, a wife, a mother and a woman.
“What we know about Tubman’s life from history books really only consists of 10 years of her life, and I wanted to present her in a way that is fresh,” said Dunbar, the Charles and Mary Beard Professor of History at Rutgers University–New Brunswick and the author of She Came to Slay: The Life and Times of Harriet Tubman. “The point was to be accessible and have it be modern and contemporary, so it connects to readers across generations to make a story that is over 100 years old feel relevant today.”
Dunbar began with Tubman’s grandmother, a woman named Modesty, who endured the Middle Passage and arrived in colonial Maryland in the late eighteenth century. Tubman’s parents, Harriet “Rit” Green and Ben Ross, were enslaved by different families on Maryland’s Eastern Shore. Tubman was born with the name Araminta Ross, and her family was separated, as many others were during the slave trade, with her three sisters sold to different plantation owners.
“I wanted to start at the very beginning and talk about the things we don’t often hear,” Dunbar said. “I explore her teenage years and her marriage to John Tubman, who actually left her for another woman once she escaped to Philadelphia. I discuss her adopted child Gertie through her second marriage to Nelson Davis, a man 20 years her junior. A decade later, Tubman led a military expedition during the Civil War and rescued close to 750 enslaved people. After the war ended, she continued to fight for 53 years as an activist for the elderly and women’s rights. It’s important that we see all these different sides, so we can begin to look at her as a whole person.”
To offer this perspective, Dunbar enlisted research associates from various universities, including Rutgers–New Brunswick’s Ashley Council, a second-year graduate student focused on African-American history. Council spent months digging through the Freedmen’s Bureau Archives, 19th century newspapers, census data, civil war letters, black abolitionist papers, speeches and many other historical sources. She faced the complex task of uncovering slave history, much of it told through the lens of white supremacy.
“I researched portions of Tubman’s history like the Combahee River Raid, and I started constructing narratives that challenged me to write in a more accessible way, to touch on the humanity of the reader,” said Council, who plans to become a professor of African-American history. “There is not a lot of archival material about Tubman and the history of the enslaved. Archives weren’t made to make the enslaved visible. So, I had to take history based in white supremacy and find the narratives that were hidden beneath. It isn’t something our discipline always allows, and this was an amazing opportunity to be a part of a new way of telling her story.”
Dunbar attended pre-screenings of the film Harriet and invited her graduate student associates to join her. Students also discussed Tubman’s life in conjunction with the film’s release during live podcasts and Twitter chats. While there are some differences between her book and the film, there were moments that shed light on the militant side of Tubman, which Dunbar was happy to see on screen.
“She was a fierce black woman — and certainly one of the bravest women that ever lived,” Dunbar said. “She made 13 trips along the Underground Railroad, traveling more than 100 miles, and never lost one single person. She reminds us of the importance of the strength of leadership in the darkest of times and of standing up against social injustice. Her story offers hope and encouragement in battling the issues happening today.”
Dunbar said it was important to involve students in her research.
“I want students to have the opportunity to work in the archives and uncover the fragments of history that are untold. It helps them see possibilities in the field of history and the prominence of the Department of History at Rutgers. We are the number one program in African American history in the nation for a reason.”
Broadcast interviews: Rutgers University–New Brunswick has broadcast-quality TV and radio studios available for remote live or taped interviews with Rutgers experts. For more information, contact Cynthia Medina at email@example.com
ABOUT RUTGERS—NEW BRUNSWICK
Rutgers University–New Brunswick is where Rutgers, the State University of New Jersey, began more than 250 years ago. Ranked among the world’s top 60 universities, Rutgers’s flagship university is a leading public research institution and a member of the prestigious Association of American Universities. It is home to internationally acclaimed faculty and has 12 degree-granting schools and a Division I Athletics program. It is the Big Ten Conference’s most diverse university. Through its community of teachers, scholars, artists, scientists, and healers, Rutgers is equipped as never before to transform lives.
Experimental cultivation of seed crops lost to history reveals much higher yields than expected
Natalie Mueller grew and calculated yield estimates for two annual plants that were cultivated in eastern North America for thousands of years — and then abandoned.
Make some room in the garden, you storied three sisters: the winter squash, climbing beans and the vegetable we know as corn. Grown together, newly examined “lost crops” could have produced enough seed to feed as many indigenous people as traditionally grown maize, according to new research from Washington University in St. Louis.
But there are no written or oral histories to describe these lost crops. Their domesticated forms are thought to be extinct.
Writing in the Journal of Ethnobiology, Natalie Mueller, assistant professor of archaeology in Arts & Sciences, describes how she painstakingly grew and calculated yield estimates for two annual plants that were cultivated in eastern North America for thousands of years — and then abandoned.
Growing goosefoot (Chenopodium sp.) and erect knotweed (Polygonum erectum) together is more productive than growing either one alone, Mueller discovered. Planted in tandem, along with the other known lost crops, they could have fed thousands.
Archaeologists found the first evidence of the lost crops in rock shelters in Kentucky and Arkansas in the 1930s. Seed caches and dried leaves were their only clues. Over the past 25 years, pioneering research by Gayle Fritz, professor emerita of archaeology at Washington University, helped to establish the fact that a previously unknown crop complex had supported local societies for millennia before maize — a.k.a. corn — was adopted as a staple crop.
But how, exactly, to grow them?
The lost crops include a small but diverse group of native grasses, seed plants, squashes and sunflowers — of which only the squashes and sunflowers are still cultivated. For the rest, there is plenty of evidence that the lost crops were purposefully tended — not just harvested from free-living stands in the wild — but there are no instructions left.
“There are many Native American practitioners of ethnobotanical knowledge: farmers and people who know about medicinal plants, and people who know about wild foods. Their knowledge is really important,” Mueller said. “But as far as we know, there aren’t any people who hold knowledge about the lost crops and how they were grown.
“It’s possible that there are communities or individuals who have knowledge about these plants, and it just isn’t published or known by the academic community,” she said. “But the way that I look at it, we can’t talk to the people who grew these crops.
“So our group of people who are working with the living plants is trying to participate in the same kind of ecosystem that they participated in — and trying to reconstruct their experience that way.”
That means no greenhouse, no pesticides and no special fertilizers.
“You have not just the plants but also everything else that comes along with them, like the bugs that are pollinating them and the pests that are eating them. The diseases that affect them. The animals that they attract, and the seed dispersers,” Mueller said. “There are all of these different kinds of ecological elements to the system, and we can interact with all of them.”
Her new paper reported on two experiments designed to investigate germination requirements and yields for the lost crops.
Mueller discovered that a polyculture of goosefoot and erect knotweed is more productive than either grown separately as a monoculture. Grown together, the two plants have higher yields than global averages for closely related domesticated crops (think: quinoa and buckwheat), and they are within the range of those for traditionally grown maize.
“The main reason that I’m really interested in yield is because there’s a debate within archaeology about why these plants were abandoned,” Mueller said. “We haven’t had a lot of evidence about it one way or the other. But a lot of people have just kind of assumed that maize would be a lot more productive because we grow maize now, and it’s known to be one of the most productive crops in the world per unit area.”
Mueller wanted to quantify yield in this experiment so that she could directly compare yield for these plants to maize for the first time.
But it didn’t work out perfectly. She was only able to obtain yield estimates for two of the five lost crops that she tried to grow — but not for the plants known as maygrass, little barley and sumpweed.
Reporting on the partial batch was still important to her.
“My colleagues and I, we’re motivated from the standpoint of wanting to see more diverse agricultural systems, wanting to see the knowledge and management of indigenous people recognized and curiosity about what the ecosystems of North America were like before we had this industrial agricultural system,” Mueller said.