Ice is essential for many of us during the long, hot summer. But just how did people in the 19th century enjoy cool drinks in an age before electricity and freezers? Here, Colette Lefebvre-Davis tells us about ice harvesting…

 

As winter creeps in, the ponds begin to freeze, and winter sports enthusiasts dust off their ice skates and ice fishing gear. Suddenly it is time to play, and ice makes a great place for skating. In the past, though, it was time to harvest the ponds and lakes. Ice harvesting is no longer practiced in the modern world: ice can be made with electric refrigerators, and food is easily preserved with the cold. But not so long ago ice was a cash crop. Prominent men and women craved it in the summer months, and once a drink could be enjoyed cool rather than tepid, it became a necessity for those who could afford it - and of course it was only the wealthy who could afford to buy or keep ice.

Images of the ice trade around New York City. From an 1884 edition of Harper's Weekly.


American Forefathers had to have Cold Butter!

Thomas Jefferson had a problem with his self-designed ice house around 1806 - namely, keeping it dry and filled: "About a third is lost to melting."[1] It was therefore imperative to catch the water that collected in the ice house. Jefferson filled the ice house with snow to insulate the ice and keep it from melting, and still men were employed to empty it. Jefferson wrote to his overseer to have ice harvested from the nearby Rivanna River. Being who he was - a philanthropist and knowledge seeker - Jefferson no doubt waited patiently for his experiment to unfold. He wasn't around when the first ice house on his property was built; rather, he monitored the progress from Washington in 1803. Yet letters passed constantly between him and the people at his estate, because he knew that the harvest of ice would allow him to have cold drinks in the summer as well as cool desserts. Harvesting ice was cold, heavy, backbreaking work - but it was worth it, and not only for the famous American president and author of the Declaration of Independence.

Jefferson also built an ice house at the President's house in Philadelphia. It has been excavated in recent years and is on display. For Jefferson there was no better way to preserve his butter and meat. Ice was for those with money at that time: during heat waves, while others sipped tepid water, those who could afford it drank cool drinks.

Even Benjamin Franklin is credited with cooling off the palates of the delegates to the Constitutional Convention one idle evening. He secured cream from a neighbor's cow and used ice from his storehouse. There were satisfied palates, and certainly cooler tempers.

 

The Hazards of Ice Harvests

Ice was a cash crop - a winter crop. In New England, as temperatures plunged and the ice grew thicker, a harvest was soon pending. Men of the early twentieth century and before slipped on their shoes, tightened their belts and prepared their horses for a harvest at a local pond. In their inventory were the necessary utensils for harvesting: an ice gaff, ice pick, ice tongs and ice saw. The work was hard, laborious, cold, dangerous and rewarding. People were excited to go to work and to come together as a community of workers; despite the cold conditions, the accidents and the frostbite, the harvest was a much anticipated event.

Now ice harvest festivals remain as a fun reminder of the past. Communities gather over ice-covered ponds and snow banks to watch local historians demonstrate 18th and 19th century harvesting techniques. Some audience members are invited to participate, carrying large chunks of ice to a sled, which a horse then pulls to an ice house - if one is available.

But it was Frederic Tudor who made ice harvesting into a lucrative business when he decided to make money from old-fashioned New England winters. Tudor, born in 1783 in Boston, Massachusetts, would become known as the "Ice King". Boston in 1783 was just recovering from the American Revolution. For most it was a depressed and poor place. The population that had once thrived was small - around 10,000 in 1780 - as most had left during the revolution to escape the ravages of war and military occupation, and many who remained were struggling to make money. Tudor was by no means poor himself though; in fact he had the opportunity to go to Harvard. It wasn't his destiny; instead he and his brother hunted, fished, practiced courting rituals, and learned the life of the privileged. It was a passing comment at a summer picnic that drove him to think of the family pond as a way to make money. It would change not only the Caribbean, with ice being shipped from Boston to Martinique, but also the United States.

Tudor decided that hot climates like Martinique were the best place to start, so he sent his brother out to forge the path for their soon-to-be-booming ice trade. Yes, he seemed crazy: if anybody had asked people in Boston, they would have said it was preposterous to ship ice safely to warmer climates and then store it once there.

But ice harvesting became popular, and with a few tweaks in how ice was shipped and preserved, people began to ask for it. Competition sprouted up along the rivers of Maine, and other ice companies emerged as demand grew.

 

How to Harvest Ice

Step 1: First scrape the snow off the ice. The ice should be six to thirty inches thick (though for transport it needs to be at least eight inches).

Step 2: Mark out a grid on the ice, bringing horses along to help with the measuring.

Step 3: Cut through the grooves of the grid until the blocks break off and float down the cleared channel to the chute, where they are hauled up and into the ice house.

Step 4: Use breaking-off bars and one-handed crosscut saws on the ice, which is then floated or poled down like a raft to the ice house.

Step 5: Move each block up the chute with hooks to various levels as the ice house fills, the layers of ice separated and surrounded by layers of sawdust, supplied by lumber mills, as an insulator.

 

Ice Created American Cuisine

Ice harvesting changed the way in which Americans ate. Soon after Mr. Tudor suggested putting ice in drinks, it became more and more necessary to have it. Newspapers of the time reported on whether ice harvests were plentiful or scarce. When they were scarce, men would be commissioned to voyage to the Arctic and chip pieces off huge icebergs to satisfy the need back home.

Ice was an easier way to keep meat and dairy products longer, and it sure beat the time it took to preserve food by canning or salting. The flavors were reportedly fresher, and that was all the public needed to know. While the ice business boomed, so too did the efforts of inventors who strove to make ice artificially.

By the 1920s, consumers were purchasing ice boxes lined with zinc or lead to preserve their foods. There were magical, icy cold drinks, ice box cookies, cakes, and pies. The iceman soon became a staple figure in most American cities and towns. He would drive in on a horse-drawn ice wagon, unload a nicely squared block with ice hooks, haul it into a person's home and lift it into the ice box. Ice boxes - or cold closets, as they were also known - were made as pieces of furniture, admired and handsome. They were built with trays at the bottom to catch the water, and once the ice melted the iceman soon came again.

Leftovers were preserved longer, most likely to the chagrin of the children in a home, and around the same time inventors were working on creating American frozen food meals. Refrigeration techniques had been utilized by breweries and then spread to Chicago's meat packing industry, but these relied on refrigerants like sulfur dioxide and methyl chloride, which harmed the people exposed to them. That type of refrigeration was not going to be placed in homes. As early as 1884 it was reported that almost every home but the poorest had an ice box, and it became normal for homes to post a sign in the window whenever more ice was needed. Frozen meals, however, remained mushy, full of ice shards, and not very appetizing until the 1930s. Regardless of the early pitfalls of frozen foods, there was still a lot to benefit from in having a home ice box.

 

Today

For now, the last remnants of ice harvesting are museum exhibits and small groups of people bent on living off-grid, sustainable lives. The rest of the world relies on refrigeration for ice. Americans, who scoffed at the initial idea of an ice trade, became hooked once they were shown its advantages. Frederic Tudor, the "Ice King", went bankrupt many times, but leaves an enduring legacy.

 

Did you enjoy the article? Let us know below if so!

 

[1] Boyd, Julian P., Charles T. Cullen, John Catanzariti, Barbara B. Oberg, et al, eds. The Papers of Thomas Jefferson (Princeton: Princeton University Press, 1950-), 11:439.

Abraham Lincoln is one of the most published figures in history. Hundreds of books have been written about his most important legacies for the United States. Despite all of that publishing, there are still many misconceptions about Abraham Lincoln that are taught today in schools and in popular culture. Some misconceptions are obviously inaccurate, while others can be intelligently argued in several directions. Here are the debates around ten of the most common 'misconceptions' about Abraham Lincoln, as shared by Scott M. Hopkins.

A close-up of the official White House portrait of President Abraham Lincoln.


Abraham Lincoln the Rail Splitter

Most students of history today are confused when they hear the term rail splitter. It had nothing to do with laying railroad tracks, but rather with building split-rail fences. The task was difficult in the 19th century without modern equipment, and it was immensely important in keeping livestock managed and property lines separated. Lincoln excelled at the task as a youth and retained the skill as an adult. The chore lent itself to Lincoln's peculiar physical attributes: tall and lanky, with skinny legs, robust arms, and mammoth hands.

What many people do not realize is that Lincoln actually hated his backwoods upbringing. Even as president he would outperform his own Union soldiers, many half his age, in exercises of physical endurance. Still, his preference was for being indoors and reading. In fact he often did extra manual labor in exchange for borrowed books - and then still more labor to pay for them when he accidentally destroyed the treasured texts. Even during the election, Republicans desperately sold the idea of Lincoln as the backwoods hero. City slickers loved the rail splitter image. Lincoln hated it.

 

Abraham Lincoln the Atheist

Like many Americans before and after him, Lincoln struggled with his religious faith. The frontier Baptist tradition he was raised in left him with many more questions than answers. His uncertainty should not be confused with atheism though. As a child Lincoln made great efforts to memorize passages of scripture and to recite them to his siblings and mother.

Following the demoralizing death of his mother Nancy Lincoln from milk sickness in 1818, Lincoln repeatedly denounced Jesus as the Christ in public settings. His doubt deepened when his first love, Ann Rutledge, died in 1835. He fell into a melancholy state many today might term depression; some even worried he might take his own life. William Herndon, a close friend and his earliest biographer, maintained that Lincoln was not a Christian, though many biographies have since surfaced challenging that. Towards the end of his life, however, Lincoln made several public statements praising a higher power. He even attempted to contact the spirit of his dead son, Willie, in séances.

 

Abraham Lincoln Started the Civil War

This topic is contentious in the southern half of the United States, where it is commonly understood that Lincoln was an aggressor against a peaceful separatist movement, the Confederate States of America. It does not help that the majority of battles took place in the South, that Reconstruction was a failure, and that much of the wealth of the South was invested in slavery, leaving businesses, industries, and families ruined at the end of the war. At the height of the Lost Cause movement, blaming Lincoln received serious respect among historians.

States' rights are usually cited as one of the main reasons that Lincoln can be blamed for starting what is still sometimes known as The War of Northern Aggression. Just as states had the right to vote for or against slavery, there is the belief that they could vote to leave the Union. Lincoln held that the secession of South Carolina in December of 1860 - before he had even taken over the White House - was firmly illegal, and he pledged not to start the war but to do everything to prepare for it. Imagine today if Donald Trump were elected president. Should states have the right to leave the Union because a majority of people disagree with the candidate who won?

Ironically, Abraham Lincoln advocated minimal punishment for the Confederacy at the conclusion of the war. The impact his plans to invest in infrastructure and create jobs in the South might have had cannot be measured, as he was assassinated before his ideas could become reality.

 

Abraham Lincoln: The Classic Rags to Riches Story

It is true that Lincoln was born in a log cabin in Kentucky (it's where we get Lincoln Logs from) and that his father barely earned enough to sustain his family, let alone save much money. He also spent much of his youth on the Indiana frontier in another log cabin.

As a teenager, though, he learned the importance of entrepreneurship after taking a raft to New Orleans and earning two fifty-cent silver coins from two merchants he assisted with the transport of their cargo. He applied himself to his work thereafter, managing a shop, delivering mail, surveying, and even leading a militia in the Black Hawk War of 1832. None of this gave him wealth, nor did his hard work at teaching himself law pay the dividends it would today. Wealth only came to Lincoln through the chance that his wife, Mary Todd Lincoln, came from a prominent Kentucky plantation family with money invested in land and slaves. Even so, Lincoln himself never lived lavishly.

 

Abraham Lincoln owned Slaves

According to historian and East Carolina University professor Gerald J. Prokopowicz in Did Lincoln Own Slaves? And Other Frequently Asked Questions about Abraham Lincoln, it is one of the most commonly asked questions about the sixteenth president, by all age groups, races, and creeds. It is puzzling to consider why anyone would assume he did. It is well documented that Lincoln supported an end to slavery, though always alongside an end to rebellion and a return to the Constitution. He never harbored any desire to own slaves, despite his wife's family background.

The case sometimes made to argue that Lincoln owned slaves is that during a White House function, short on labor, the Lincolns hired a group of ex-slaves to assist with serving guests. The history suggests that they may not have been ex-slaves as the White House thought, nor were they compensated financially, leading to a slavery connotation. But the hiring was handled by the White House staff and not by Lincoln, and the staff were unaware of the workers' situation.

Lincoln detested slavery and wanted its demise ever since he witnessed it first-hand on one of his teenage riverboat trips to New Orleans. He never owned a plantation that would have required slave labor, and preferred to do the majority of manual labor himself, even while at the White House.

 

Abraham Lincoln Would Vote for My Party Today

One of the most politically charged assertions comes when non-historians attempt to pigeonhole Lincoln into one of today's political parties. Yes, Abraham Lincoln was a Republican, right at the founding of the party, and was the first Republican President of the United States. Initially Lincoln was a Whig, though that party dissolved over the issue of slavery prior to the 1860 election. The newly formed Republican Party was made up almost exclusively of antislavery former Whigs, hell-bent on ending the spread of slavery into new states and territories.

Still, many of his efforts can be argued to be more in line with today's Democratic Party. Most notably, Lincoln introduced the country's first income tax, spent lavishly on infrastructure and public assistance, and promoted social justice initiatives like attempting to buy all slaves and relocate them to Liberia for freedom's sake. Interestingly, much of Lincoln's support in the election of 1860 came from regions that are today firmly Democrat, while the South, which did not even put him on the ballot, is firmly Republican.

Lincoln would not fit conveniently into either party today, as his political views changed as the Civil War changed. He made decisions that he believed were best for the country and its future. Although he filled his cabinet with Republicans, they were his fiercest competitors and differed from him in many ways, as evidenced in Doris Kearns Goodwin's essential Lincoln text, Team of Rivals. Lincoln viewed each competitor as the best at what they did and took advantage of their skills, regardless of personal relationship or social or political persuasion. Indeed, his style of politics is rarely seen today amongst careerists and party loyalists.

 

Abraham Lincoln the Abolitionist

We cannot take away from the magnitude of what Abraham Lincoln did to end the Civil War and end slavery. His disgust at slavery was apparent, and those closest to him knew he waited for each opportunity to rid the United States of it. But ambitious steps like the Emancipation Proclamation - which didn't actually free slaves - are not the same as the Abolitionist Movement. Abolitionists were on the front lines and often had no support or funding.

Founded in the Atlantic states, with roots in Evangelical churches, the Abolitionist Movement advocated an end to slavery and largely equal rights for black men and women in the United States. It was a tireless and often dangerous commitment. Not only was abolitionism unpopular before the war; helping slaves escape through the Underground Railroad was illegal, often amounting to business and political suicide. Well-off business owners, church preachers, and hardworking mothers risked everything - and often lost everything - hiding slaves and defending the equality of others. Many escaped slaves eventually made their way to Canada, where slavery was expressly illegal.

 

Abraham Lincoln Was a Racist

Those who understand Lincoln know that he was not an abolitionist and certainly tolerated slavery until he could remove it. Children of several generations learned of Lincoln in school as the Great Emancipator, a title largely dismissed as inaccurate today. In the heat of the Civil Rights Movement of the 1960s, some - most prominently the black journalist Lerone Bennett Jr. - labelled him nothing more than a typical racist of his time.

The claim set off a firestorm of controversy, with prominent historians arguing both sides. Besides the political and military reasons for delaying the end of slavery, Lincoln made a number of outright racist comments during the Lincoln-Douglas debates in rural Illinois - comments like: "I am not, nor ever have been in favor of bringing about in any way the social and political equality of the white and black races." He went on to deny support for intermarriage or for blacks holding public office, and suggested separation of the races was the best possible outcome.

Today most historians believe that Lincoln was a realist. Many of his decisions as President were motivated by aiding the Union war effort and reuniting the country as a whole. They see him shaped and molded by the Radical Republicans of his party. And they recognize that many of his efforts to end slavery and grant citizenship to blacks were revolutionary, and hardly required of a president.

 

Abraham Lincoln was Homosexual

One of the most important jobs for historians is to teach subsequent generations what life was like before them. The further removed we are from a time, the more difficult that becomes. In Lincoln's time, men shared beds with other grown men when it was practical to do so. Beds were expensive, and it was impractical for Lincoln to have rented his own room and his own bed in rural Illinois in the 1840s.

So when Joshua Speed offered Lincoln a room to rent, it was Joshua's own room that they shared. On the lawyers' circuit, the traveling band of attorneys, along with the judges, shared rooms and beds, because they could rarely find an establishment in backwoods Illinois equipped like a hotel is today. It took time for these communities to grow, and commerce was slow to adjust. The judge, fortunately for him, was so large and overweight that he had his own bed.

Besides the shared beds, those who believe Lincoln was homosexual cite the many 'love letters' exchanged between Lincoln and Speed as evidence of an erotic relationship. But in Lincoln's age it was not uncommon for two men to share such an intimate friendship without any basis in eroticism or sexual attraction. Writing to each other with eloquence, respect, and a desire to see a friend again was quite common, and expressing such feelings through letters was nothing to be ashamed of.

 

Abraham Lincoln’s Emancipation Proclamation Freed all Slaves

The accuracy with which Lincoln's achievements are taught in primary and secondary schools is haphazard, and this topic is perhaps the most misunderstood and poorly taught. The Emancipation Proclamation declared all slaves in the Confederacy to be free. It did not actually make them free - that would have required slave owners to acknowledge the proclamation as law. Border states such as Lincoln's home state of Kentucky were not required to follow the new Proclamation, nor were Union states and territories like Maryland or Washington, D.C.

The Proclamation set a precedent though. Lincoln took a gamble in making it public after months of drafts and consultation with his cabinet. He wanted to release it only on the back of high Union morale, and only when he could sell it both as the right thing to do and as a way to help win the war. It nullified the Fugitive Slave Act, which had required northerners to return runaway slaves to their masters, and it allowed the Union to prevent slaves from being used to assist the Confederacy with the supplies and chores vital to its war effort.

Even more important to teach is that not all of America rejoiced at the Emancipation Proclamation. Another egregious error taught in our schools is that the North was united in its opposition to slavery. After Lincoln's announcement, many families began to question what their husbands, sons, brothers, and fathers were fighting for. Certainly, many felt, they would not fight for African Americans, who experienced segregation and black codes - prohibitive living and working laws - in big cities across the North.

 

Scott M. Hopkins is a personal property appraiser focusing on numismatics. Do you have a rare coin at home that you believe might make you rich? Send Scott a message on his website. He will give you a thorough understanding of what to do with your rare coins.

Links


http://history.furman.edu/benson/fyw2010/graham/grahamcharactersource4.htm

http://blogs.chicagotribune.com/news_columnists_ezorn/2008/04/did-abraham-lin.html

https://www.tripadvisor.ca/ShowUserReviews-g60708-d108188-r263148782-Abraham_Lincoln_Birthplace_National_Historical_Park-Hodgenville_Kentucky.html

http://blogs.chicagotribune.com/news_columnists_ezorn/2008/02/lincoln.html

http://www.goodreads.com/book/show/2062906.Did_Lincoln_Own_Slaves_

http://www.smithsonianmag.com/history/what-can-collapse-whig-party-tell-us-about-todays-politics-180958729/?no-ist

http://quod.lib.umich.edu/j/jala/2629860.0002.104/--lincoln-and-the-problem-of-race-a-decade-of-interpretations?rgn=main;view=fulltext

The word progressive is used as a badge of honor by some and as a means of attack by others in modern politics. But to be progressive meant something different in earlier times. Here, Joseph Larsen tells us about a new book on the subject: Illiberal Reformers: Race, Eugenics and American Economics in the Progressive Era, by Thomas C. Leonard.

Bernie Sanders, a self-styled progressive and contender for the Democratic presidential nomination in 2016.  Pictured here in 2014.


The United States is in an election year with public confidence in government sinking - 2014 and 2015 Gallup polls show confidence in Congress at all-time lows.[1] Voters and pundits are engaged in bitter battles over the meaning of left and right, with the politically charged term "progressive" used and abused by voices across the political spectrum. Bernie Sanders and Hillary Clinton, the leading Democratic Party candidates, both wear it as a badge of honor. Yet the term is often used and little understood. During Barack Obama's first presidential term, one left-leaning history professor described a progressive as anyone "who believes that social problems have systemic causes and that governmental power can be used to solve those problems."

Progressivism has an ugly history, too. The side of the Progressive Era the American left would rather forget is dredged up by Princeton University Scholar Thomas C. Leonard in Illiberal Reformers: Race, Eugenics and American Economics in the Progressive Era. In a scathing criticism of the American Progressive Era Leonard emphasizes the movement’s rejection of racial equality, individualism, and natural rights. Progressivism was inspired by the torrent of economic growth and urbanization that was late nineteenth century America. Mass-scale industrialization had turned the autonomous individual into a relic. “Society shaped and made the individual rather than the other way around,” writes Leonard. “The only question was who shall do the shaping and molding” (p. 23). Naturally, the progressives chose themselves for that task.

Much of the book is devoted to eugenics. Defined as the effort to improve human heredity through selective breeding, the now-defunct pseudoscience was a pillar of early 20th century progressivism. Leonard argues that eugenics fit snugly into the movement's faith in social control, economic regulation, and Darwinism (p. 88). But Darwin was ambiguous on whether natural selection resulted not only in change but also in progress. This gave progressive biologists and social scientists a chance to exercise their self-styled expertise. To the eugenicists, random genetic variance and the survival of inferior traits were useless; what was needed was social selection - reproduction managed from above to ensure the proliferation of the fit and the removal of the unfit (p. 106). Experts could expose undesirables and remove them from the gene pool. Forced sterilization and racial immigration quotas were popular methods.

 

 

The book's most memorable chapter analyzes minimum wage legislation. These days this novelty of the administrative state is taken for granted - many on the left currently argue that raising the wage floor doesn't destroy jobs - but Leonard finds its roots in Progressive Era biases against market exchange, immigrants, and racial minorities. Assuming that employers always hire the lowest-cost candidates and that non-Anglo-Saxon migrants (as a function of their supposedly inferior race) always underbid the competition, certain progressives undertook to push them out of the labor market. Their tool was the minimum wage. Writes Leonard:

The economists among labor reformers well understood that a minimum wage, as a wage floor, caused unemployment, while alternative policy options, such as wage subsidies for the working poor, could uplift unskilled workers without throwing the least skilled out of work … Eugenically minded economists such as [Royal] Meeker preferred the minimum wage to wage subsidies not in spite of the unemployment the minimum wage caused but because of it (p. 163).

 

In the hands of a lesser author, this book could have been a partisan attack on American liberalism, and one that would find a welcoming audience in the current political landscape. Leonard deftly stands above the left-right fray. Rather than give ammunition to the right he argues that progressivism attracted people from both ends of the political spectrum. Take Teddy Roosevelt, a social conservative and nationalist who nonetheless used the presidency to promote a progressive agenda. “Right progressives, no less than left progressives were illiberal, glad to subordinate individual rights to their reading of the common good. American conservative thinking was never especially antistatist”, Leonard writes (p. 39). Furthermore, eugenics had followers among progressives, conservatives, and socialists alike. The true enemy of progressivism? Classical liberalism, the belief that society is a web of interactions between individuals and not a collective “social organism.”

 

Insights for today?

Leonard combines rigorous research with lucid writing, presenting a work that is intellectually sound, relevant, and original. Readers should take his insights to heart when asking how much of the Progressive Era still lives in 2016. The answer is not simple. Contemporary progressives like Clinton and Sanders certainly don’t espouse biological racism. For those who whip up anti-immigrant sentiment to win votes, “progressive” is a dirty word, not a badge of honor. Moreover, the American left long ago abandoned attempts to control the economy via technocratic experts.

But that doesn’t tell the whole story. Modern progressives still place a disturbing amount of faith in the administrative state and a lack of it in market exchange. Leonard closes by arguing that the Progressive Era lives on: “Progressivism reconstructed American liberalism by dismantling the free market of classical liberalism and erecting in its place the welfare state of modern liberalism.” (p. 191). It is up to the reader to decide whether that is something to be lauded or fought against.

 

Did you find the article interesting? If so, share it with others by clicking on one of the buttons below.

 

You can buy the book Illiberal Reformers: Race, Eugenics and American Economics in the Progressive Era, by Thomas C. Leonard here: Amazon US | Amazon UK

 

Joseph Larsen is a political scientist and journalist based in Tbilisi, Georgia. He writes about the pressing issues of today, yesterday, and tomorrow. You can follow him on Twitter @JosephLarsen2.

 

[1] “Confidence in Institutions.” Gallup.com. Accessed January 29, 2016. http://www.gallup.com/poll/1597/confidence-institutions.aspx/.

For most of us, cocaine brings to mind the image of drug-fueled discos or wealthy Wall Street stockbrokers, feeding an insatiable habit. However, the history of this addictive stimulant is a far more interesting tale than one might imagine. Liz Greene explains.

An 1885 advert for children's cocaine toothache drops.


The story of cocaine starts in the high mountain ranges of South America, where native Peruvians chewed the leaves of the coca plant in order to increase energy and strength. The stimulating effects of the leaf sped breathing, raising the oxygen level in their blood and countering the effects of living in thin mountain air. Once the Spanish arrived in the 1500s, word of the coca plant and its interesting effects began to spread.

 

The Wonder Drug

In 1859, German chemist Albert Niemann isolated, extracted, and named the purified alkaloid cocaine from a batch of coca leaves transported from South America. Despite the detailed information he provided on the alkaloid in his dissertation, it wouldn’t be until later in the century that its effects were recognized in the medical community.

As medical experiments testing cocaine's analgesic properties began, other doctors were studying the drug's more stimulating traits. In 1883, Theodor Aschenbrandt, a German army physician, administered cocaine to soldiers in the Bavarian Army. He reported that the drug reduced fatigue and enhanced the soldiers' endurance during drills. These positive findings were published in a German medical journal, where they came to the attention of the famed psychoanalyst Sigmund Freud.

Freud's findings on cocaine were based largely on his own experience with the drug. Not only did he use it regularly, he also prescribed it to his girlfriend, his best friend, and his father. In July 1884, he published Über Coca, a paper promoting cocaine as a treatment for everything from depression to morphine addiction. He concluded,

Absolutely no craving for the further use of cocaine appears after the first, or even after repeated taking of the drug...

 

Unfortunately, he was not only wrong, he was already addicted.

 

A Wider Audience

Inspired by Paolo Mantegazza's reports of coca use in Peru, the French chemist Angelo Mariani developed a new drink concocted from claret and cocaine. With 6 milligrams of cocaine in every ounce, Vin Mariani became extremely popular, even among such heavy hitters as Queen Victoria, Pope Leo XIII, and Pope Saint Pius X.

Motivated by the success of Vin Mariani, a drugstore owner in Columbus, Georgia decided in 1885 to formulate his own version. Unfortunately for John Pemberton, the county in which he lived passed prohibition legislation, forcing him to come up with a new recipe for his French Wine Nerve Tonic. In 1886 he created a new, nonalcoholic version based on both coca and kola nut extracts - giving rise to the name Coca-Cola. The euphoric and energizing effects of the drink helped its popularity skyrocket by the turn of the century. Until 1903, a standard serving contained around 60mg of cocaine.

But cocaine wasn’t limited to beverages. Throughout the early 1900s, unregulated patent medicines containing cocaine were sold en masse. Toothache drops, nausea pills, analgesic syrups — all were easy to obtain, and far more addictive than consumers realized. By 1902 there were an estimated 200,000 cocaine addicts in the United States.

An 1890s advert for Vin Mariani tonic wine.


A Serious Problem

As cocaine use in society increased, the dangers of the drug became more evident. In 1903, the New York Tribune ran an exposé linking cocaine to crime in America, pressuring the Coca-Cola Company to remove cocaine from the soft drink. Eleven years later, the Harrison Narcotic Act came into effect, regulating the manufacture and dispensing of cocaine in the United States. With the passing of the Narcotic Drugs Import and Export Act in 1922, cocaine became so heavily regulated that usage began to decline sharply - and continued to do so through the 1960s.

In 1970, the Controlled Substances Act was signed into law by President Richard Nixon. It classified cocaine as a Schedule II controlled substance, meaning the drug could only be possessed with a practitioner's written prescription. This allowed cocaine to still be used medically as a topical anesthetic, but not recreationally.

The passing of the Controlled Substances Act didn’t stop the popular media of the time from portraying cocaine as fashionable and glamorous. Rock stars, actors, and other popular figures of the time brandished paraphernalia like a trendy accessory, and America’s urban youth were watching.

Around this same time, a new, crystallized form of cocaine — known as crack — appeared. This cheaper alternative to cocaine made a name for itself in low-income communities during the 1980s. With such a high rate of addiction, users were willing to do almost anything for their next hit — leading to a dramatic rise in crime and a moral panic labeling crack as an epidemic.

Though cocaine use has steadily declined in recent years, the drug still attracts about 1,600 new users each day. More than 40,000 people die from drug overdoses each year in the U.S. - around 5,000 of them due to cocaine. It seems cocaine isn't quite ready to let go of its place in society - nor does it appear to be going away anytime soon.

 

Liz Greene is a dog loving, beard envying, history and pop culture geek from the beautiful city of trees, Boise, Idaho. You can catch up with her latest misadventures on Instant Lo or follow her on Twitter @LizVGreene.

 

Did you find this article interesting? If so, tell the world – tweet about it, like it, or share it by clicking on one of the buttons below…


In 1860 Western forces burned the Old Summer Palace, a wonderful and magnificent palace complex to the northwest of Beijing, China. British and French troops pillaged the palace and then burned it to the ground in a terrifying act during the Second Opium War. Here, Scarlett Zhu explains what happened and the responses to the attack.

The looting of the Summer Palace by Anglo-French forces in 1860.


"We call ourselves civilized and them barbarians," wrote the outraged author, Victor Hugo. "Here is what Civilization has done to Barbarity."

One of the deepest, unhealed and entrenched historical wounds of China stems from the destruction of the country's most beautiful palace in 1860 - the burning of the Old Summer Palace by the British and French armies. As Charles George Gordon, a soldier of the force, wrote about his experience, one can "scarcely imagine the beauty and magnificence of the places being burnt."

 

The palace, which once boasted the most extensive and invaluable art collection in China, became a site of ruins within three days at the hands of some 3,500 screaming soldiers and their burning torches. Dense smoke and ashes eclipsed the sky, marble arches crumbled, and sacred texts were torn apart. At the heart of this merciless act stood Lord Elgin, the British High Commissioner to China, a man who preferred revenge and retaliation to peace talks and compromise, and one highly sensitive to any injustice or humiliation suffered by his own country. The burning was thus a response to the imprisonment and torture of delegates sent to negotiate the Qing dynasty's surrender. However, as modern Chinese historians would argue, this was a far-from-satisfactory excuse for such wickedness: even before the imprisonment took place, there had already been extensive looting by the French and British soldiers, and the burning was only "the final blow".

The treasures of the Imperial Palace were irresistible and within the reach of the British and French. Officers and men seemed to have been seized with temporary insanity, said one witness; in body and soul they were absorbed in one pursuit: plunder. The British and the French helped themselves to the porcelain, the silk and the ancient books - an estimated 1.5 million ancient Chinese relics were taken away. The extent of this rampant abuse was underlined by the deaths of the Emperor's courtiers, eunuch servants and maids in the flames - many estimates place the death toll in the hundreds. This atrocious indifference towards human life inflamed international opposition, notably illustrated by Hugo's blistering criticism.

 

The response to the attack

But there was no significant resistance to the looting, even though many Qing soldiers were in the vicinity - perhaps they had already resigned themselves to the reality of colonial oppression, or did not trouble themselves over the painful loss suffered by the often-distant imperial family. The Emperor, Xianfeng, was no indifferent spectator, however; in fact, he was said to have vomited blood upon hearing the news.

There was evidence, though, to suggest that some soldiers did feel this was "a wretchedly demoralizing work for an army". As James M'Ghee, chaplain to the British forces, wrote in his narrative, he would "ever regret the stern but just necessity which laid them in ashes". He later acknowledged that it was "a sacrifice of all that was most ancient and most beautiful", yet he could not tear himself away from the palace's vanished glory. The historian Greg M. Thomas has gone so far as to argue that the French ambassador and generals refused to participate in the destruction because it "exceeded the military aims of their mission" and would do irreparable damage to an important cultural monument.

Nowadays, what is left of the palace are the gigantic marble and stone blocks that once formed the backdrops of the European-style fountains in a distant corner of the Imperial gardens, built for entertaining the Emperor; the structures made of timber and tile did not survive the fires. The remains act as a somber reminder of the West's ransacking and the East's "century of humiliation".

This is more than a story of patriotism, nationalism and universal discontent. History teaches us that patriotism by itself isn't history, but rather propaganda in disguise. Yet how could one ignore or omit a historical event so demoralizing and compelling in its own right that it is no longer a matter of morality and dignity, but of seeking the truth, tracing the past and its inseparable link with the present? Considering the savage and blatant destruction of the Old Summer Palace, along with the unspoken hatred of the humiliated and the suppressed, it seems appropriate to end with the cries of the enraged Chinese commoners as they witnessed one of mankind's worst atrocities: "Kill the foreign devils! Kill the foreign devils!"

 

Did you find this article of interest? If so, tell the world – tweet about it, like it, or share it by clicking on one of the buttons below.

Bibliography

1. Hugo, Victor. The sack of the summer palace, November 1985

2. Bowlby, Chris. "The palace of shame that makes China angry"

3.  M'Ghee, Robert. How we got to Pekin: A Narrative of the Campaign in China of 1860, pp. 202-216, 1862

4. "The Burning of the Yuan Ming Yuan: 150 Years Later", http://granitestudio.org/2010/10/24/the-burning-of-the-yuanmingyuan-150-years-later

5. "Fine China, but at what cost?”, http://thepolitic.org/fine-china-but-at-what-cost/

In this article Janet Ford discusses the horrific act of infanticide in the nineteenth century with the help of records from London’s Old Bailey court – with cases from London and (from 1856) further afield. It provides an insightful look into this terrible crime in Victorian England…

The Old Bailey in the early nineteenth century.


In the nineteenth century there were 203 cases of infanticide recorded at the Old Bailey.

Of the 203 cases, 83 people were found guilty, 114 were found not guilty, and one received a 'misc' verdict. Of the 83 found guilty, only 18 were actually found guilty of killing - three of those being found insane and two receiving a 'recommendation' - while 65 were not guilty of killing but guilty of the lesser crime of concealing the birth. This shows that even though infanticide was a highly emotional and shocking crime, women were not automatically found guilty. The reason so many were found not guilty of killing was often medical evidence, such as the health of the baby and mother. There was also an increased involvement of character witnesses in the courts, who could explain a person's background, and an increased interest in the criminal mind, especially that of women. Finally, there was more of an understanding of childbirth itself.

 

What the cases show about the crime and society

The role of Medical people

As all the cases involved doctors, surgeons or midwives, there was clearly a need and a desire for physical evidence, rather than just hearsay, in order to reach the right verdict and deliver justice. These medical witnesses had knowledge and experience of all types of childbirth, and so could give evidence on whether a death was accidental, deliberate, or too difficult to tell.

 

What it shows about Childbirth and its effects on crime

The records show two main aspects of childbirth: the physical effect on the baby and the emotional aspect. The emotional aspect of childbirth was the shame of having a baby out of wedlock - but also of having the father run out during the pregnancy, not being sure who the father was, not wanting to be a single mother, or sexual assault. It meant that women felt they had to injure or kill their baby, conceal the birth or self-deliver. They were seen as criminals, which many were, but many were also victims of social attitudes and even of crimes themselves. The physical aspect of childbirth was the consequence of these pressures, as women felt they had to deliver on their own, meaning there was no other person to help if the delivery was difficult. An example of the physical effect can be seen in the statement given by Doctor Thomas Green in Ellen Millgate's case.

Health of the mother and child

The cases show that the health of both the mother and the baby were taken into consideration and used as evidence. The health of the mother - such as whether she was epileptic - would have affected her ability to care for the baby properly. Poor health helped the mother's case, as it was out of her control, as did the baby being premature. An example of health being used as evidence is shown in the case of Ellen Middleship, who was found not guilty.


Born alive

One of the main reasons so many were found not guilty, or guilty only of concealing the birth, was that the baby had been born dead. This was out of the mother's control, and so she would be found not guilty. In many cases it was too difficult to tell whether the baby had been born alive, as shown in the case of Elizabeth Ann Poyle.

Personal aspects

Along with medical evidence, personal aspects were also taken into consideration: good character, age, previous children and the relationship with the father were all taken into account. These elements could show that the mother could not have committed the crime, as it was out of character, or at least they helped to lessen the punishment, as happened with many women. An example is Sarah Jeffery's statement about Jane Hale, who was found guilty of concealing the birth but not of killing.

Violence

 

The most shocking aspect of the cases, whether the women were found guilty or not guilty, was violence. Injuries could have been caused by cutting the cord, getting the child out, falling, or hitting. This was one of the most difficult aspects of a case, as it could be hard to determine whether injuries were caused by the birth or inflicted on purpose. What helped resolve this was medical knowledge, an understanding of childbirth, and eyewitness accounts. The understanding of childbirth helped to explain why there were marks on, for example, the neck and head: ribbons or rope were sometimes used to get the baby out, or the baby fell during childbirth. Even though such marks were not made on purpose, they are still shocking to read about, as in Ellen Millgate's case, where the marks were around a vulnerable part of the baby. It was only in a few cases, with the help of eyewitness accounts, that injuries were determined to have been inflicted on purpose. An example can be seen in Ann Dakin's evidence in the case of Joseph Burch and Caroline Nash, who were both found guilty and given four years' penal servitude.

It is one of the most shocking cases due to the violence, and a reminder that parents could abuse their own children. As with many of the other guilty cases, it also shows that women could be quite cruel and violent. Another element of violence was the disposal of the body. The main example comes from James Stone's description of what he found in Martha Barratt's room; she was found guilty of concealing the birth but not of killing.

Mercy towards women

Even with the violence, and the shame attached to the crime, the verdicts and punishments show that there was understanding of and sympathy towards women, as the majority were found not guilty of infanticide or guilty of a lesser crime. This was due to a better understanding of women, society, childbirth, and the criminal mind over the course of the century.

The cases show that infanticide was a very complex crime, involving and affected by many factors - health, childbirth, social attitudes, violence and high levels of emotion. It also shows the various sides of the 19th century…

 

If you found this article of interest, do tell others. Tweet about it, like it, or share it by clicking on one of the buttons below…

References

Anne-Marie Kilday, A history of infanticide in Britain, c. 1600 to the present (Palgrave Macmillan, 2013)

M Jackson, Infanticide: historical perspectives on child murder and concealment, 1550-2000 (Ashgate, 2002)

Old Bailey Online, January 1800-December 1899, Infanticide 

Ellen Millgate, 28th November 1842

Ellen Middleship, 21st October 1850

Elizabeth Ann Poyle, 22nd May 1882

Jane Hale, 28th November 1836

Joseph Nash and Caroline Nash, 24th October 1853

Martha Barratt, 9th April 1829

Body parts and the strangeness of human anatomy have fascinated people for centuries, and they have been displayed and collected for a long time. Here, Rebecca Anne Lush takes a look at how displays of 'medical marvels' have progressed through the ages…

An old scene from the Hunterian Museum in London.


With contents that both fascinate and repulse, it is no wonder medical museums continue to entice visitors. Gunther von Hagens' Body Worlds has attracted thousands of visitors worldwide since its first exhibition in Tokyo in 1995. Today there are nine exhibitions on display across the world, with a further four planned in the near future, so it appears this museum has sustained the public's interest. According to its mission statement, it endeavors to teach the public the ins and outs of anatomy. Body Worlds is not alone: the Mütter Museum in Philadelphia and the Hunterian and Wellcome Museums in London also continue to engage the public with their morbid and fascinating specimens.

The history of medical museums is incredibly rich, filled with mystery and mayhem, curiosity and control. In the Victorian era especially, they came to represent a conflict between the professional and the public. No longer could an individual pay a small fee to sit in on an autopsy and leave with a qualification; as the Victorian era progressed, pathology and anatomy schools both professionalized and specialized. Their conflict with the public realm is a curious case indeed.

 

Before the nineteenth century

Body parts have been displayed for centuries, serving multiple purposes. It can be argued that the medieval churches displaying relics and reliquaries were amongst the earliest exhibitors in the Western world.

The collection and display of body parts became a more secular practice during the Renaissance. So-called Cabinets of Curiosities allowed avid collectors to organize their specimens and exhibit them to the public. Such cabinets could include human rarities to please and entertain visiting crowds.

It was not until the seventeenth century, however, that anatomical specimens were more carefully collected, labeled, and stored in permanent institutions. Many anatomy teachers during this period held private collections to increase their credibility. In the eighteenth century, two very famous brothers, William and John Hunter, collected anatomical specimens en masse, collections that later passed to the Royal College of Surgeons. This was also a time of commercial anatomical displays, such as freak shows and travelling exhibitions of human oddities.

 

Dr Kahn’s Anatomical Museum

Such early examples were the foundations for Victorian public and professional medical museums. No public medical museum was more influential than Dr Kahn's Anatomical Museum. Joseph Kahn, a self-professed medical doctor, moved from Alsace to London, opening his anatomical museum there in 1851. Initially entry was restricted to males who could afford the fee of two shillings. After two months, however, women were allowed inside during specific viewing times. Eight years later, the admission price was halved to one shilling, attracting larger crowds and more inquisitive minds.

On entering the exhibition space, visitors encountered an anatomical wax Venus, whose organs could be removed. The rest of the museum consisted of wax models, specimens held in jars, and special "doctors-only" rooms. Medical doctors frequented Dr Kahn's until its closure in 1864.

 

Dr. Kahn.


Professional Museums

Developing alongside these public spectacles were the more professional museums, belonging to hospitals, pathology societies, private schools, universities and Royal Colleges.

These more formal institutions collected specimens to aid medical education. Acquiring both abnormal and normal specimens increased levels of anatomical knowledge and encouraged anatomy to transform into a professional activity aimed at improving standards of health. Although some were open to the public, the majority were kept under lock and key.

 

Conflict

In 1857 the Obscene Publications Act prevented any 'obscene' anatomy from being displayed in a public setting. Dr Kahn's museum was deemed immoral under this act, resulting in its eventual closure. Other public anatomy museums continued to operate until the mid-1870s.

Both professional and public museums strove to be centers of education. At first, the professionals admired Dr Kahn's museum, especially the rooms dedicated to their study. Not only were early opinions favorable, but there is also evidence of close relationships. Robert Abercrombie, for example, affiliated himself with the Strand Museum in London, establishing a consultation room next to the museum. Visitors were able not only to visit the museum, but also to receive medical care on site.

As the Victorian era progressed and anatomy became specialized, these public museums came to be regarded as inappropriate venues for disseminating such medical information. Ongoing legal and social battles ensured that the professional schools of anatomy and pathology alone remained the stakeholders of the industry. It was a war of words, with the professionals writing at length in their medical journals about their distrust and disgust.

 

Today

It is quite interesting to see another shift occurring in the past few decades. Today even the more professional museums from the Victorian era are open to the wider public. No longer is all medical information guarded by the elite and trained; it is accessible to anyone who wants to learn. Accompanying this, public medical museums displaying wax models are again appearing on the medical landscape. The curious case of medical marvels is a comment on how medical museums have developed and transformed to meet the human desire for knowledge.

 

Did you find this article interesting? If so, tell us why below…

References

Alberti, Samuel J. M. M. Morbid Curiosities: Medical Museums in Nineteenth- Century Britain. Oxford: Oxford University Press, 2011.

Bates, A. W. “Dr Kahn’s Museum: obscene anatomy in Victorian London.” Journal of the Royal Society of Medicine 99, no. 12 (2006): 618-624.

Bates, A. W. “Indecent and Demoralizing Representations: Public Anatomy Museums in mid-Victorian England.” Medical History 52, no. 1 (2008): 1-22.

Kahn, Dr. Joseph. Catalogue of Dr Kahn’s Celebrated Anatomical Museum. Leicester Square: W. J. Golbourn, 1853.

Kesteven, W. B. “The Indecency of the Exhibition of Dr Kahn’s Museum.” Letter. The British Medical Journal 1, no. 49 (1853): 1094.

“Medical News: Dr Kahn’s Anatomical Museum.” The Lancet 1, no. 1443 (April 26, 1851): 474.

Stephens, Elizabeth. Anatomy as Spectacle: Public Exhibitions of the Body from 1700 to the Present. Liverpool: Liverpool University Press, 2011.

It may seem strange, but there is very strong evidence that the White House killed a number of presidents in the mid-nineteenth century. The deaths of Zachary Taylor, William Henry Harrison, and James K. Polk are all linked to something in the White House – although many believed that some presidents were poisoned by their enemies. William Bodkin explains all…

A poster of Zachary Taylor, circa 1847. He is one of the presidents the White House may have helped to kill...

Being President of the United States is often considered the most stressful job in the world.  We watch fascinated as Presidents prematurely age before our eyes, greying under the challenges of the office.  Presidential campaigns have become a microcosm of the actual job, with the conventional wisdom being that any candidate who wilts under the pressures of a campaign could never withstand the rigors of the presidency.  But there was a time, not so long ago, when it was not just the stress of the job that was figuratively killing the Presidents.  In fact, living in the White House was, in all likelihood, literally killing them.

Between 1840 and 1850, living in the White House proved fatal for three of the four Presidents who served.  William Henry Harrison, elected in 1840, died after his first month in office.  James K. Polk, elected in 1844, died three months after he left the White House.  Zachary Taylor, elected in 1848, died about a year into his term, in 1850.  The only President of the period to survive was John Tyler, who succeeded to the Presidency on Harrison’s death.  What killed these Presidents?  Historical legend tells us that William Henry Harrison “got too cold and died” and that Zachary Taylor “got too hot and died.”  But the truth, thanks to recent research, indicates that Harrison, Taylor, and Polk may have died from similar strains of bacteria coursing through the White House water supply.


Conspiracies and Legends

On July 9, 1850, President Zachary Taylor, Old Rough and Ready, former general and hero of the Mexican-American War, succumbed to what doctors called at the time “cholera morbus,” or, in today’s terms, gastroenteritis.  On July 4, 1850, President Taylor sat out on the National Mall for Independence Day festivities, including the laying of the cornerstone for the Washington Monument.  Taylor, legend has it, indulged freely in refreshments that day, including a bowl of fresh cherries and iced milk.  Taylor fell ill shortly after returning to the White House, suffering severe abdominal cramps.  The presidential doctors treated Taylor with no success.  Five days later, he was dead.

Taylor’s death shocked the nation.  Rumors began circulating immediately concerning his possible assassination.  The rumors arose for a good reason.  Taylor, a Southerner, opposed the growth of slavery in the United States despite being a slave owner himself.  While President, Taylor had worked to prevent the expansion of slavery into the newly acquired California and Utah territories, then under the control of the federal government.  Taylor prodded those future states, which he knew would draft state constitutions banning slavery, to finish those constitutions so that they could be admitted to the Union as free states.

Taylor’s position infuriated his southern supporters, including Jefferson Davis, who had been married to Taylor’s late daughter, Knox.  Davis, who would go on to be the first and only President of the Confederate States of America, had campaigned vigorously throughout the South for Taylor, assuring Southerners that Taylor would be friendly to their interests.  But in truth, no one really knew Taylor’s views.  A career military man, Taylor hewed to the time honored tradition of taking no public positions on political issues.  Taylor believed it was improper for him to take political positions because he had sworn to serve the Commander-in-Chief, without regard to person or party.  Indeed, he had never even voted in a Presidential election before running himself.

Tensions between Taylor and the South grew when Henry Clay proposed his Compromise of 1850, which offered something for every interest.  The slave trade would be abolished in the District of Columbia, but the Fugitive Slave Law would be strengthened.  The bill also carved out new territories in New Mexico and Utah.  The Compromise would allow the people of the territories to decide by popular vote whether those territories would be slave or free, circumventing Taylor’s effort to have slavery banned in their state constitutions.  But Taylor blocked passage of the compromise, even threatening in one exchange to hang the Secessionists if they chose to carry out their threats.


More speculation

Speculation on the true cause of Taylor’s death only increased throughout the years, particularly after his former son-in-law, Davis, who had been at Taylor’s bedside when he died, became President of the Confederacy.  The speculation reached a fever pitch in the late twentieth century, when a University of Florida professor, Clara Rising, persuaded Taylor’s closest living relative to agree to an exhumation of his body for a new forensic examination.  Rising, who was researching her book The Taylor File: The Mysterious Death of a President, had become convinced that Taylor was poisoned.  But the team of Kentucky medical examiners assembled to examine the corpse concluded that Taylor was not poisoned and had died of natural causes, i.e. something akin to gastroenteritis, an illness undoubtedly exacerbated by the conditions of the day.

But what caused Taylor’s fatal illness?  Was it the cherries and milk, or something more insidious?   While the culprit lurked in the White House when Zachary Taylor died, it was not at the President’s bedside, but rather, in the pipes.

During the first half of the nineteenth century, Washington D.C. had no sewer system; one was not built until 1871.  The website of the city’s water and sewer authority notes that by 1850, most of the streets along Pennsylvania Avenue had spring or well water piped in, creating the need for sanitary sewage disposal.  Sewage was simply discharged into the nearest body of water or, with literally nowhere else to go, seeped into the ground, forming a fetid marsh.  Perhaps even more shocking, the White House water supply itself was just seven blocks downstream from a depository for “night soil,” a euphemism for human feces collected from cesspools and outhouses.  This depository, which likely contaminated the White House’s water supply, would have been a breeding ground for salmonella bacteria and the gastroenteritis that typically accompanies it.  Ironically, the night soil deposited a few blocks from the White House had been brought there by the federal government.


Something in the water

It should come as no surprise, then, that Zachary Taylor succumbed to what was essentially an acute form of gastroenteritis.  The cause of Taylor’s gastroenteritis was probably salmonella bacteria, not cherries and iced milk.  James K. Polk, too, reported frequently in his diary that he suffered from explosive diarrhea while in the White House.  For example, Polk’s diary entry for Thursday, June 29, 1848 noted that “before sun-rise” that morning he was taken with a “violent diarrhea” accompanied by “severe pain,” which rendered him unable to move.  Polk, a noted workaholic, spent nearly his entire administration tethered to the White House.  Weakened by years of gastric poisoning, Polk succumbed, reportedly like Taylor, to “cholera morbus” a mere three months after leaving office.

The White House is also a leading suspect in the death of William Henry Harrison. History has generally accepted that Harrison died of pneumonia after giving what remains the longest inaugural address on record, in a freezing rain without benefit of hat or coat.  However, Harrison’s gastrointestinal tract may have been a veritable playground for the bacteria in the White House water.

Harrison suffered from indigestion most of his life.  The standard treatment then was to use carbonated alkali, a base, to neutralize the gastric acid.  Unfortunately, in neutralizing the gastric acid, Harrison removed his natural defense against harmful bacteria.  As a result, it might have taken far less than the usual concentration of salmonella to cause gastroenteritis.  In addition, Harrison was treated during his final illness with opium, standard at the time, which slowed his body’s ability to rid itself of bacteria, allowing them more time to enter his bloodstream.  It has been noted that, as Harrison lay dying, he had a sinking pulse and cold, blue extremities, which is consistent with septic shock.  Did Harrison die of pneumonia?  Possibly.  But the strong likelihood is that the pneumonia was secondary to gastroenteritis.

Nor was this phenomenon limited to the mid-nineteenth century Presidents.  In 1803, Thomas Jefferson mentioned in a letter to his good friend, fellow founder Dr. Benjamin Rush, that “after all my life having enjoyed the benefit of well formed organs of digestion and deportation,” he was taken, “two years ago,” after moving into the White House, “with the diarrhea, after having dined moderately on fish.”  Jefferson noted he had never had it before.  The problem plagued him for the rest of his life.  Early reports of Jefferson’s death even stated that he had died of dehydration caused by diarrhea.

Presidents after Zachary Taylor fared better once D.C. built its sewer system.  The second accidental President, Millard Fillmore, lived another twenty years after succeeding Zachary Taylor.  But what about the myths surrounding these early Presidential deaths?  They were created, in part, by a lack of medical and scientific understanding of what really killed these men.  With the benefit of modern science we can turn a critical eye on these myths.  But we should not forget that myth-making can serve an important purpose beyond simple deception.  In the case of Zachary Taylor, it provided a simple explanation for his unexpected death.  Suspicion or accusations of foul play would have further inflamed the two sides of the slavery question, which erupted into Civil War a decade later, perhaps even starting that war before Lincoln’s Presidency.  In Harrison’s case, the overcoat explanation helped the country get over the shock of the first President dying in office and permitted John Tyler to establish the precedent that the Vice-President became President upon the death of a President.  In sum, these nineteenth century myths helped the still new Republic march on to its ever brighter future.


What did you think of today’s article? Do you think it was the water that killed several Presidents? Let us know below…


Finally, William's previous pieces have been on George Washington (link here), John Adams (link here), Thomas Jefferson (link here), James Madison (link here), James Monroe (link here), John Quincy Adams (link here), Andrew Jackson (link here), Martin Van Buren (link here), William Henry Harrison (link here), John Tyler (link here), and James K. Polk (link here).


Sources

  • Catherine Clinton, “Zachary Taylor,” essay in “To The Best of My Ability:” The American Presidents, James M. McPherson, ed. (Dorling Kindersley, 2000)
  • Letter, Thomas Jefferson to Benjamin Rush, February 28, 1803
  • Milo Milton Quaife, ed., “Diary of James K. Polk During His Presidency, 1845-1849” (A.C. McClurg & Co., 1910)
  • Jane McHugh and Philip A. Mackowiak, “What Really Killed William Henry Harrison?” New York Times, March 31, 2014
  • Clara Rising, “The Taylor File: The Mysterious Death of a President” (Xlibris 2007)

Nineteenth-century writer and poet Margaret Fuller died tragically in 1850. And it was the writer Ralph Waldo Emerson who was perhaps most devastated by the loss. Here Edward J. Vinski looks at the fascinating relationship between them and what happened after Fuller’s passing.

A nineteenth century engraving of Margaret Fuller.

Margaret Fuller

“On Friday, 19 July, Margaret dies on the rocks of Fire Island Beach within sight of & within 60 rods of the shore. To the last her country proves inhospitable to her.” (Emerson, 1850/1982, p. 511)

 

The Margaret to whom Ralph Waldo Emerson referred is Margaret Fuller, a writer and poet associated with American transcendentalism in the nineteenth century. Born in 1810, Fuller was educated under her father’s direction. Timothy Fuller’s tutelage was both intense and, in its own way, fortuitous. He began her instruction in Latin when she was but six years of age. Her lessons would last throughout the day, and young Margaret was often sent to bed overtaxed and unable to sleep. In spite of the nausea, bad dreams and headaches she incurred, Margaret appreciated that he held her to the same standards to which he would have held a son (Richardson, 1995).

Although they had mutual friends, Fuller and Emerson did not meet until the summer of 1836 when Fuller paid a three-week visit to the Emerson home in Concord, Massachusetts. Prior to this, she had attended some of Emerson’s talks and had wished to meet him for some time, but it was only after he read her translation of Goethe’s Tasso that Emerson returned the interest and offered her the long-awaited invitation (Richardson, 1995). Thus began a relationship that would have a profound effect on both of them.

 

Fuller and Emerson

Richardson (1995) has remarked that “Fuller took less from Emerson than either Thoreau or Whitman, and she probably gave him more than either of them” (pp. 239-240). Perhaps more than any person other than his deceased first wife, Ellen, Fuller knew best how to pierce the armor of his innermost life. Nowhere is this more clearly evident than in the fact that following their initial meeting, Emerson finished his book Nature, which had been drifting toward theoretical idealism. Fuller, according to Richardson (1995), pushed him toward an “idealism that is concerned with ideas only as they can be lived […] with the spiritual only when it animates the material” (p. 240).

Fuller, however, took from Emerson as well.  “From him,” she wrote, “I first learned what is meant by an inward life” (Fuller, n.d., as cited in Bullen, 2012, Chapter V, para 4). She had long searched for an intellectual mentor, and by the time of her first visit to Emerson she feared she might never find one. In Emerson, she found someone with whom she could share her ideas as well as her intimacies. As their relationship developed, however, it became clear that she wanted even more from Emerson. Since no written record of her requests survives, precisely what she asked of him is difficult to discern. Although married, he was clearly conflicted by his feelings for her. In his journal, he confessed that she was someone “Whom I always admire, most revere and sometimes love” (Emerson, 1841/1914, p. 167), and in a later entry he recorded a nighttime river walk with her. Whatever the case may be, it is clear that Emerson’s second wife, Lydian, saw Fuller as a threat (Allen, 1981).

After editing The Dial, a transcendentalist magazine, for several years, Fuller left America for Europe in the summer of 1846 as a correspondent for the New York Tribune. After some time in England, she relocated to Italy with her husband, Giovanni Ossoli[1], a marquis who supported the Italian revolution. Fuller and her husband both took an active role in the revolution, and she chronicled its events in a book she had hoped to publish. When the revolt finally failed, the family, which now included a young son, was forced to return to America. Their ship, the Elizabeth, met with bad luck almost immediately. At Gibraltar, the captain died of smallpox, leaving the ship under the direction of its first mate. In the early morning of July 19, 1850, the ship ran aground on a sandbar a few hundred meters off Fire Island, NY. Later that day, Margaret Fuller, her husband, and her child drowned when the ship broke up.

 

Thoreau’s Mission

News of the disaster reached Concord some days later. On or about July 21, Emerson made the journal entry quoted above. In a letter to Marcus Spring, dated July 23, Emerson wrote:

At first, I thought I would go myself and see if I could help in the inquiry at the wrecking ground and act for the friends. But I have prevailed on my friend, Mr Henry D. Thoreau, to go for me and all the friends. Mr Thoreau is the most competent person that could be selected and […] he is authorized to act for them all (Emerson, 1850/1997, p. 385).

 

Emerson doubted that any manuscripts would have survived the wreck, but knowing that Fuller would have had with her the manuscript to her History of the Italian Revolution, he was willing to pay whatever costs Thoreau might incur in his attempt to salvage it.

Thoreau, for his part, set out immediately. On July 25, he wrote to Emerson describing what details he had learned of the disaster:

…the ship struck at ten minutes after four A.M., and all hands, being mostly in their nightclothes, made haste to the forecastle, the water coming in at once […] The first man got ashore at nine; many from nine to noon. At flood tide, about half past three o’clock, when the ship broke up entirely, they came out of the forecastle, and Margaret sat with her back to the foremast, with her hands on her knees, her husband and child already drowned. A great wave came and washed her aft. The steward had just before taken her child and started for shore. Both were drowned (Thoreau, 1850/1958a, p. 262).

 

Margaret Fuller’s remains and those of her husband were never found. Her son’s body washed ashore, dead but still warm. A desk, a trunk, and a carpet bag were recovered from the scene, but none of Margaret’s valuable papers were found. Thoreau promised to do what he could, holding out some hope that, since a significant part of the wreckage remained where the ship ran aground, some items might still be salvaged, but it is clear that he was not confident.

In a letter to abolitionist and future Senator Charles Sumner, whose brother Horace was also aboard, Thoreau wrote:

I saw on the beach, four or five miles west of the wreck, a portion of a human skeleton, which was found the day before, probably from the Elizabeth, but I have not knowledge enough of anatomy to decide confidently, as many might, whether it was that of a male or a female (Thoreau, 1850/1958b, p. 263).[2]

 

After visiting nearby Patchogue, New York, where many of those who had scavenged the wreckage instead of attempting a rescue were thought to reside, he returned to Fire Island empty-handed.

In all, Thoreau’s mission was unproductive. “I have visited the child’s grave,” he wrote to Emerson. “Its body will probably be taken away today” (Thoreau, 1850/1958a, p. 262). The corpse of her son, a few insubstantial papers, and a button pried from her husband’s jacket by Thoreau himself were essentially the only relics of Margaret Fuller to return to Massachusetts.

 

Conclusion

The relationship between Emerson and Margaret Fuller is enigmatic. She was not only his intellectual equal, but their interactions suggest “an only slightly erotic relationship, about which he clearly fretted” (Sacks, 2003, p. 51). Although Emerson’s life had been scarred by the losses of many loved ones, Fuller’s death clearly devastated him on many levels. The intellectual impact is obvious in a journal entry around the time of her death. “I have lost in her my audience,” he wrote (Emerson, 1850/1982, p. 512). No longer would the two be able to exchange ideas with one another. It affected him socially as well.  “She bound in the belt of her sympathy and friendship all whom I know and love,” he wrote (p. 511). Perhaps he wondered what would happen now that the belt had been broken. But was there, in fact, something deeper? “Her heart, which few knew, was as great as her mind, which all knew” (pp. 511-512). Emerson clearly knew her heart more intimately than most.

Why did Emerson dispatch Thoreau to Fire Island and not go himself as he had initially planned? Ostensibly, he wanted to begin work, at once, on a memorial book in Fuller’s honor. We may, however, speculate that there were deeper reasons as well. Years earlier, Emerson had opened the coffin of his first wife, Ellen, who had died of tuberculosis fourteen months before. While he gave no explanation for his action, it seems that he needed to view her decomposing corpse to somehow convince himself of the soul’s immortality (Richardson, 1995). This event marked a turning point in his life. His focus shifted from death to life, from the material to the ideal. 

The death of Margaret Fuller marked another profound turn. Ellen’s death from illness, while tragic, was predictable. Fuller’s death was unexpected, and he would struggle mightily to recover from it. He became acutely aware of his own mortality. “I hurry now to my work admonished that I have few days left,” he wrote (Emerson, 1850/1982, p. 512). Fuller, who had pushed Emerson to focus on the spiritual as it animates the material, was now, herself, inanimate. Emerson might well have stayed in Concord because he somehow sensed that the trip would be fruitless. It might also be that he could not bear the thought of once again standing over the lifeless body of a woman he loved.

 

Postscript

Years later, a small monument to Margaret Fuller was erected on the Fire Island beach not far from the wreck site. It stood as a memorial to a remarkable woman for 10 years. Then, it too was claimed by the sea (Field, n.d.).

 

What do you think of the article? Let us know by leaving a comment below…

 

References

  • Allen, G.W. (1981). Waldo Emerson. NY: Viking.
  • Bullen, D. (2012). The dangers of passion: The transcendental friendship of Ralph Waldo Emerson and Margaret Fuller. Amherst, MA: Levellers Press (Kindle Fire Version). Retrieved from http://www.amazon.com
  • Emerson, R. W. (1841/1914). Journal entry. In B. Perry (Ed.). The heart of Emerson’s journals. Boston: Houghton Mifflin.
  • Emerson, R.W. (1850/1982). Journal entry. In L. Rosenwald (Ed.). Ralph Waldo Emerson: Selected Journals 1841-1877. NY: Library of America.
  • Emerson, R.W. (1850/1997). Letter to Marcus Spring. In J. Meyerson (Ed.). The selected letters of Ralph Waldo Emerson (p. 358).  NY: Columbia University Press.
  • Field, V. R. (n.d.). The strange story of the bark ELIZABETH. http://longislandgenealogy.com/BarkElizabeth.html
  • Richardson, R. D. (1995). Emerson: The mind on fire. Berkeley, CA: University of California Press.
  • Sacks, K.S. (2003) Understanding Emerson: “The American Scholar” and his struggle for self-reliance. Princeton, NJ: Princeton University Press.
  • Thoreau, H.D. (1850/1958a). Letter to Ralph Waldo Emerson. In W. Harding & C. Bode (Eds.). The correspondence of Henry David Thoreau (pp. 262-263). NY: NYU Press.
  • Thoreau, H.D. (1850/1958b). Letter to Charles Sumner. In W. Harding & C. Bode (Eds.). The correspondence of Henry David Thoreau (p. 263). NY: NYU Press.

 

Footnotes

1. There is some question as to whether they were officially married.

2. Thoreau would incorporate some of his memories from this mission, including that of the skeleton, into his book Cape Cod.


The banjo has a popular place in American culture. But few people know of the instrument’s complex roots. In this article, Reed Parker discusses how a banjo-like instrument was originally brought to the US by African slaves before being remodeled, and explores the complex cultural interactions between different groups and the banjo…

The Banjo Player, a painting by William Sidney Mount from 1856.

In 2005, the first Black Banjo Gathering took place at Appalachian State University in Boone, North Carolina. The purpose of the gathering was to celebrate the tradition of the banjo and bring awareness to the fact that, even though the banjo has become an emblem of white mountain culture, it is an African instrument at its core. The banjo as we know it today has a decidedly tragic origin story.

 

From Africa to America

Over the last few centuries, the banjo has secured a spot in the canon of traditional American music. In the time before the American Revolution, minstrels became a popular form of entertainment, and they often played an early relative of the banjo known as a banjar.

Other relatives of what would eventually become the banjo existed in many different areas of West Africa. There is the ngoni, which has anywhere from three to nine strings; the konou, which has two strings; and the juru keleni, which has just one string. One of the most elaborate of these variations is the kora, which has 21 strings and leather straps tied to the pole neck to hold the strings in place. These predecessors are still played today in their native lands.

The direct predecessor of the banjo, most commonly known as a banjar, arrived on the slave ships that came from West Africa in the 17th century. The instrument was made from half of a gourd with animal skin stretched over it and a pole that acted as a neck. The strings of the banjar were made from waxed horsehair or from the intestines of animals, most commonly cattle or goat. The intestinal strings were referred to as catgut or simply gut strings. The banjar was easily constructed because the materials required were easy to find. Eventually the instrument evolved to include tuning pegs and a flat fretboard in place of the pole neck. This allowed for notes to be manipulated with slides and bends.

 

The banjar in the US

In West Africa, “talking drums” were a common method of long distance communication. This tradition was carried across the ocean to the plantations. In 1739, drums and brass horns were outlawed in the colonies as a result of the Stono Insurrection in which slaves on a South Carolina plantation coordinated an uprising against their slave owners. They had used these instruments to communicate the plan. Prior to this, ensembles of brass horns, drums, and banjars were quite popular. Afterward, however, solo banjar acts became more popular.

A sad reality of this time in the banjar’s life is that its burgeoning popularity had a lot to do with traveling white minstrels who would perform in blackface. The banjar acted as a prop for the minstrels to use in their acts, acts that often satirized aspects of the African culture that had been brought to the US. It is also theorized that some white old-time musicians learned the oral tradition directly from black banjo players and merely wanted to continue the tradition instead of satirizing it.

By the early 1800s, the European fiddle music that settlers brought over with them and African banjar music were beginning to mutually influence each other. The style of banjar play that started to emerge at this time was known as thumping, which would evolve to become the clawhammer or “frailing” style, a style that combines rhythm and melody into one strumming pattern using a claw-shaped hand position.

 

The arrival of the banjo

Joel Sweeney, a Virginia man of Irish descent, has been credited with either inventing or popularizing the earliest form of the modern banjo which features five strings, an open back, and a wooden rim. His contributions are contested and some claim that it was actually the fourth string that was Sweeney’s invention and that the fifth came later.

Around the middle of the nineteenth century, minstrel groups traveled to Britain, spreading the banjo’s influence over the musical landscape. At the same time, the now booming steamboat travel business put African slaves, on lease from their owners, together with Irish and German immigrant laborers. These marginalized groups would entertain each other with jigs and reels. The mutual influencing continued into the Civil War era and the musical pairing of the banjo and fiddle became and would stay the most popular in the Appalachian region into the twentieth century.

Fortunately, other events outside of blackface minstrel shows were developed to showcase banjo skill. Banjo contests and tournaments were held at a multitude of venues including bars, race tracks, and hotels. Before the Civil War, the contestants were almost exclusively white, but blacks began making an appearance when the war was over.

Further changes to banjo construction were made around this time such as tension rods and wire strings. Tension rods, or truss rods, were implemented to provide the ability to adjust the neck if it warped from dryness or humidity. Wire strings were a cheaper alternative to gut strings, but they were largely dismissed at first for the buzzing they produced.

In the early 1900s, full string bands began to emerge. These groups added a fuller sound to the banjo/fiddle duos with the addition of guitar, upright bass, mandolin, and sometimes other instruments. That is not to say that banjo/fiddle duos were replaced entirely, though. Many loyal traditionalist Appalachian banjo players, such as Roscoe Holcomb and Fred Cockerham, continued to play solo or with fiddle accompaniment. Also around this time, playing styles emerged that were starkly different from the Appalachian clawhammer style. Where clawhammer uses the thumb and index finger, these styles use three-finger picking patterns that allow a higher volume of notes to be played in a short amount of time. These picking styles are collectively referred to as bluegrass style.

Through the mid-1900s, the banjo was used to evoke Appalachian imagery in contemporary folk and country music as well as pop culture. For example, the theme songs to the television show The Beverly Hillbillies and the film Deliverance became earworms that spread to a mainstream audience, even though their appeal was somewhat of a novelty.

 

The modern age

According to Robert Lloyd Webb, author of Ring the Banjar!, a major turning point for the banjo came in 2000 with the release of the film O Brother, Where Art Thou? The film’s Grammy-winning soundtrack was full of traditional music and was able to garner a more universal appeal. Among those captivated by the soundtrack were members of the band Mumford & Sons who, when they formed, began featuring the banjo in their Pop-Americana sound.

Additionally, celebrities such as Steve Martin and Ed Helms, whether inadvertently or not, have given mainstream credibility to the instrument. Martin, who has been playing the banjo for more than fifty years, has been touring extensively recently in support of his bluegrass albums. Helms recently put out a record with his group The Lonesome Trio and during his time on the sitcom The Office, his character Andy Bernard was shown playing the banjo.

The story of the banjo is a bitter one because of its slavery- and racism-laden roots. Lately, efforts have emerged to bring that history full circle. In addition to the Black Banjo Gathering, bands like The Carolina Chocolate Drops are reviving old minstrel-style music that consists of a banjo, a fiddle, and a set of bones (a percussion instrument traditionally made from animal bones, but now more often from wood).

The banjo has proven itself to be a versatile instrument, appearing in the genres of folk, bluegrass, country, and traditional, as well as jazz, swing, and blues. Deering Banjos, one of the most popular manufacturers in the United States, has reported a surge in sales since 2011. Hopefully the growth in the banjo’s popularity will lead to a further fleshing out of its history.

 

Did you find this article interesting? If so, tell the world. Tweet about it, like it, or share it by clicking on one of the buttons below!

