In colonial America, your minister often provided succor to those afflicted in faith and fever. Ministers were usually the most educated people in a community, so it made sense that they would be entrusted with medical care as well. Not to mention that so little was known about disease and its causes that fusing religion and medicine made as much sense as any other theory (and probably more so to the deeply religious American colonists). As historian Richard D. Brown put it, "even the most rational and learned individual...saw no clear boundary between physical and spiritual phenomena." The joining of these two roles, however, was not without its complications.
Cotton Mather, one of the most prominent Puritan ministers in New England, introduced smallpox inoculation in 1721, a move that ignited a fierce debate in the Puritan community. Puritans believed that every affliction was proof of God's special interest in their affairs. Some suggested that smallpox (a serious and deadly threat in colonial America) was perhaps God's punishment for sin and that to interfere with it through inoculation or anything else would only anger God and provoke more punishment. Mather did not waver, though. He wrote that "whether a Christian may not employ this Medicine (let the matter of it be what it will) and humbly give Thanks to God’s good Providence in discovering of it to a miserable World; and humbly look up to His Good Providence (as we do in the use of any other Medicine) It may seem strange, that any wise Christian cannot answer it. And how strangely do Men that call themselves Physicians betray their Anatomy, and their Philosophy, as well as their Divinity in their invectives against this Practice?" Eventually, inoculation did gain widespread acceptance.
The spread of print and increased literacy helped bring elite medicine to the masses through home medical manuals. Many ministers as well as their followers had copies of some of the most popular manuals, including William Buchan's Domestic Medicine (1769) and John Wesley's Primitive Physick (1747).
Wesley, better known as the founder of Methodism, was extremely interested in the democratization of medicine. Believing that medicine and doctors were often only accessible to the wealthy, he wrote his own medical manual so that anyone could easily treat themselves. Wesley considered making medical knowledge and treatments available and comprehensible to the public to be part and parcel of his pastoral duties.
The minister-doctor predominated until at least 1750 and continued on into the early 19th century in some areas of the country. It was only with the rise of medical education, apprenticeships, and licensing laws that the two forms of healing separated. But ministers continued to play an integral role in healing, particularly in many of the alternative health movements (homeopathy, hydropathy, osteopathy, etc.) that flourished in the 19th century.
Wednesday, December 22, 2010
Poland Rising
How did the Irish get so popular? In the 19th century and even into the early 20th, the Irish were decidedly not a favorite immigrant group. They were often poor and, more damning, Catholic, a major strike against them in a very Protestant United States. The Irish were stereotyped as hot-headed drunkards, uncivilized and unskilled. Political cartoons were widely used to express negative opinions about Irish immigrants. And yet something changed...
Today, everyone wants to be Irish. There are stores selling Celtic this and that all over the country, Irish bars, and Irish music festivals. Maybe it was sheer numbers. Millions of Irish came to the United States so their influence became impossible to ignore. And they didn't remain poor and unskilled. Many became politically, economically, and religiously powerful in their new country. John F. Kennedy--Irish and Catholic--probably had something to do with it, too.
All of this leads me to the Poles (and myself, as I'm a quarter Polish). Why haven't the Polish experienced this renaissance of opinion? Poles were often poor and Catholic, too. They didn't come in quite the same numbers as the Irish, but, like the Irish, most settled in urban areas, especially in the Midwest. Chicago today bills itself as the largest Polish city outside of Poland. And yet Poles are often still the butt of jokes rather than a beloved culture. Everyone I know who has been to Poland raves about its beauty and culture, but rarely do you see a special issue of Conde Nast Traveler or Budget Travel telling you where to go now in Poland.
Southwest Poland, near Jelenia Gora
It all makes me want to start a Polish rehabilitation project. Someday, maybe we'll all start wearing purple on Polish Independence Day (November 11th, for those who haven't already marked their calendars).
Sunday, December 12, 2010
History = War
What counts as history? And why is the answer for so many people one war or another?
As someone who writes about history, I often hear from people how much they love history (and to clarify, it's almost always an adult, usually male, aged 50+), especially the Civil War or World War II. Those are important and fascinating events in history, but I often wonder why that's the only history that people seem to know and care about--why it's the only history that seems to matter.
In part, I blame The History Channel. Or as I like to call it, the channel of never-ending war. It's battles and technologies of war (how guns work and who invented the cannon) or stories of great generals and war heroes. In a review of a new history book in the New York Times this weekend, the reviewer hit on what I think is one of the main problems with the way history is depicted on TV:
This may be realism, but it is History Channel realism, where the rawest facts of combat on the ground become the only facts that really count. Entertaining anecdotes abound, but there are more descriptions of mangled bodies than information or insights about strategy.
That's it exactly. The spectacle overshadows the actual context and motivations behind it. I get it, I really do. I know that battles are exciting and that draws people in. But that's not what history is ALL about. And I think that's why I grow weary of hearing about the Civil War or World War II, not because they aren't pivotal historical events, but because the battle scenes, often the only glimpses we get and often the only ones we seek, become the dominant (and only) story. Ideas are exciting, too. Ideas are what led to the bloodshed, yet that piece doesn't get the same attention even though it is the explanation for it all.
And much of history has nothing to do with war at all. Most of history, as Bill Bryson said in his new book At Home, consists of people going about their daily lives, cooking, sleeping, and bathing. I realize that on the surface, our domestic lives are nowhere near as exciting as a pitched battle between heroes and villains. But really, it's all a matter of presentation. War may come in an inherently interesting package, so it's easy to present, though I might argue that the package often sacrifices the ideas and critical thought behind it for the sake of explosions, confusing the real importance with the flashy front.
And maybe the reason you don't hear as many young people proclaiming their love of history, or why history books aren't marketed as Mother's Day gifts the way they are for Father's Day, is that to these groups the history of battles and war isn't as relatable as how people like you and me went about their daily lives. That's not the whole reason, but it's one element that I think plays a significant role.
I continue to wait for the day that someone tells me they love history and follows it up by saying, "especially the utopian movements in the 19th century."
Monday, December 6, 2010
Giving Fruitcake a Good Name
There are few gifts more vilified, more dreaded or ridiculed than fruitcake, which is too often mass produced with cheap ingredients. Who doesn't join the laughter when a coworker opens the gleaming, bejeweled brick of cake at the office gift exchange?
It's a shame, really, because fruitcake, at its best, is a delicious mix of dried fruits and nuts, bound together by sugar, flour, eggs, and spices. Most of us only know the cake at its worst: rock hard, laced with day-glo candied fruit and bitter citron. Liberally bathed in alcohol, a fruitcake can last more than ten years, a fact that only adds to its supernatural horror. No wonder people in Manitou Springs, Colorado, toss them every winter during the Great Fruitcake Toss.
The idea of making cakes with dried fruits and honey dates back to ancient times. Fruitcakes were a means of food preservation. Not only could fruits be conserved, but they could be served out of season, when fresh fruit was unavailable. Egyptians considered fruitcake an essential food for the afterlife (and some of the cakes could outlast you), while the conquering force of the Roman legions was fruitcake-powered.
The fruitcakes we know and... well, love... came from the Middle Ages, when sweet ingredients like honey and spices became more widely available. The arrival of cheap sugar in Europe from the colonies, beginning in the 16th century, resulted in a flourishing of sweet, fruitcake-like breads, including Italian panettone, black cake (common in Jamaica and Trinidad), dreikonigsbrot, king cake, babka, and my personal favorite, stollen.
Stollen... yum
So what makes something a fruitcake? The fruit-to-cake ratio is pivotal. Anything less than 50% fruit is not really a fruitcake. The fruitcakes from Swiss Colony in Monroe, Wisconsin, contain around 75% fruit and nuts.
And despite what you commonly see in grocery stores, candied fruits in colors that suggest some kind of nuclear disaster are not obligatory and should be avoided. Naturally sweet, dried fruits are the key to turning fruitcake hate into love.
The fruit and nut to cake ratio appears right, but those colors only reinforce fruitcake's poor reputation.
Alcohol allows for long-term storage and also helps to mellow the sweetness of the ingredients. Fruitcakes actually do taste better with age because the dried fruit contains tannins, like wine, that are released over time to create complex flavors and aromas.
Like all things, fruitcake can be great, amazing even, when done right. Don't let the imitations fool you.
Thursday, December 2, 2010
What's so heroic about "heroic medicine?"
Heroes don't usually make you bleed or vomit. You'll never see Superman fight evildoers by lancing someone or forcing large doses of calomel down Lex Luthor's throat.
And yet "heroic" was what medicine in the 18th and 19th centuries was called. At least the medicine practiced by so-called "regular" doctors through at least the mid-19th century (those who didn't practice heroic medicine were known as alternative or sectarian practitioners--or in unkinder moments, quacks).
Heroic medicine consisted of bleeding, purging, leeching, blistering, and sweating patients to release disease. Calomel, mercurous chloride, was one of the most commonly used mineral concoctions, a harsh treatment that would induce vomiting and purging. At the time, most diseases were seen as systemic imbalances caused by something being either over- or under-stimulated in the body. Until the 1840s, most doctors believed that diseases overstimulated the body so most treatments involved lowering the overexcited patient back to a normal, healthy state. Bleeding often became the first "therapeutic" line of attack, being a seemingly easy way to get whatever disease was poisoning the system and knocking it out of balance back into place.
Heroic medicine gave clear evidence that it worked. Or at least that the treatment was doing something--and that something was often healing in and of itself. People felt better just knowing that they were being treated, even if that treatment could sometimes kill them. It also gave the doctor the appearance of being in control of the situation. There's just something about doing that feels a whole lot better than waiting and watching for nature to run its course, as it often does in disease. The body is an amazing healing machine.
So why heroic? The word apparently comes from the large dosage size and effect of the therapies. They didn't just give you a bit of calomel. They gave you A LOT to produce a near instantaneous effect--which, in this instance, was mostly a lot of vomiting. According to the dictionary, "heroic" is "behavior that is bold and dramatic." These treatments certainly were bold and the results often dramatic if not always healing.
But they were also rather harsh, and public outcry against them helped lead to their demise in the 19th century.
And yet "heroic" was what medicine in the 18th and 19th centuries was called. At least the medicine practiced by so-called "regular" doctors through at least the mid-19th century (those who didn't practice heroic medicine were known as alternative or sectarian practitioners--or in unkinder moments, quacks).
Heroic medicine consisted of bleeding, purging, leeching, blistering, and sweating patients to release disease. Calomel, mercurous chloride, was one of the most commonly used mineral concoctions, a harsh treatment that would induce vomiting and purging. At the time, most diseases were seen as systemic imbalances caused by something being either over- or under-stimulated in the body. Until the 1840s, most doctors believed that diseases overstimulated the body so most treatments involved lowering the overexcited patient back to a normal, healthy state. Bleeding often became the first "therapeutic" line of attack, being a seemingly easy way to get whatever disease was poisoning the system and knocking it out of balance back into place.
Heroic medicine gave clear evidence that it worked. Or at least that the treatment was doing something--and that something was often healing in of itself. People felt better just knowing that they were being treated, even if that treatment could sometimes kill them. It also gave the doctor the appearance of being in control of the situation. There's just something about doing that feels a whole lot better than waiting and watching for nature to run its course, as it often does in disease. The body is an amazing healing machine.
So why heroic? The word apparently comes from the large dosage size and effect of the therapies. They didn't just give you a bit of calomel. They gave you A LOT to produce a near instantaneous effect--which, in this instance, was mostly a lot of vomiting. According to the dictionary, "heroic" is "behavior that is bold and dramatic." These treatments certainly were bold and the results often dramatic if not always healing.
But they were also rather harsh and public outcry against them helped lead to their demise in the 19th century.
Friday, November 26, 2010
Berry Picking
I made a sour cream cranberry pie for Thanksgiving. And it got me thinking about berries...
I started picking blueberries before I ate them. I was a strange kid who loved vegetables far more than fruit. Blueberries were something my mom liked a lot, though, and I happened to enjoy picking them so it worked out well. And I was never tempted to even put one in my mouth. Unlike strawberries, where for every one that went into my box, two went into my mouth, the blueberries went straight from bush to box. Raspberries were a different thing altogether--not only did I hate eating them, I hated picking them, too. In part because my mom would make me get all the low ones since I was closer to the ground. As an adult, the logic of that set-up makes perfect sense but at the time, it just seemed like she was being lazy.
The blueberries back in Washington were so large and purpley-blue that you barely had to pick at all to get them to fall into your hand. Even the slightest graze of your fingers and a handful would fall right off. I started going picking with my then-best friend's family in high school. We went to a farm in the shadow of Mt. Si in North Bend with the most magical name--Bybee-Nims. Who wouldn't want to pick blueberries at Bybee-Nims? Even if you were like me and hated blueberries, you wanted to go, if for the view of the mountain alone. And so we would go and I would pick 20 to 40 pounds in about an hour and a half with little to no effort.
Now that I live in a place where the berries are few and far between (completely unlike the berry nirvana of the Northwest), I have, of course, learned to love--YEARN for, even--blueberries. Figures, doesn't it?
We do have cranberries, though. Lots and lots of cranberries, but I'm not sure you can ever pick your own cranberries from the bog. I'd sure like to try.
Wisconsin's Cranberry Queen ca. 1947. Harvesting cranberries looks like fun, doesn't it? Apparently a bikini is required.
Monday, November 22, 2010
Creating Frankenstein
Mary Shelley's 1818 novel Frankenstein is more than a literary work of the early 19th century--it also reflects the scientific discoveries of her time and its enthusiasm for electricity.
"I succeeded in discovering the cause of the generation and life; nay, more, I became myself capable of bestowing animation on life matter." --Victor Frankenstein
In Frankenstein, electricity is seen as the secret of life, able to give life to the lifeless. Shelley merely reflected a belief that was becoming increasingly popular in American and European culture. In her novel, Victor Frankenstein alludes to lightning and to Galvanism as the basis for reanimating a lifeless cadaver. Luigi Galvani had popularized the idea of electricity as an innate force of life, what he called animal electricity. Galvani's ideas had largely been supplanted in the scientific community by the time of Shelley, but the idea of an internal electrical fire and particularly reanimation remained strong in the public imagination.
Shelley herself mentioned discussing many of the electrical experiments going on in Europe and the United States with her husband, Percy Shelley, and Lord Byron. They, like the rest of the public, were especially intrigued with the idea of reanimating the dead. These discussions led Shelley to explore the moral and personal responsibilities of scientific advances in her own writing. She recognizes science as a powerful force but one capable of great harm if left uncontrolled. Victor Frankenstein uses science to create his monster yet it ultimately leads to his demise.
Interestingly, Shelley does not provide much description of the laboratory or the way in which Frankenstein is created. Only two sentences in the book mention lightning and Galvanism, yet spectacular electrical displays with shooting lightning bolts became the standard means for depicting the act of creation in movies.
"I succeeded in discovering the cause of the generation and life; nay, more, I became myself capable of bestowing animation on life matter." --Victor Frankenstein
In Frankenstein, electricity is seen as the secret of life, able to give life to the lifeless. Shelley merely reflected a belief that was becoming increasingly popular in American and European culture. In her novel, Victor Frankenstein alludes to lightning and to Galvanism as the basis for reanimating a lifeless cadaver. Luigi Galvani had popularized the idea of electricity as an innate force of life, what he called animal electricity. Galvani's ideas had largely been supplanted in the scientific community by the time of Shelley, but the idea of an internal electrical fire and particularly reanimation remained strong in the public imagination.
Shelley herself mentioned discussing many of the electrical experiments going on in Europe and the United States with her husband, Percy Shelley, and Lord Byron. They, like the rest of the public, were especially intrigued with the idea of reanimating the dead. These discussions led Shelley to explore the moral and personal responsibilities of scientific advances in her own writing. She recognizes science as a powerful force but one capable of great harm if left uncontrolled. Victor Frankenstein uses science to create his monster yet it ultimately leads to his demise.
Interestingly, Shelley does not provide much description of the laboratory or the way in which Frankenstein is created. Only two sentences in the book mention lightning and Galvanism, though, spectacular electrical displays with shooting lightning bolts became the standard means for depicting the act of creation in movies.
Sunday, November 21, 2010
Black Cherry Cheery
We all have weaknesses and mine is for black cherry soda (I also love black cherry ice cream so there must be something about that flavor). Even after I've given up most other sodas, I'll still happily crack a bottle of black cherry* whenever the opportunity arises. I have standards, though; it has to be in a glass bottle, preferably from a regional manufacturer. Even chemically-flavored water has a taste of place, doesn't it? Even if that taste of place is just the tastes of the people in that place.
Years ago, my dad and I created a ratings system for these sodas. It wasn't confined to black cherry--we'd also buy bottles of Nehi Peach, Moxie, KaPow!, and others. But if black cherry was available and we hadn't tried that particular variety before, that was the first choice. The empty bottles lined the shelves in the laundry room like trophies. The tops of the cabinets were lined with old beer cans, mementos of my dad's previous tasting adventures. A few still had beer in them for reasons that were always unclear to me. Beer doesn't get better with age, Dad. Especially if that beer is from the late 1960s as many of them were.
The ratings were taped to the inside of the pantry door. It was a check, check minus, or plus system. Some got a double plus. None were terrible, but some were definitely superior. One of the best was actually not in a bottle at all but on tap at The Pyramid Ale House in Seattle. Fresh, rich, and delicious, it almost makes me wish I still lived there.
One year, my dad bought the Jones Soda Thanksgiving pack. Filled with sodas tasting of green beans, mashed potatoes, and turkey gravy, it was disgusting and fascinating all at once. Turkey soda tasted just as horrible as you imagine ("meat," even artificial meat flavor, doesn't quite refresh like other flavors), though the corn flavor was oddly delicious.
The ratings system is still on, even though I've moved thousands of miles away. I've also branched out to other things, keeping lists of beers, whiskeys, bourbons, and tequilas with my husband. We don't keep the empty bottles and cans like my dad, but the legacy of lists and tastings lives on.
* Black cherry is an actual fruit, even if it is most commonly seen in artificial products. It's native to North America and is also known as the wild cherry or wild rum cherry. Straight off the tree, the fruit is often inedible and bitter, which is why it most commonly appears in a highly sugared form.
Wednesday, November 17, 2010
Recharging my batteries
Why do we say we are "recharging our batteries" when we take a break or do something for ourselves? Or that we "short-circuited" if we can't remember something?
It turns out that these phrases are directly tied to our enthusiasm for electricity in the 19th century. As I've written elsewhere, many scientists, doctors, and members of the general public came to believe that we all had a set amount of electricity in our bodies that made everything run--called our "vital fluid." According to some leading doctors and scientists, modern urban living tended to deplete this energy source--our "internal battery," as the analogy went--and, therefore, we needed a "recharge" from a jolt of electricity. Our bodies were essentially electrical machines that could short-circuit and burn out just like any other machine.
Public enthusiasm for electricity by the late 19th century was so great that many came to believe that electricity could fix anything! And waiting in the wings to take advantage of that deep desire were any number of doctors and entrepreneurs promising electrotherapy treatments for every dysfunction or ill feeling you could imagine. Few in the general public completely understood electricity, so they relied on manufacturers of electrical devices to educate them. In a world changing so fast with new inventions and technology, it was hard for anyone to know what was possible and probable. Advertisements for electrical devices made boisterous cure-all promises, and people richly rewarded those manufacturers for giving them what they wanted.
Books, entertainment, and even food of the late 19th century showed that the image of the "electric body" wasn't just a metaphor--people willingly imbibed electricity directly in an attempt to receive all of its benefits. One scientist even compared its effects to that of the sun on the leaves of a plant.
In Europe, researchers studied electricity's effect on school children. They outfitted a classroom with a high-frequency electrical current that ran for six months. At the end of the experiment, the researchers found that the children had grown an average of 20mm more than those not exposed to the continuous current. Their teachers also reported that they had grown smarter during the experiment due to the "quickening" of their faculties by electrical stimulation.
Popular culture teemed with electrical fads and follies, providing both tangible and intangible signs that linked electricity, and especially the electrified human body, with ideas of progress.
While electricity remains part of the treatment regimen for some diseases today, the idea of a vital fluid made of electricity that needs recharging has since passed out of popular and scientific medical theories. But its mark on our language remains.
Monday, November 15, 2010
Not Bitter on Bitters
Are they too cheap to get a label maker that makes the right-sized labels?
Bitters are a mysterious concoction of herbs and spices used in many cocktails, a mystery far more intriguing than the Colonel's blend of 11 herbs and spices. Sugar and gentian are the only two acknowledged ingredients in Angostura Bitters. The secret recipe was developed in 1824 by Dr. J. G. B. Siegert, a Surgeon General in Simon Bolivar's Venezuelan army.
Maduro, my favorite bar in Madison, has a line of different bottles of bitters, including one with blood oranges and another with peaches. We've sometimes ordered one of the specialty cocktails that called for bitters just so we could try one of these uniquely flavored bitters--all of which had mysterious origins and ingredients, too. The bottles also mentioned using bitters on food. Food?
Of course, I had to try it.
Despite subscribing to every food magazine and newsletter available, I had never seen a recipe calling for bitters. Angostura's website has a whole section of recipes, including a pumpkin soup. Being inundated with squash as we are in the fall and winter, soup seemed like the perfect choice. So I boiled and pureed and then added a few dashes of bitters to the finished soup. To be honest, I'm not sure I could exactly taste the bitters. There was a slight herbal flavor to the soup that may have come from the bitters, but it may also have come from my imagination. I'm highly suggestible. I like the idea of using bitters, though. Besides, it makes everything much more mysterious when you can say it calls for a secret ingredient.
I still don't know about that label, though.
Friday, November 12, 2010
Anglophilia
There's just something about the United Kingdom that I can't get enough of. I'm not sure when it began. In sixth grade, I did my country report on England. I'm not sure any other country was even a contender. I spent months making poster after poster, until my English homage covered the entire front of the classroom as well as the fronts of the desks, where I had carefully taped the 3-D map and a word game that asked you to guess the American equivalent of the Britishism (e.g., lorry, boot, loo).
In college, I claimed to be a political science major for a month while I filled out the application to study British politics in London for a month. As soon as we got back, I dropped the double major but not my fascination with that part of the world. My first trip with my now-husband was to England and Wales. An accident? I think not. And where did we go for our honeymoon? That's right. Ireland (yes, not part of the UK now but still in that magic realm of those isles), Northern Ireland, and Scotland.
Where does this anglophilia come from?
It turns out that anglophilia has a long history in the United States--it's not, in fact, merely confined to bedrooms in Redmond, Washington. Anglophilia is about admiring England, its people and culture. The Federalists (Alexander Hamilton and John Adams, kind of), one of the first pseudo-political parties in our nascent nation, were generally anglophiles, while their rivals, the Democratic Republicans (James Madison and Thomas Jefferson), admired the French. Even as we threw off English authority, England itself retained some symbolic value and remained a compelling object of attention throughout the 19th century and into the 20th.
Affinity with another nation allows people to feel some release from the burdens of their own nationality. Personally, whenever I feel fed up with American politics, I only need to pull up the BBC or the Guardian on my computer to happily immerse myself in David Cameron's latest idea. This England in my mind and that of other anglophiles isn't necessarily a true image of the country, however. Our anxieties and wishes are often imposed on our image of England. Anglophilia owes much of its energy to a backward belief in the aura of the British. The Englishness that Americans love may not exist at all.
The United Kingdom played an integral part in the United States' history as well as the way we defined ourselves after breaking free. Benedict Anderson once wrote that "it is useful to remind ourselves that nations inspire love." Even nations we didn't want to be a part of anymore, yet can't seem to completely pull ourselves away from.
Tuesday, November 9, 2010
Just Humor Me
The idea of staying in good humor or humoring those around us has an ancient lineage. It actually traces back to Greece and the humoral theory of medicine.
"To begin at the beginning: the elements from which the world is made are air, fire, water, and earth; the seasons from which the year is composed are spring, summer, winter and autumn; the humours from which animals and humans are composed are yellow bile, blood, phlegm, and black bile." --Galen
For centuries, the idea that an excess of phlegm or of yellow bile could cause illness was an accepted medical diagnosis. The four humours--yellow and black bile, phlegm, and blood--circulated throughout the body, and an imbalance in one or more was believed to be the cause of illness. The theory began in the 5th century BCE with work attributed to Greek physician Hippocrates (though it was his son-in-law and disciple Polybus who wrote the first treatise that clearly explained the whole idea of the humours) and continued with Roman doctor Galen, who adopted the theory in the 2nd century CE. For the next two thousand years (give or take some disruptions and such like the sacking of Rome), humoral theory explained most things about a person's character, medical history, taste, appearance, and behavior.
Why? What was so compelling about this theory?
Well, for one, it seemed to unify passions and cognition, physiology and psychology, and the individual and his/her environment. Various parts of the body and the environment caused disease and stirred various emotions and passions. The humours also made sense to the many cultures that based their earliest stories of creation on four elements: air, fire, water, and earth. Each of the four humours was tied to one of these elements, so it seemed a natural extension of what people already knew about the world. Our human need to understand what we are made of, where we came from, and how we work often causes us to resort to structures and traditions that match our intuitions. The theory offered a potent image of substances, particles, or currents traveling through the body from the limbs to the organs to the brain and heart and back. It seemed to explain how the sight of an attractive person could trigger desire, induce a rush of blood in the veins, and increase the heartbeat.
The fall of Rome wasn't the end of the humoral theory, though. It was more of a shift--to the east, to Islam, where the knowledge of the Greeks and Romans was saved and expanded upon, and to the abbeys, where monks preserved ancient texts. (Sidenote: In researching my book on apples, I discovered how much ancient knowledge about apple orcharding and fruit growing in general was preserved in the Islamic and Christian monastic traditions. So it wasn't too surprising to find that medical knowledge, too, lived on in these same places.)
And so the theory lived on and on, taking on various forms to fit the times but always coming back to the idea of balance and imbalance in the body as the source of illness. The theory only really died with the discovery of the existence of germs in the late 19th century. Not everyone bought the germ theory right away, however, so books based on humoral theory continued into the early 20th century.
Humours now remain mostly familiar in our expressions about keeping balanced and experiencing something with ill humor. In French, the word for mood is humeur. Many Asian medical traditions are also humoral, based on the idea of energy flows, mind-body connections, and balances between hot and cold, moist and dry. So even if the theory is no longer used to describe disease (at least in the West), the idea of humours still serves as a useful and suggestive image in our culture.
"To begin at the beginning: the elements from which the world is made are air, fire, water, and earth; the seasons from which the year is composed are spring, summer, winter and autumn; the humours from which animals and humans are composed are yellow bile, blood, phlegm, and black bile." --Galen
For centuries, the idea that an excess of phlegm or of yellow bile could cause illness was an accepted medical diagnosis. The four humours--yellow and black bile, phlegm, and blood--circulated throughout the body and an in balance in one or more were believed to be the cause of illness. The theory began in the 5th century BCE with work attributed to Greek physician Hippocrates (though it was his son-in-law and disciple Polybus who wrote the first treatise that clearly explained the whole idea of the humours) and continued with Roman doctor Galen, who adopted the theory in the 2nd century CE. For the next two thousand years (give or take some disruptions and such like the sacking of Rome), humoral theory explained most things about a person's character, medical history, taste, appearance, and behavior.
Why? What was so compelling about this theory?
Well, for one, it seemed to unify passions and cognition, physiology and psychology, and the individual and his/her environment. Various parts of the body and the environment caused disease and stirred various emotions and passions. The humours also made sense to many cultures of people who based their earliest stories of creation on four elements: air, fire, water, and earth. Each of the four humours was tied to one of these elements so it seemed a natural extension of what people already knew about the world. Our human need to understand what we are made of, where we came from, and how we work often causes us to resort to structures and traditions that match our intuitions. The theory offered a potent image of substances, particles, or currents traveling through the body from the limbs to the organs to the brain and heart and back. It seemed to explain how the sight of an attractive person could trigger desire, induce a rush of blood in the veins, and increase the heartbeat.
The fall of Rome wasn't the end of the humoral theory, though. It was more of a shift--to the east to Islam where the knowledge of the Greeks and Romans was saved and expanded upon, and to the abbeys were monks preserved ancient texts. (sidenote: In researching my book on apples, I discovered how much of ancient knowledge was preserved in the Islamic and Christian monastic traditions on apple orcharding and fruit growing in general. So it wasn't too surprising to find that medical knowledge, too, lived on in these same places.)
And so the theory lived on and on, taking on various forms to fit the times but always coming back to the idea of balance and imbalance in the body as the source of illness. The theory only really died with the discovery of the existence of germs in the late 19th century. Not everyone bough the germ theory right away, however, so books based on humoral theory continued into the early 20th century.
Humours now remain mostly familiar in our expressions about keeping balanced and experiencing something with ill-humor. In French, the word for mood is humeur. Many Asian medical traditions are also humoral, based on the idea of energy flows, mind-body connections, and balances between hot and cold, moist and dry. So even if the theory is no longer used to describe disease (at least in the West), the idea of humours still serve as useful and suggestive images in our culture.
Sunday, November 7, 2010
A Chaos of People
Collective nouns have got to be one of the strangest and most interesting aspects of English. All of the words mean "group" and yet are specific to the particular thing you are grouping. Examples include a crash of rhinos, an exultation of larks, a knot of toads, and a pride of lions.
After staying at a bed and breakfast with a poster listing many of these animal nouns on the dining room wall, my husband and I started to come up with ones for different groups of people. And because I like food so much, many of mine were food related.
Here's some of what we came up with:
A draught of Germans.
A kebab of Turks.
A sauna of Finns.
A longboat of Norwegians.
A politeness of Canadians. (I love this one)
A pierogi of Poles.
A fiesta of Mexicans.
A crumpet of Brits.
A bento box of Japanese.
A kimchi of Koreans.
Monday, November 1, 2010
Who's a Quack?
What makes someone a quack? Is he or she actually doing something nefarious or just doing something you don't agree with?
When I first started reading medical history, I (foolishly) thought the line between a quack and a legitimate doctor was easily drawn. A quack is selling ridiculous medicines claiming to cure everything and bilking gullible people out of money, right? The real story isn't nearly so simple.
Quack or man with a different idea?
Before the 20th century, medical knowledge was very limited. Those proclaiming themselves legitimate doctors rarely knew anything more than those hawking patent medicines and traveling from town to town. Many doctors engaged in what was thought of as "quackish" behavior, including advertising and putting their name on proprietary remedies. Some quacks even trained at celebrated medical schools or had medical licenses. There really was little scientific evidence separating the two, so calling someone a "quack" became an easy way of targeting those you didn't agree with for one reason or another. So many people stood accused of quackery that the term lost any real meaning, though not its sting of opprobrium.
Everyone felt okay excoriating quacks because all were sure they weren't one. Most of the time, those calling out quacks were those in the medical establishment who belonged to some organization or institution or who had trained in Europe. But sometimes, so-called quacks called out other quacks.
Many that the medical establishment labeled as quacks simply disagreed with the medical therapies that had been practiced for centuries, including bloodletting. And they had good reason to do so, as many of these traditional practices hurt and even killed people rather than helping them.
As doctors began to organize into professional organizations in the mid-19th century, one of the motivating factors was to protect people from quacks. These organizations created sharp divisions between "insiders" and "outsiders." But the ethical and moral grounds for this distinction weren't nearly so clear, despite claims to the contrary. The medical marketplace was competitive, and what these organizations did do was give some doctors a competitive advantage through their membership and illusory claims to standards. Many people found these organizations elitist and, obviously, exclusionary--but that was the point.
So maybe the better way to think of quacks, doctors, and medical history more generally is to think of the development of the profession as one with many paths to prosperity. Medical men of all kinds were competing for custom, recognition, and financial reward, each in his own way straining to seize the high moral ground in a vicious arena. Some opted for the individualism of the entrepreneur and others for the safety and security of the establishment. Given the scientific evidence available at the time, none was better than the others. Surely, there were a few people who did know that they were peddling nothing more than alcohol and herbs in a jar and wanted to make as much money as possible, but there may not have been as many as is commonly portrayed in the literature of the heroic doctor.
Wednesday, October 27, 2010
Genius or Madness?
They say there's a fine line between genius and madness. Some of our foremost thinkers and artists have also suffered from mental illness and/or lived really tumultuous, troubling lives. A recent New Yorker cartoon featured a teenage girl blaming her parents for sinking her writing career by giving her a stable, happy childhood.
It turns out, we've been thinking this way for a long time.
In the mid-19th century, French doctor Jacques Joseph Moreau attributed genius and madness to an overexcitation of certain parts of the brain. Moreau was a follower of phrenology, a system developed by Franz Josef Gall (1758-1828) that attributed various human traits to specific areas of the brain. These traits were mapped onto the brain and could be measured to determine certain things about a person. An overly emotional person probably had a larger emotion section of the brain, for example.
A phrenology map of the brain
Moreau took this idea a step farther and applied it to nervous disorders. He believed that nervous energy could become more concentrated and active in certain people, causing an overexcitation in one part of the brain that could result in either insanity or genius. A buildup of energy in the thinking part of the brain could lead to raving madness, or it could lead to a great work of literature or a whole new philosophical system. Moreau believed that an exalted state of mind could allow genius to spring forth! But it didn't work for everyone.
In his book Morbid Psychology, Moreau wrote that "the virtue and the vices [of overexcitation] can come from the same foyer, the virtue being the genius, the vice, idiocy." That is, the genius was in constant danger of crossing the line because, according to Moreau, creative energy exhibits all of the reveries, trances, and exalted moments of inspiration that madness often has.
Moreau also did a lot of work with the effects of drugs on the central nervous system, writing a book called Hashish and Mental Alienation that has made him a hero of sorts for the marijuana crowd.
Moreau's work, along with several other doctors and scientists, greatly influenced late 19th century interpretations of neuroses and its causes. It may also have helped popularize the notion of madness as an antecedent to the creative process.
Monday, October 25, 2010
Living and Loving Lefse
To those without Viking blood coursing through their veins, the fight that ensues over the last piece of lefse at my grandmother's house likely makes no sense. How could an admittedly bland potato flatbread tear a family apart? But lefse is no ordinary food; it's an edible legend. And it's a minor miracle that the pointy stick used to turn lefse on the grill has not resulted in injury… yet.
Although recipes vary, mashed or riced potatoes, flour, sugar, and salt, along with some combination of milk, cream, shortening, or butter, are the simple ingredients that turn to magic when rolled flat and grilled. Butter and sugar are traditional lefse fillings, but in my family, anything from mashed potatoes and salad greens to green bean casserole gets rolled inside like a Norwegian tortilla. The recent acquisition of a small cookbook called 99 Ways with Lefse was nothing too revolutionary for us.
The pale, golden-brown flecked rounds are a staple of Norwegian grandmothers everywhere, especially in the Midwest, where lefse can be found in restaurants and in the refrigerated case at the grocery store. But the best lefse is homemade and I was fortunate to make lefse many times under the expert guidance of my grandmother.
More than 850,000 Norwegians came to the United States between 1820 and 1875, most to the Upper Midwest. Many left because Norway's mountainous, rocky terrain was nearly impossible to farm, leaving little opportunity for poorer families in a highly stratified society. So the Norwegians came to America, bringing with them the poverty foods of lefse and lutefisk that had sustained them through arctic winters and poor harvests.
In Norway, lefse virtually disappeared from the culinary landscape, but not in America, where Norwegian women would get together to make enough lefse to last the year. In many communities, a woman’s worth was measured by the thinness and lightness of her lefse.
The arsenal of the lefse maker
Lefse appears on the table of nearly every Norwegian-American family during the holidays. Or, in mine, anytime it's available, holiday or special occasion be damned. Everything seems more special when lefse is in the mix.
Like her Norwegian ancestors, my grandmother usually makes lefse once, maybe twice, a year, freezing small packets to last until the next batch. Truth be told, lefse making is chaos incarnate, which perhaps explains why a year's supply is made in one fell swoop. The gummy dough sticks like library paste to the grooved rolling pin and countertops, while a thin layer of flour covers every horizontal surface in addition to your face, hair, clothes, and the inside of your eyelids.
The dough is no match for my grandmother, though, whose slight frame masks a fierce rolling skill. The dough quickly becomes thin enough to “read the newspaper through,” her constant refrain as she rolls and watches my feeble attempts to match her dexterity. Good lefse requires careful discernment of the right amount of flour, the proper temperature of the griddle or pan, and the perfect temperature of the dough, neither too warm nor too cold. What’s right depends on the cook, however, as a discussion between relatives on whether to place the dough outside overnight before baking quickly grew heated; Garrison Keillor would have had a field day with this material. Skill and know-how are the badge of the good lefse maker, skills that can’t be learned from a cookbook.
The grilling is one area that I’ve managed to master with aplomb, lifting the dough in one swift swoop of my sword-shaped stick, laying down the edge, and rolling it out quickly so it lies flat on the round lefse griddle. Thirty seconds or so later, the lefse needs to be flipped. Timing is everything in achieving the perfect balance of knobby brown flecks and bubbles on the pale rounds. A whole batch can take all day.
The mess, frustration, and hard work are worth it. The foil-wrapped packets that emerge from my grandmother's freezer--and often arrive in the mail if we miss a holiday--taste all the better for the struggle that went into their creation. Lefse ties us to the past while carrying our family's cultural marker forward into the future.
These days, good commercial lefse is available in many places, making it a rare family that continues to make their own. But learning from a master like my grandmother, standing shoulder-to-shoulder in the kitchen with floured arms and hands, hot griddles, and melted butter, is the only way to keep the tradition alive.
Saturday, October 23, 2010
All Leaders Are Men... or so I hear
One of the best parts of my job at the radio station is receiving books in the mail. It's like Christmas every day when I go in and find a big stack of books from publishers waiting for me below my mailbox. Even the bad ones--and many of them are really bad--are still fun to open because you just never know what will be inside.
The other day I got a book called Profiles in Leadership: Historians on the Elusive Quality of Greatness edited by Walter Isaacson. Being of a historical mind, I thought "great!" And then I opened it and looked at the table of contents. Ugh. Seriously?
George Washington...
Charles Finney...
Ulysses S. Grant...
Herbert Hoover and FDR...
Wendell Willkie...
Robert Kennedy...
And finally the one that I was looking for: Pauli Murray. The ONE token woman who counts as great (not to diminish Murray in any way: she was a champion for civil and human rights and she deserves our attention). I couldn't believe it. I mean, I could believe it, because it isn't uncommon to find books heralding our nation's great men with nary a mention of the other half of the population. But it makes me sad that these books continue to be published, especially from Isaacson, whose previous work I've enjoyed.
It's easy to find male leaders. The position of "leader" has, throughout much of our history, only been open to men. But that's taking a very narrow view of leadership. If leadership is limited to those holding high political office or leading big companies, then women and people of color can be hard to find in history. But why is that our definition of leadership?
Where are Elizabeth Cady Stanton and Susan B. Anthony? Stanton helped organize the world's first women's rights convention--the first convention anywhere to proclaim that half of the human race had rights. Anthony worked tirelessly until the day she died, speaking, writing, and traveling for women's rights, particularly suffrage. She spoke on stages with armed guards to protect her. Can you imagine? Is that not greatness and leadership?
How about Sojourner Truth? Or Wilma Pearl Mankiller? Dorothea Dix? Frances Perkins? Fannie Lou Hamer? Ida B. Wells? The list goes on and on (and if you want to see more, look at Equal Visibility Everywhere's list of 100 Great American Women).
I'm not saying the men included in the book are not worthy of attention and consideration for their leadership skills and greatness. I just don't think another book profiling mostly men--and nearly all white men at that--is the real story of leadership and greatness in this country or any country.
Wednesday, October 20, 2010
Hiking the Ice Age Trail
The death throes of the last Ice Age are clearly visible along a path not far from a busy stretch of highway near Madison, Wisconsin. Standing in a parking lot along the Ice Age National Scenic Trail, you can see stands of oaks covering rock-strewn moraines and fields of crater-like kettle ponds. The trail preserves and celebrates the state's geologic past as it courses like a U-shaped river through Wisconsin.
Around ten thousand years ago, the mile-high wall of ice known as the late Wisconsin Glacier, responsible for shaping much of the physical landscape of Canada, the Upper Midwest, New England and parts of Montana and Washington, began to melt. In its wake, the retreating ice deposited a line of sediment along its southern edge, a serpentine strip of gravelly hills called a terminal moraine that defined the glacier’s final reach. And fittingly--considering the glacier's name--one of the best places to see the effects of the continent's Wisconsin glaciation is in Wisconsin, along the Ice Age Trail.
Ice Age Trail segment in the Lodi Marsh
Extending like a ribbon from St. Croix Falls in Polk County to Potawatomi State Park in Door County, this scenic belt provides a walking tour of geologic beauty both close to home and in some of Wisconsin's remotest places. Carved from land both privately and publicly owned in 30 counties, the trail is currently more than half-finished, about 600 of the proposed 1,200 miles, and is one of only two national trails in the U.S. contained within one state.
Preservation of this geologic fingerprint was the idea of Milwaukee attorney and avid outdoorsman Raymond Zillmer, who believed the trail would tell the story of Wisconsin's past while serving as a wide-scale conservation effort in a state he believed destined for more and more development. And since 1958, thousands of Wisconsin residents have volunteered countless hours to protect, preserve and share this past through the creation of a continuous park along the glacier’s edge.
Last fall, we started section hiking the trail, beginning with the part closest to Madison and working our way north and south. Our progress is slow. It's hard to get very far when you have to park at one end and hike back to the car, and when you only have a day or two every two weeks to go out. But we're getting there.
The Devil's Staircase in Janesville
Last weekend we completed about half of the Janesville section, about six miles, but 12 for us since we had to walk back the way we came. The trail is a fantastic mix of city, park, suburbia, prairie, and woods. We've hiked--if I can even use that word--around the Farm and Fleet in Verona, past several libraries, and through the downtown streets of Lodi, Cross Plains, and Janesville. We walked through a dark tunnel under a roadway, hoping to find a safer way to cross, only to find ourselves at what appeared to be the underground entrance to a mental hospital. And a few weeks ago, a golf ball from a wildly off-course golfer nearly knocked me out as we hiked the strip of woods between a neighborhood and a golf course. But we've also hiked through gorgeous restored prairies and along limestone bluffs.
The Ice Age Trail is truly a wonderful adventure. And one I think I'll be on for the rest of my life at the rate we are going.
Sunday, October 17, 2010
Defining an "academic diaspora"
Do you have a graduate degree (or two) and work in a cultural institution? Then you may be part of the academic diaspora.
My husband, an academic, and I often talk about academia vs. the rest of the world. But in the course of our conversations, it soon became apparent that there was a big category of working people that seemed to straddle the lines between these two, admittedly simplistic, divisions, especially in a place like Madison, Wisconsin: it's the periacademic. And I happen to be one of them.
The periacademic or member of the academic diaspora (take your pick, we haven't decided which term is best) has a master's degree or even a Ph.D. and works in a library, historical society, art museum, or public broadcasting (ding ding ding! That's me). They have the expertise and training of an academic in many cases but have, by choice or circumstance, ended up outside the ivory tower in some kind of bridge institution. Many of the stereotypes of academia exist in the diaspora, too, particularly the insularity, the ego, the competitiveness over seemingly small stakes, and the know-it-alls.
The primary difference between the two, though, seems to be related to audience and the use of knowledge. The role of an academic is to create and disseminate knowledge, while the periacademic curates knowledge. Most of the knowledge created by an academic is for other academics and maybe their students. Some periacademics also create knowledge, but their primary role seems to be taking the information created by the academic and synthesizing, arranging, and packaging it into a form that is accessible and useful to the general public. So in public radio, where I work, for example, we create programs by assembling the academics or other spokespeople who can verbally communicate some segment of information. Historical societies and museums create exhibits and displays by arranging knowledge. And so on.
Despite a symbiotic and seemingly important relationship to each other, the academic and periacademic don't always get along, unfortunately. In my training as a historian, the public historian, the person who works in a museum or some other public historical institution, was derided as lesser and not serious. But the periacademic is the direct conduit to the public that the academic often wants to reach but doesn't necessarily know how. To use the hip science term, periacademia is translational, bringing information and research out of the lab and into the community for, hopefully, the benefit of all.
The academic diaspora seems to be particularly large in college towns. There are a lot of people who want to engage with big ideas but do so in a way that's more outwardly focused. As academic jobs become harder to come by, especially in the humanities, it seems that the diaspora will only get larger. I'm not sure where all these people will end up, particularly since periacademic jobs tend to be financially perilous as well. Both the academic and the periacademic play essential roles in knowledge creation, interpretation, and dissemination.
Wednesday, October 13, 2010
What foods do you hate?
My mom is picky. Really picky. So guess what I was? A really picky child.
What your parents eat has a really profound effect on what you will eat... at least as a child. The list of things I refused to eat as a kid was far longer than the list of things I would. A sampling of my no-way-in-hell eating list: salad dressing, Mexican food (more than just a food--a whole culture of food!), fish, French toast, eggs, bananas (my mom actually liked them--I just hated them), and mustard. The last one is particularly amusing in hindsight since I got married at a Mustard Museum. Clearly, I've grown.
It wasn't until I was a teenager and eating dinner at a friend's house that I began to try more foods. And I think the only reason I did try them was because I had also grown up to be a pleaser. I had no idea how to say "no" to anything or anyone so how in the world would I say no to someone serving me dinner?
And so I soon discovered that salad dressing is actually pretty good, that Mexican food is delicious, and that a whole new world of spice and flavor awaited me in Thai, Chinese (I'd only had Chinese from Safeway), and Indian food. I've actually introduced my parents to new foods, though my mom has continued to resist most of them.
Despite the opening of my food world, though, there are still foods I just don't like. Watermelon is one. I know that makes me un-American but I just don't like it. I've tried. I even went to a watermelon tasting at a farm that grows more than 10 varieties and none tasted of anything more than dirty water to me. I should say that I will eat watermelon... I just won't enjoy it.
The object of my primary animus, however, is fennel. That anise flavor is a no-go in any form, from black jelly beans to black licorice and raki. While I pride myself on my vegetable love, I just can't muster any affection for fennel no matter what I do.
I'm happy to leave you my share of the fennel.
I'm sorry, fennel. I hate you.
Monday, October 11, 2010
Songcatchers
Did you know that the government once paid people to record our nation's musical heritage? In the years before World War II, fieldworkers, evocatively known as "songcatchers," traveled around the country recording, collecting, and transcribing folk music from everyone from lumberjacks to American Indians and recent immigrants. Many of the recordings ended up in the Library of Congress in its folk music collection, along with photographs and other ephemera from our nation's singers and musicians.
The urge to collect this music came from many sources. One was technology. As electricity spread and more people bought radios, many folklorists and other songcatchers worried that people would sing along with the radio rather than their traditional music, spelling the end to the rich and vital music of our nation's ethnic heritage.
Another was employment. During the Depression, several New Deal programs, including the Federal Music Project, the California Folk Music Project, the Wisconsin Folk Music Project, and the Resettlement Administration, gave unemployed men and women jobs collecting music. One goal of the Federal Music Project was to record and define the American musical scene in all its variety.
Interestingly, many of the people given the task of collecting folk music were women. Frances Densmore, for instance, devoted her life to the study of American Indian music, visiting Indian communities across the Upper Midwest to study and transcribe their music. Another woman, Helen Heffron Roberts, traveled to Jamaica, Hawaii, California, and the American Southwest collecting music and other ethnographic materials.
At the time, music, like many other professional fields, was largely closed to women. Many did not believe women possessed the bodily strength or presence of mind to play music in professional orchestras, conduct, or compose complete pieces. Yet these women became the first to go out in the field and live among their informants, studying and recording music. They truly were pioneers in American ethnomusicology and in pushing women forward into new careers.
This music truly is great stuff. It's nice to think that we once thought it was so important to invest in our culture.
Here's a verse from a song collected in Wisconsin called "Fond du Lac Jail:"
"In the morning you receive a dry loaf of bread
That's hard as stone and heavy as lead
It's thrown from the ceiling down into your cell,
Like coming from Heaven popped down into Hell."
My story on Wisconsin songcatcher Helene Stratman-Thomas will appear in the winter issue of the Wisconsin Magazine of History.
A great picture of some Wisconsin women playing the Swiss bells. WHI-25191
Sunday, October 10, 2010
The Mismatch of Writing and Speaking
I have a new book out, A Short History of Wisconsin, so I've been on the speaking circuit once again. Public speaking has never come easily to me, though it has certainly gotten easier with time and experience. Being out on the road, driving to libraries and bookstores across the state, always reminds me of how disparate the skills of writing and speaking really are--and how rare the person is who can do both and do them well.
As a writer, I think many of us think that we'll write something and hopefully get it published, all from the safety of our garret. Speak about it? Oh no, that's why I'm a WRITER. I express myself much better on paper than in the open air.
But then you actually get something published, particularly a book, and you learn that hiding away just isn't really an option. You must speak, and you must learn to do it pretty well, to help promote your work and to, hopefully, keep the work coming. I just don't think that being a good speaker and being a good writer involve the same skill set. So it can be cruel indeed to discover that you have to cultivate both of these skills if you want to be a writer. Your readers want to hear you... and see you.
While I'm sure I would vigorously deny it as I'm shaking and sweating on the sidelines before each of my talks, public speaking has been good for me. Talking to readers helps make you a better writer because you get a better sense of the people you are trying to reach through your words. It's also made me more confident and able to think--and speak--on my feet, a skill that I think can be difficult to learn other than by throwing yourself into the experience you are so desperately trying to avoid.
I'll never be an amazing speaker. I'm a writer and researcher first. But I am getting better.
Thursday, October 7, 2010
The Hysterical Woman
Reading 19th-century literature, you might start to think that every woman was a swooning, mad hysteric. Women seemed forever prone to fainting or madness--few were ever fully in their right minds. Most acted somewhat bizarrely and with high theatricality, not unlike the clothes that fashion dictated women should wear.
Hysteria was a purely female problem. Men suffered from their own version of nervous disease, known as neurasthenia. The supposed problem? Women's smaller size and their presumed governance by their reproductive systems. That's right: women acted crazy because their wombs overrode the power of their brains. The word even comes from the Greek hystera, for womb.
So little was known about physical illnesses, such as epilepsy and neurological diseases, that the "nervous" illnesses were generally lumped in with them. The idea of neuroses as distinctly and purely in the mind was a minority view. More popular were a whole series of medical models for nervous illness and the psychological and physical options for treating them. Nerves were commonly ascribed a "force" that gave vitality to the organs. Hysteria and neurasthenia were said to be caused by a weakness in this force, though evidence for this "force" and its loss was never found. Anecdotes provided more powerful proof... at least for a time.
Hysteria helped to reinforce existing gender and class attitudes. Women couldn't hold positions of power or be trusted to vote if they were not rational beings! Hysteria worked very well at keeping women in the home and out of the public space, just what many men wanted.
Hysteria could also work in a woman's favor, though. Being classed as "sick" or "infirm" was one way to get out of performing your expected duties as a woman. In the book The Peabody Sisters by Megan Marshall, an amazing biography of three women in the 19th century, one of the sisters, Sophia, is often ill and seems to use her illness to forge an art career for herself.
Female hysteria had mostly run its course by the 20th century. Doctors stopped recognizing it as a legitimate medical diagnosis and catch-all disease for any number of symptoms.
Tuesday, October 5, 2010
When Toast Tastes Better
Why does food taste better when someone else makes it for you? Case in point (at least for me): toast.
At home, I rarely toast--at least to eat plain. I'll toast bread for a sandwich, but rarely do I just eat it with butter or jam. But take me out to brunch or to a bed and breakfast in the UK and toast is all I want. This makes even less sense when you consider that the toast often served is made from the cheapest bread that I would never buy at home and usually arrives cold and rather limp on the table. But I will eat every piece with relish, as though it's the most delicious thing you could possibly eat for breakfast. Or even dinner given the opportunity.
Toast has been around for centuries. No one liked eating stale bread, so toasting became common in Rome as a way to preserve bread. The word comes from the Latin tostum, meaning scorched or burned. Most early toast was held over the fire like a marshmallow or laid on hot stones. The toaster didn't make its appearance until 1893, when Crompton and Company introduced the first one in Great Britain. It didn't reach America until 1909.
This early toaster only toasted one side at a time. It took another decade for a two-sided toaster with a pop-up feature to appear. We still had to cut our own bread to make toast, though. Sliced bread didn't appear until 1930, when Wonder introduced this modern marvel, which made toast even more popular. Today, nearly 90% of Americans have a toaster in their home.
So why does toast taste better? Maybe it's the familiarity--the comfort of something simple--that makes toast so powerful when I'm away from home. Or maybe I'm just hoping to see Jesus or the Virgin Mary appear in my bread.
Monday, October 4, 2010
Brussels Sprouts Appreciation Club
I have a, perhaps, unnatural love of brussels sprouts. A whole stalk can easily become my dinner when roasted in olive oil and lightly salted (my husband loves them too, though, so I try to share). And I seriously think many times while eating them, "these are way better than any candy." Seriously. I do.
Many people hate brussels sprouts. And I used to be one of them.
When I was a kid, my parents would buy bags of frozen sprouts and steam them, topping the whole lot with butter. My mom, despite being incredibly picky about food, happened to love brussels sprouts. I would eat them, but always quickly and sometimes covered in parmesan cheese (the sawdust from the green can, that is). I could never imagine that there would be a day when I would voluntarily eat them, much less look forward to their appearance at the market.
But then, about seven years ago, for reasons still mysterious, I woke up CRAVING them. Having moved out of my parents' house nearly five years earlier, I'm not sure I had eaten even one in that time. Nor had I ever thought about them. So I'm not sure why I needed those tiny cabbages so badly that day. Maybe it's my Polish blood craving some cabbage-y goodness. I went to the store and purchased the same bag of frozen sprouts my mom had bought so many times. And I ate them all in one sitting. I loved them. I had to have more.
Forsaking frozen for fresh, I now eagerly await the appearance of the mutant stalk at the farmers market. One vendor had them labeled "Wisconsin Palm Trees" a few weeks back and they did look kind of tropical, with their nubby stalk and oversized leaves flopping over the top. I popped them off the stalk and roasted them for 35 minutes. I ate so many I nearly gave myself a stomach ache. I might be the first person to overdose on brussels sprouts.
To join my brussels sprouts appreciation club, cook this from 101Cookbooks and see if you can resist.
Sunday, October 3, 2010
The Flow of Electricity
Did you know that we used to think electricity was a fluid? That's why we use words like "current" and "flow" to describe what is clearly not a liquid. But in the 19th century, electricity was a force that many believed had the power to heal and recharge humans.
In the early years of scientific medicine, many doctors came to believe that nervous diseases had physical causes and could only be cured by physical means. Some compared humans to batteries, saying that we sometimes suffered from a low charge that needed a jolt of energy to recharge. This idea seemed to offer a legitimate explanation for the worn-out businessman, the languishing youth, and the excitable, swooning woman of the 19th century. This "disease" came to be called "neurasthenia," a name popularized by physician George M. Beard, who gathered all of the so-called symptoms of the disease under one named cause.
The solution? Electricity (it also probably didn't hurt that Beard happened to be friends with Thomas Edison, the electric wizard of Menlo Park). Many doctors came to believe that we could recharge our "neural batteries" with electricity. And so various devices purporting to soothe weary brains began to appear with increasing frequency. My favorite is Dr. Scott's Electric Hair-Brush, which claimed to both make hair more glossy and calm worn-out brains.
The era of electric medicine began to come to an end by the turn of the 20th century. Because neither neurasthenia nor its electrical cures produced measurable physical changes--only proclamations of change (suggestion is powerful)--skepticism about their power began to emerge in the medical community. New theories about mental illness and the mind also took hold as scientific research advanced. Electricity remained a magical force, though, its enigmatic nature and invisible power provoking the imagination to ever more fantastical heights.