Wednesday, October 27, 2010

Genius or Madness?

They say there's a fine line between genius and madness. Some of our foremost thinkers and artists have also suffered from mental illness and/or lived really tumultuous, troubling lives. A recent New Yorker cartoon featured a teenage girl blaming her parents for sinking her writing career by giving her a stable, happy childhood.

It turns out, we've been thinking this way for a long time.

In the mid-19th century, French doctor Jacques-Joseph Moreau attributed genius and madness to an overexcitation of certain parts of the brain. Moreau was a follower of phrenology, a system developed by Franz Josef Gall (1758-1828) that assigned various human attributes to specific areas of the brain. These attributes were mapped on the brain and could be measured to determine certain things about a person. An overly emotional person probably had a larger emotion section of the brain, for example.
A phrenology map of the brain

Moreau took this idea a step further and applied it to nervous disorders. He believed that nervous energy could become more concentrated and active in certain people, causing an overexcitation in one part of the brain that could result in either insanity or genius. A build-up of energy in the thinking part of the brain could lead to raving madness, or it could lead to a great work of literature or a whole new philosophical system. Moreau believed that an exalted state of mind could allow genius to spring forth! But it didn't work for everyone.

In his book Morbid Psychology, Moreau wrote that "the virtue and the vices [of overexcitation] can come from the same foyer, the virtue being the genius, the vice, idiocy." That is, the genius was in constant danger of crossing the line because, according to Moreau, creative energy exhibits all of the reveries, trances, and exalted moments of inspiration that madness often has.

Moreau also did a lot of work on the effects of drugs on the central nervous system, writing a book called Hashish and Mental Alienation that has made him a hero of sorts for the marijuana crowd.

Moreau's work, along with that of several other doctors and scientists, greatly influenced late 19th-century interpretations of neuroses and their causes. It may also have helped popularize the notion of madness as an antecedent to the creative process.

Monday, October 25, 2010

Living and Loving Lefse

To those without Viking blood coursing through their veins, the fight that ensues over the last piece of lefse at my grandmother's house likely makes no sense.  How could an admittedly bland potato flatbread tear a family apart? But lefse is no ordinary food; it's an edible legend.  And it's a minor miracle that the pointy stick used to turn lefse on the grill has not resulted in injury… yet. 

Although recipes vary, mashed or riced potatoes, flour, sugar, and salt, along with some combination of milk, cream, shortening, or butter, are the simple ingredients that turn to magic when rolled flat and grilled. Butter and sugar are traditional lefse fillings, but in my family, anything from mashed potatoes and salad greens to green bean casserole gets rolled inside like a Norwegian tortilla. So the recent acquisition of a small cookbook called 99 Ways with Lefse was nothing revolutionary for my family.

The pale, golden-brown flecked rounds are a staple of Norwegian grandmothers everywhere, especially in the Midwest, where lefse can be found in restaurants and in the refrigerated case at the grocery store. But the best lefse is homemade and I was fortunate to make lefse many times under the expert guidance of my grandmother.

More than 850,000 Norwegians came to the United States between 1820 and 1875, most settling in the Upper Midwest. Many left because Norway's mountainous, rocky terrain made farming nearly impossible, leaving little opportunity for poorer families in a highly stratified society. So the Norwegians came to America, bringing with them the poverty foods, lefse and lutefisk, that had sustained them through arctic winters and poor harvests.

In Norway, lefse virtually disappeared from the culinary landscape, but not in America, where Norwegian women would get together to make enough lefse to last the year. In many communities, a woman’s worth was measured by the thinness and lightness of her lefse.
The arsenal of the lefse maker

Lefse appears on the table of nearly every Norwegian-American family during the holidays. Or, in mine, anytime it was available, holiday or special occasion be damned. Everything seems more special when lefse is in the mix.

Like her Norwegian ancestors, my grandmother usually makes lefse once, maybe twice, a year, freezing small packets to last throughout the year. Truth be told, lefse making is chaos incarnate, which perhaps explains why a year's supply is made in one fell swoop. The gummy dough sticks like library paste to the grooved rolling pin and countertops, while a thin layer of flour covers every horizontal surface, in addition to your face, hair, clothes, and the inside of your eyelids.

The dough is no match for my grandmother, though, whose slight frame masks a fierce rolling skill. The dough quickly becomes thin enough to “read the newspaper through,” her constant refrain as she rolls and watches my feeble attempts to match her dexterity. Good lefse requires careful discernment of the right amount of flour, the proper temperature of the griddle or pan, and the perfect temperature of the dough, neither too warm nor too cold. What’s right depends on the cook, however: a discussion between relatives over whether to place the dough outside overnight before baking quickly grew heated. Garrison Keillor would have had a field day with this material. Skill and know-how are the badge of the good lefse maker, and they can’t be learned from a cookbook.

The grilling is one area that I’ve managed to master with aplomb, lifting the dough in one swift swoop of my sword-shaped stick, laying down the edge, and rolling it out quickly so it lies flat on the round lefse griddle. Thirty seconds or so later, the lefse needs to be flipped. Timing is everything in achieving the perfect balance of knobby brown flecks and bubbles on the pale rounds. A whole batch can take all day.

The mess, frustration, and hard work are worth it. The foil-wrapped packets that emerge from my grandmother's freezer—and often arrive in the mail if we miss a holiday—taste all the better for the struggle that went into their creation. Lefse ties us to the past while carrying our family’s cultural marker forward into the future.

These days, good commercial lefse is available in many places, making it a rare family that continues to make their own. But learning from a master like my grandmother, standing shoulder-to-shoulder in the kitchen with floured arms and hands, hot griddles, and melted butter, is the only way to keep the tradition alive.  

Saturday, October 23, 2010

All Leaders Are Men... or so I hear

One of the best parts of my job at the radio station is receiving books in the mail. It's like Christmas every day when I go in and find a big stack of books from publishers waiting for me below my mailbox. Even the bad ones--and many of them are really bad--are still fun to open because you just never know what will be inside.

The other day I got a book called Profiles in Leadership: Historians on the Elusive Quality of Greatness edited by Walter Isaacson. Being of a historical mind, I thought "great!" And then I opened it and looked at the table of contents. Ugh. Seriously?

George Washington...
Charles Finney...
Ulysses S. Grant...
Herbert Hoover and FDR...
Wendell Willkie...
Robert Kennedy...

And finally the one that I was looking for: Pauli Murray. The ONE token woman who counts as great (not to diminish Murray in any way—she was a champion for civil and human rights and she deserves our attention). I couldn't believe it. I mean, I could believe it, because it isn't uncommon to find books heralding our nation's great men with nary a mention of the other half of the population. But it makes me sad that these books continue to be published, especially from Isaacson, whose previous work I've enjoyed.

It's easy to find male leaders. The position of "leader" has, throughout much of our history, only been open to men. But that's taking a very narrow view of leadership. If leadership is only open to those holding high political office or leading big companies, then women and people of color can be hard to find in history. But why is that our definition of leadership?

Where are Elizabeth Cady Stanton and Susan B. Anthony? Stanton helped organize the world's first women's rights convention. The first convention in the world proclaiming that half of the human race had rights. Anthony worked tirelessly until the day she died, speaking and writing and traveling for women's rights, particularly suffrage. She spoke on stages with armed guards to protect her. Can you imagine? Is that not greatness and leadership?

How about Sojourner Truth? Or Wilma Pearl Mankiller? Dorothea Dix? Frances Perkins? Fannie Lou Hamer? Ida B. Wells? The list goes on and on (and if you want to see more, look at Equal Visibility Everywhere's list of 100 Great American Women).

I'm not saying the men included in the book are not worthy of attention and consideration for their leadership skills and greatness. I just don't think another book profiling mostly men--and nearly all white men at that--is the real story of leadership and greatness in this country or any country.

Wednesday, October 20, 2010

Hiking the Ice Age Trail



The death throes of the last Ice Age are clearly visible along a path not far from a busy stretch of highway near Madison, Wisconsin. From the parking lot of the Ice Age National Scenic Trail, you can see stands of oaks covering rock-strewn moraines and clusters of crater-like kettle ponds. The trail preserves and celebrates the state's geologic past as it courses like a u-shaped river through Wisconsin.

Around ten thousand years ago, the mile-high wall of ice known as the late Wisconsin Glacier, responsible for shaping much of the physical landscape of Canada, the Upper Midwest, New England and parts of Montana and Washington, began to melt. In its wake, the retreating ice deposited a line of sediment along its southern edge, a serpentine strip of gravelly hills called a terminal moraine that defined the glacier’s final reach. And fittingly--considering the glacier's name--one of the best places to see the effects of the continent's Wisconsin glaciation is in Wisconsin, along the Ice Age Trail.

Ice Age Trail segment in the Lodi Marsh
Extending like a ribbon from St. Croix Falls in Polk County to Potawatomi State Park in Door County, this scenic belt provides a walking tour of geologic beauty both close to home and in some of Wisconsin's remotest places. Carved from land both privately and publicly owned in 30 counties, the trail is currently about half finished—roughly 600 of the proposed 1,200 miles—and is one of only two national trails in the U.S. contained within one state.

Preservation of this geologic fingerprint was the idea of Milwaukee attorney and avid outdoorsman Raymond Zillmer, who believed the trail would tell the story of Wisconsin's past while serving as a wide-scale conservation effort in a state he believed destined for more and more development. And since 1958, thousands of Wisconsin residents have volunteered countless hours to protect, preserve and share this past through the creation of a continuous park along the glacier’s edge.

Last fall, we started section hiking the trail, beginning with the part closest to Madison and working our way north and south. Our progress is slow. It's hard to get very far when you have to park at one end and hike back to the car, and when you only have a day or two every two weeks to go out. But we're getting there. 
The Devil's Staircase in Janesville

Last weekend we completed about half of the Janesville section—roughly six miles, but 12 for us since we had to walk back the way we came. The trail is a fantastic mix of city, park, suburbia, prairie, and woods. We've hiked—if I can even use that word—around the Farm and Fleet in Verona, past several libraries, and through the downtown streets of Lodi, Cross Plains, and Janesville. We walked through a dark tunnel under a roadway, hoping to find a safer way to cross, only to find ourselves at what appeared to be the underground entrance to a mental hospital. And a few weeks ago, a wildly off-course golf ball nearly knocked me out as we hiked the strip of woods between a neighborhood and a golf course. But we've also hiked through gorgeous restored prairies and along limestone bluffs. 

The Ice Age Trail is truly a wonderful adventure. And one I think I'll be on for the rest of my life at the rate we are going. 

Sunday, October 17, 2010

Defining an "academic diaspora"

Do you have a graduate degree (or two) and work in a cultural institution? Then you may be part of the academic diaspora.

My husband, an academic, and I often talk about academia vs. the rest of the world. But in the course of our conversations, it became apparent that there was a big category of working people who seemed to straddle the line between these two admittedly simplistic divisions, especially in a place like Madison, Wisconsin: the periacademic. And I happen to be one of them.

The periacademic, or member of the academic diaspora (take your pick; we haven't decided which term is best), has a master's degree or even a Ph.D. and works in a library, historical society, art museum, or public broadcasting (ding ding ding! That's me). They have the expertise and training of an academic in many cases but have, by choice or circumstance, ended up outside the ivory tower in some kind of bridge institution. Many of the stereotypes of academia exist in the diaspora, too, particularly the insularity, the ego, the competitiveness over seemingly small stakes, and the know-it-alls.

The primary difference between the two, though, seems to be related to audience and the use of knowledge. The role of an academic is to create and disseminate knowledge, while the periacademic curates knowledge. Most of the knowledge created by an academic is for other academics and maybe their students. Some periacademics also create knowledge, but their primary role seems to be taking the information created by the academic and synthesizing, arranging, and packaging it into a form that is accessible and useful to the general public. So in public radio, where I work, for example, we create programs by assembling the academics or other spokespeople who can verbally communicate some segment of information. Historical societies and museums create exhibits and displays by arranging knowledge. And so on.

Despite a symbiotic and seemingly important relationship, the academic and periacademic don't always get along, unfortunately. During my training as a historian, the public historian, the person who works in a museum or some other public historical institution, was derided as lesser and not serious. But the periacademic is the direct conduit to the public that the academic often wants to reach but doesn't necessarily know how to. To use the hip science term, periacademia is translational, bringing information and research out of the lab and into the community for, hopefully, the benefit of all.

The academic diaspora seems to be particularly large in college towns. There are a lot of people who want to engage with big ideas but do so in a way that's more outwardly focused. As academic jobs become harder to come by, especially in the humanities, it seems that the diaspora will only get larger. I'm not sure where all these people will end up, particularly since periacademic jobs tend to be financially perilous as well. But both groups play essential roles in knowledge creation, interpretation, and dissemination.


Wednesday, October 13, 2010

What foods do you hate?

My mom is picky. Really picky. So guess what I was? A really picky child.

What your parents eat has a really profound effect on what you will eat... at least as a child. The list of things I refused to eat as a kid was far greater than the list I would eat. A sampling from my no-way-in-hell eating list: salad dressing, Mexican food (more than just a food—a whole culture of food!), fish, french toast, eggs, bananas (my mom actually liked them—I just hated them), and mustard. The last one is particularly amusing in hindsight since I got married at a Mustard Museum. Clearly, I've grown.

It wasn't until I was a teenager, eating dinner at a friend's house, that I began to try more foods. And I think the only reason I did was that I had also grown up to be a pleaser. I had no idea how to say "no" to anything or anyone, so how in the world would I say no to someone serving me dinner?

And so, I soon discovered that salad dressing is actually pretty good, that Mexican food is delicious, and that a whole new world of spice and flavor awaited in Thai, Chinese (I'd only had Chinese from Safeway), and Indian food. I've actually introduced my parents to new foods, though my mom has continued to resist most of them.

Despite the opening of my food world, though, there are still foods I just don't like. Watermelon is one. I know that makes me un-American but I just don't like it. I've tried. I even went to a watermelon tasting at a farm that grows more than 10 varieties and none tasted of anything more than dirty water to me. I should say that I will eat watermelon... I just won't enjoy it.

The object of my primary animus, however, is fennel. That anise flavor is a no-go in any form, from black jelly beans to black licorice and raki. While I pride myself on my vegetable love, I just can't muster any for fennel no matter what I do.

I'm sorry, fennel. I hate you.
I'm happy to leave you my share of the fennel.

Monday, October 11, 2010

Songcatchers

Did you know that the government once paid people to record our nation's musical heritage? In the years before World War II, fieldworkers, evocatively known as "songcatchers," traveled around the country recording, collecting, and transcribing folk music from everyone from lumberjacks to American Indians and recent immigrants. Many of the recordings ended up in the Library of Congress's folk music collection, along with photographs and other ephemera from our nation's singers and musicians.


A great picture of some Wisconsin women playing the Swiss bells.  WHI-25191

The urge to collect this music came from many sources. One was technology. As electricity spread and more people bought radios, many folklorists and other songcatchers worried that people would sing along with the radio rather than their traditional music, spelling the end of the rich and vital music of our nation's ethnic heritage.

Another was employment. During the Depression, several New Deal programs, including the Federal Music Project, the California Folk Music Project, the Wisconsin Folk Music Project, and the Resettlement Administration, gave unemployed men and women jobs collecting music. One goal of the Federal Music Project was to record and define the American musical scene in all its variety.

Interestingly, many of the people given the task of collecting folk music were women. Frances Densmore, for instance, devoted her life to the study of American Indian music, visiting Indian communities across the Upper Midwest to study and transcribe their music. Another woman, Helen Heffron Roberts, traveled to Jamaica, Hawaii, California, and the American Southwest collecting music and other ethnographic materials.

At the time, music, like many other professional fields, was largely closed to women. Many did not believe women possessed the bodily strength or presence of mind to play music in professional orchestras, conduct, or compose complete pieces. Yet these women became the first to go out in the field and live among their informants, studying and recording music. They truly were pioneers in American ethnomusicology and in pushing women forward into new careers.

This music truly is great stuff. It's nice to think that we once thought it was so important to invest in our culture.

Here's a verse from a song collected in Wisconsin called "Fond du Lac Jail":
"In the morning you receive a dry loaf of bread
That's hard as stone and heavy as lead
It's thrown from the ceiling down into your cell,
Like coming from Heaven popped down into Hell."


My story on Wisconsin songcatcher Helene Stratman-Thomas will appear in the winter issue of the Wisconsin Magazine of History.

Sunday, October 10, 2010

The Mismatch of Writing and Speaking

I have a new book out, A Short History of Wisconsin, so I've been on the speaking circuit once again. Public speaking has never come easily to me, though it has certainly gotten easier with time and experience. Being out on the road, driving to libraries and bookstores across the state, always reminds me of how disparate the skills of writing and speaking really are—and how rare the person is who can do both and do them well.

I think many of us writers imagine that we'll write something and, hopefully, get it published, all from the safety of our garret. Speak about it? Oh no, that's why I'm a WRITER. I express myself much better on paper than in the open air.

But then you actually get something published, particularly a book, and you learn that hiding away just isn't an option. You must speak, and you must learn to do it pretty well, to help promote your work and, hopefully, to keep the work coming. I just don't think that being a good speaker and being a good writer involve the same skill set. So it can be cruel indeed to discover that you have to cultivate both of these skills if you want to be a writer. Your readers want to hear you... and see you.

While I'm sure I would vigorously deny it as I'm shaking and sweating on the sidelines before each of my talks, public speaking has been good for me. Talking to readers helps make you a better writer because you get a better sense of the people you are trying to reach through your words. It's also made me more confident and able to think—and speak—on my feet, a skill that's difficult to learn any way other than by throwing yourself into the experience you are so desperately trying to avoid.

I'll never be an amazing speaker. I'm a writer and researcher first. But I am getting better.

Thursday, October 7, 2010

The Hysterical Woman

Reading 19th century literature, you might start to think that every woman was a swooning, mad hysteric. Women seemed forever prone to fainting or madness—few were ever fully in their right minds. Most acted somewhat bizarrely and with high theatricality, not unlike the clothes that fashion dictated women should wear.


Hysteria was a purely female problem. Men suffered from their own version of nervous disease, known as neurasthenia. The supposed cause? Women's small size and their governance by their reproductive systems. That's right: women acted crazy because their wombs overrode the power of their brains. The word even comes from the Greek hystera, for womb.


So little was known about physical illnesses, such as epilepsy and neurological diseases, that the "nervous" illnesses were generally lumped in with them. The idea of neuroses as distinctly and purely in the mind was a minority view. More popular were a whole series of medical models for nervous illness and the psychological and physical options for treating them. Nerves were commonly ascribed a "force" that gave vitality to organs. Hysteria and neurasthenia were said to be caused by a weakness in this force, though physiological evidence for this "force" and its loss was never found. Anecdotes provided more powerful proof... at least for a time.


Hysteria helped to reinforce existing gender and class attitudes. Women couldn't hold positions of power or be trusted to vote if they were not rational beings! Hysteria worked very well at keeping women in the home and out of the public space, just what many men wanted. 


Hysteria could also work in a woman's favor, though. Being classed as "sick" or "infirm" was one way to get out of performing your expected duties as a woman. In the book The Peabody Sisters by Megan Marshall, an amazing biography of three women in the 19th century, one of the sisters, Sophia, is often ill and seems to use her illness to forge an art career for herself. 


Female hysteria had mostly run its course by the 20th century. Doctors stopped recognizing it as a legitimate medical diagnosis and a catch-all disease for any number of symptoms.

Tuesday, October 5, 2010

When Toast Tastes Better

Why does food taste better when someone else makes it for you? Case in point (at least for me): toast.

At home, I rarely make toast—at least not to eat plain. I'll toast bread for a sandwich, but rarely do I just eat it with butter or jam. But take me out to brunch or to a bed and breakfast in the UK and toast is all I want. This makes even less sense when you consider that the toast served is often made from the cheapest bread, which I would never buy at home, and usually arrives cold and rather limp on the table. But I will eat every piece with relish, as though it's the most delicious thing you could possibly eat for breakfast. Or even dinner, given the opportunity.

Toast has been around for centuries. No one liked eating stale bread, so toasting became common in Rome as a way to preserve bread. The word comes from the Latin tostum, meaning scorched or burned. Most early toast was held over the fire like a marshmallow or laid on hot stones. The toaster didn't make its appearance until 1893, when Crompton and Company introduced the first electric toaster in Great Britain. It didn't reach America until 1909.

This early toaster toasted only one side of the bread at a time. It took another decade for a two-sided toaster with a pop-up feature to appear. We still had to cut our own bread, though. Sliced bread didn't appear until 1930, when Wonder introduced this modern marvel, which made toast even more popular. Today, nearly 90% of Americans have a toaster in their home.

So why does toast taste better? Maybe it's the familiarity--the comfort of something simple--that makes toast so powerful when I'm away from home. Or maybe I'm just hoping to see Jesus or the Virgin Mary appear in my bread.

Monday, October 4, 2010

Brussels Sprouts Appreciation Club

I have a perhaps unnatural love of brussels sprouts. A whole stalk can easily become my dinner when roasted in olive oil and lightly salted (my husband loves them too, though, so I try to share). And I seriously think, many times while eating them, "These are way better than any candy." Seriously. I do.

Many people hate brussels sprouts. And I used to be one of them.

When I was a kid, my parents would buy bags of frozen sprouts and steam them, topping the whole lot with butter. My mom, despite being incredibly picky about food, happened to love brussels sprouts. I would eat them, but always quickly and sometimes covered in parmesan cheese (the sawdust from the green can, that is). I could never imagine that there would be a day when I would voluntarily eat them, much less look forward to their appearance at the market.

But then, about 7 years ago, for reasons still mysterious, I woke up CRAVING them. Having moved out of my parents' house nearly five years earlier, I'm not sure I had eaten even one in that time. Nor had I ever thought about them. So I'm not sure why I needed those tiny cabbages so badly that day. Maybe it's my Polish blood craving some cabbage-y goodness. I went to the store and purchased the same bag of frozen sprouts my mom had bought so many times. And I ate them all in one sitting. I loved them. I had to have more.

Forsaking frozen for fresh, I now eagerly await the appearance of the mutant stalk at the farmers market. One vendor had them labeled "Wisconsin Palm Trees" a few weeks back and they did look kind of tropical, with their nubby stalk and oversized leaves flopping over the top. I popped them off the stalk and roasted them for 35 minutes. I ate so many I nearly gave myself a stomach ache. I might be the first person to overdose on brussels sprouts.

To join my brussels sprouts appreciation club, cook this from 101Cookbooks and see if you can resist.

Sunday, October 3, 2010

The Flow of Electricity

Did you know that we used to think electricity was a fluid? That's why we use words like "current" and "flow" to describe what is clearly not a liquid. But in the 19th century, electricity was a force that many believed had the power to heal and recharge humans.

In the early years of scientific medicine, many doctors came to believe that nervous diseases had physical causes and could only be cured by physical means. Some compared humans to batteries, saying that we sometimes suffered from a low charge and needed a jolt of energy to recharge. This idea seemed to offer a legitimate explanation for the worn-out businessman, the languishing youth, and the excitable, swooning woman of the 19th century. This "disease" came to be called "neurasthenia," a name popularized by physician George M. Beard, who combined all of the so-called symptoms of the disease under one named cause.

The solution? Electricity (it also probably didn't hurt that Beard happened to be friends with Thomas Edison, the electric wizard of Menlo Park).  Many doctors came to believe that we could recharge our "neural batteries" with electricity. And so various devices purporting to soothe weary brains began to appear with increasing frequency. My favorite is Dr. Scott's Electric Hair-Brush, which claimed to both make hair more glossy and calm worn-out brains.

The era of electric medicine came to an end around the turn of the 20th century. Because neither neurasthenia nor electricity actually exhibited physical changes—only proclamations of change (suggestion is powerful)—skepticism about electric medicine began to emerge in the medical community. New theories about mental illness and the mind also arrived as scientific research advanced. Electricity remained a magical force, though, its enigmatic nature and invisible power provoking the imagination to ever more fantastical heights.