The Fat of the Land

Month: November 2014



If you grew up in the United States in the past 130 years, you’ve heard the story of a 1621 feast shared by English Pilgrims and Wampanoag Natives to commemorate peaceful relations between the two communities and to share in the abundance that is a New England autumn. They ate wild turkey and venison, corn and cranberries, pumpkin, fruits, and root crops freshly harvested from their colonial gardens and the surrounding wilderness. From our side of history, we have canonized the meal as symbolic of our nation’s origins. At the time, it was a welcome gorge preceding a long winter, an accidental union of two disparate cultures’ values.

Certainly the Wampanoag viewed such a feast through the eyes of thanksgiving. They held a sense of thanks through all stages of the year’s cycle; planting, childbirth, harvest, and the hunt were tasks performed with deep gratitude to the powers that made them possible. Likewise, the Pilgrims’ spiritual tradition placed thanks at the center of their religious and daily life.

Feast is the opposite of famine, a dichotomy enshrined by the Puritan Pilgrims who colonized Plymouth, landing on that challenging shoreline in search of a place to freely practice their unconventional version of Christianity. Staunch minimalists, the Puritans distilled their annual celebrations down to those that least distracted from work and from observing the power of their god. The Puritans had abandoned Christmas, Easter, and all other Catholic and Anglican holidays before leaving England, celebrating only two “holidays” outside of the weekly Sabbath: Thanksgiving Day and Fasting Day.

In line with the natural order of things, Fasting Day fell in spring, when the larder ran low and the seeds, so delicate and small, were placed in the coarse, unpredictable soil. The Puritans, who used fasting as a form of prayer, marked the day as a request for a good growing year, for relative fortune and health among their community. Thanksgiving Day was Fasting Day’s opposite, when the garden’s bounty (and another year of health) was gazed upon in the form of an indulgent spread, as if to say, “Look, we made it!”

The harvest has long held a singular place of reverence in temperate-climate cultures. Because plenty is neither a constant nor a given, harvest time is the most exciting, the most culinarily diverse, the wealthiest time of the year. Larders full, game fattened and plentiful, we can relax and tell stories, gather family and neighbors to share memories, thoughts, and thanks. It’s a time of year imbued with a sense of the spiritual; in the interval with the fewest external threats, the world (and the mind) opens itself to magic.

We call that meal shared by Pilgrims and Wampanoag the First Thanksgiving, though they, and two centuries of their descendants, didn’t see it that way. Our contemporary Thanksgiving came about through a mingling of the Puritans’ tradition of a religious Thanksgiving Day and regional harvest celebrations, and didn’t crystallize as a national holiday until the 19th century. Its late November observance (a month or two past the New England harvest season) is explained by the influence of a third holiday (or lack thereof): Christmas. Having given up the Catholic (and in their eyes, inappropriately indulgent) celebration of Christmas, the Puritans slid Thanksgiving back to winter’s parlor, shining that bright, festive meal onto the dark, cold days ahead.

Now entirely secular, Thanksgiving has become the quintessential American feast. Though most of the Americans who prepare a traditional Thanksgiving meal today have not harvested or hunted its ingredients, they adhere, by tradition, to a seasonal spread of mostly indigenous foods: squash and pumpkin, turkey, beans, cranberries, corn, and potatoes are New World foods that, in this particular configuration, still symbolize a sort of prosperity wrought from effort.

Thanksgiving today is less a sigh of relief for an agricultural season well played than the kickoff to an exhilarating (and often exhausting) holiday season, yet we maintain an emphasis on gathering with family and friends. And though outside forces (creeping Black Friday sales, football, and takeout, to name a few) increasingly compete with another era’s version of Thanksgiving, it remains in our time a cook’s holiday, venerating the traditional and innovative alike (just look at any cooking magazine’s November issue to see that the line between the two is walked carefully by our collective imagination).

Likely, someone in your life has prepared you a delicious feast on this day. Perhaps you have returned the favor. If you have been so lucky as to do the work or watch it, to smell the sacred rite of plenty, to let your mind slip into a starch-induced journey through memory or musing, then you know Thanksgiving’s least cynical secret: that food, as it is given and received, is a pure expression of love.


The Last Berry


Unlike the berries of summer, we tend to leave cranberries to the specialists. We happily plant a strawberry patch, wrestle canes of raspberries or blackberries into rows, tuck a blueberry bush here or there, but few gardeners ever think to grow cranberries. It isn’t so surprising—our national experience of this peculiar fruit is limited at best. Cranberries are an essential part of Thanksgiving dinner; dried, they add a healthful zing to muffins or salads; juiced, they mix a sweetly tart cocktail. Why grow a fruit that is best after processing, that keeps well in the freezer, that you would never, on a stroll through the garden or market, eat out of hand?

For most of its history, the cranberry hasn’t been grown by anyone. Native to the eastern United States and Canada, cranberries were an important foraged food and medicine for the indigenous communities of the region. After drying, cranberries were pounded into pemmican, a shelf-stable mixture that also included dried meat and fat (the original protein bar) and was widely adopted by voyageurs and other explorers needing packable foods.

Members of the heath family, a tribe of low-growing, berry-producing shrubs that rings the globe’s northern latitudes, cranberries have many cousins, notably the blueberry, huckleberry, and lingonberry. Though modern-day cranberry cultivars were developed from the North American species, Europeans tasting them for the first time may have recognized their citrusy tang. A number of similar species are found in northern European countries, growing alongside ponds and bogs, spreading through mossy understory, in a range that spans Iceland to Russia.

Especially favored in Nordic cuisine, lingonberries (like a slightly sweeter cranberry) are cooked into sauces, syrups, relishes, and pastries. Russian texts describe preserving them in jars topped with water to make a sour constitutional beverage enjoyed for its bracing flavor and widely acknowledged medicinal qualities. The English maintain a tradition of eating cranberry relish with the Christmas meal, a habit that likely extends from medieval times, when wild-collected cranberry relatives were mashed with spices or sweeteners to accompany wild game.

Despite their prominence throughout the northern hemisphere, cranberries and their ilk did not become a domesticated crop until the 1800s, when New England farmers were plowing through one agricultural fad after another. Adventurous gardeners added a patch; farmers and horticulturists gathered choice cuttings from wild plants and embarked on the long process of taming them for commercial production. As cranberries became a popular winter fruit (high in vitamin C, they help prevent both scurvy and blandness in starch-heavy winter diets), growers set their sights on creating a new and lucrative industry.

Ironically, it was a British scientist (in his home garden) who made the first breakthrough in commercial cranberry cultivation. In the wild, cranberries often grow near bodies of water, an observation that led many to attempt growing them in standing water (a technique that resulted in stunted plants). Some planted them into prime garden soil, the sort any lettuce or carrot would thrive in, but the cranberries recoiled. Curious about the little-understood needs of this trendy New World fruit, Sir Joseph Banks, explorer and horticulturist, brought a few back to his English estate after a visit to America.

Setting them in boxes drilled with holes, he layered rocks, then soil and detritus from a nearby bog, before tucking in the cranberry starts. The planter was set in a pond with its base five inches below the surface, giving the lower roots access to consistent moisture while elevating the majority of the root zone above the water line. The technique worked, and Banks soon had a thriving cranberry patch. Though contemporary growers have ditched the planter boxes for a system of dykes, Banks’ discovery revealed the cranberry’s preference for a combination of sandy, well-draining but humus-rich soil and a high water table.

Cranberry plants send out lateral branches that can put down roots, self-propagating one plant into many in just one growing season. A cranberry patch is a tangled thing, impossible to preserve in tidy rows suited to mechanical weeding. Thus, organic cranberry growing is challenging: where conventional growers treat weeds with herbicide, organic growers must pull them out by hand.

Many of us still associate cranberry farming with water. If we have any image of a cranberry patch at all, it is of one at harvest time, when most growers flood their fields. Since cranberries are partially hollow, they float. Farmers agitate the plants below the water level and the berries pop up, painting the surface of the bog a striking red and making for an efficient harvest.

A small number of growers harvest their berries without flooding, resulting in better quality fruit for the fresh market. Fresh cranberries give an especially good pop to pastries that showcase whole berries, such as tarts or cakes. Treated like grapes, they have an under-explored savory side, fit for roasting on their own or alongside pork or poultry. Roughly chopped and mixed with some combination of herbs, alliums, or citrus, fresh cranberries can make an intriguing winter salsa that adds a touch of lightness to heavier winter fare.

And that’s the charm of cranberries, extending the berry season into the realm of storage crops, bringing brightness into a season whose other flavors are pulled from the dark soil.

Grass and Muscle


In the early 2000s, I was not long out of my childhood home, still new to shopping and cooking on my own. I’d been a bona fide vegetarian for four years, a habit I picked up while working at a small-town food co-op that sold sometimes ragged but always fresh organic vegetables.

In college, I started having dreams about meat—juicy hamburgers hovered in my subconscious until I couldn’t take it anymore, and at a campus-wide picnic my junior year I just walked up to the grill and held out my plate. Meat found its way back into my diet, but as journalists and documentarians brought their suspicions about the commercial meat industry into the mainstream, my co-op conditioning predisposed me to listen.

The narrative they carved has become well known to foodies and locavores. Industrial meat production (i.e., cheap meat) relies on a paradigm shift: feeding grain to animals whose stomachs evolved to digest grass. To keep the cattle from getting sick on a food source that could literally kill them, producers treat them with antibiotics. Such operations keep costs down by packing the animals into lots where they mill around in their own excrement and eat grain (whose price is kept artificially low through government subsidies) all day long.

Such measures become necessary in a country where the average person consumes over 75 pounds of red meat each year (a figure deflated by the inclusion of babies and non-meat-eating populations—the actual number for meat-eating adults is likely much higher). As urban populations put continued pressure on available rangeland and its resources, industrial meat (especially beef) has become an increasingly bad bargain, calorically speaking. The patty of a quarter-pounder requires almost seven pounds of feed, calling into question (as many have) the logic of a system that ties up farmland to grow grain for an animal that must be medicated to eat it and that gains no protein (or nutritional) advantage from doing so.

Enter: grass. Well, pasture, actually, a blend of grasses, broadleaved plants known as forbs, and cereal grains (in their vegetative state). It’s what our ruminant companions evolved to eat; their stomachs, unlike ours, can convert it into protein and fatty acids. Pasture-raised cows produce what many consider to be healthier meat, but raising them takes longer and yields less per acre than grain feeding (though the acreage per cow in grain-fed operations is not so small if you take into account the land required to grow their feed).

In the infancy of its renaissance, grass-fed beef had a reputation for being tough and gamey. Producers had to relearn the intricacies of raising beef on pasture; older breeds, better adapted to a grass diet, needed reviving; consumers, used to always-tender, high-fat beef, needed educating. Producing high-quality grass-fed beef is not as simple as providing access to plentiful pasture. It requires (as does all good agriculture) attention to the soil, to land management, and to incorporating diversity into the farm’s system, from the pasture mix on up to the grazers.

Several production techniques greatly influence the quality of grass-fed beef: dialing in the right combination of forage species and cattle breeds for your region, leaving the cows on dense pasture until they have put on the requisite fat (a process that can take twice as long as it does for a grain-fed cow), and allowing the meat to age after slaughter (a method called “dry aging” that lets naturally present enzymes tenderize the muscles).

No other food has saturated our cuisine and cultural identity as much as beef. From steak, hamburgers, and hot dogs to cowboys and country music, a prominent brand of American identity is intimately bound to cows. The statistics don’t lie—we like our beef. So much so that we sometimes dream about it. Yet, during the first years of my return to carnivorism, the meat aisle was always a challenge.

The problem for me was price, my eyes darting back and forth between the words and the numbers, weighing the label claims against the financial burden of purchasing the ones I felt good about. There’s no way around it: good meat is expensive, and ethically raised meat even more so.

In the future, we may find a world less accommodating to cattle—with the possibilities of drought, farmland degradation, super-bacteria, and food shortages looming, our fixation on beef may eventually have to wane. Ranchers who have transitioned to pasture are doing the hard work of building sustainable systems that can navigate the challenges ahead. Supporting them means eating some of the best meat out there (though you’ll likely be eating less of it). It also means investing in the next phase of cow culture—one with a focus on longevity over quantity.