Alaina M. Gercak
In class, we discussed one potential reason why our human ancestors chose agricultural production over their traditional hunter-gatherer lifestyle: food security. The human tendency to avert or diminish risks and losses was one hypothesis explaining the choice to settle and invest time and resources into food production. Naturally, we as humans tend to play it safe in order to avoid losing out entirely (e.g., choosing a sedentary lifestyle to plant crops or raise livestock for a greater guarantee and storage of resources, while giving up the chance to hunt and kill wholesome prey in order to avoid the risk of coming back empty-handed or with less than needed). As Jared Diamond discussed, there is an overwhelming suggestion that agricultural production and greater food security led to higher birth rates and human population density. Further assumptions, such as decreased starvation and greater ease in feeding offspring and ultimately all other non-farming members of the community, can also be made. From these ideas, it seems that agricultural production sparked the positive growth and maintenance of human populations and societies.
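The risk-aversion logic above can be sketched as a toy expected-utility comparison. The payoffs, probability, and square-root utility function below are purely illustrative assumptions of mine, not figures from Diamond or any other source; they just show how a "sure" smaller harvest can beat a riskier, larger average payoff:

```python
import math

def utility(food):
    # Concave (square-root) utility: each extra unit of food matters less
    # the more you already have -- the textbook model of risk aversion.
    return math.sqrt(food)

# Hypothetical payoffs in arbitrary units of food:
farm_food = 9      # a modest but (assumed) certain harvest
hunt_food = 25     # a big kill, but only half the time
p_success = 0.5

expected_food_hunt = p_success * hunt_food    # 12.5, more food on average than 9
eu_hunt = p_success * utility(hunt_food)      # 0.5 * 5 = 2.5
eu_farm = utility(farm_food)                  # sqrt(9) = 3.0

# Hunting yields more food on average, yet the risk-averse chooser
# prefers the sure harvest because its expected utility is higher.
print(expected_food_hunt, eu_hunt, eu_farm)
```

Under these made-up numbers, farming "wins" despite a lower average yield, which is exactly the play-it-safe tendency described above.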
By comparison, I therefore wonder how this risk-averse tendency in human nature, which led us to adopt such sedentary agricultural practices, still plays out in our modern food system. The U.S. has shown a risk-averse attitude over the years by allocating revenue to subsidies that encourage farmers to grow corn. While the reasoning for these subsidies falls mostly within the economic realm (e.g., establishing a price floor), we've found ourselves with a surplus of corn, which could be considered an example of food security in the most basic terms. Having such an abundance, however, does not necessarily deter 'risks,' and can even increase them. In an article from the environmental news blog Grist, Donald Carr claims, "What our maze of farm and risk management subsidies does is facilitate a highly risky agricultural business model that makes our industrial food system overly dependent on one or two grain crops." That article, along with another by the editors at Scientific American, goes on to explain that the greater supply of corn means cheaper corn prices, leading to an abundance of corn in inexpensive, unhealthy foods. The result is greater consumption of less nutritious foods, and with it more obesity and the deadly health risks that come with it.
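The price-floor mechanism mentioned above can be illustrated with a minimal supply-and-demand sketch. The linear curves and coefficients are made up purely for illustration (real commodity markets are far messier): a support price held above the market-clearing price leaves quantity supplied exceeding quantity demanded, i.e., a surplus.

```python
def demand(price):
    # Bushels demanded falls as price rises (made-up coefficients).
    return 100 - 10 * price

def supply(price):
    # Bushels supplied rises with price (made-up coefficients).
    return 20 + 10 * price

equilibrium_price = 4.0   # demand(4) == supply(4) == 60: the market clears
price_floor = 6.0         # support price held above equilibrium

surplus = supply(price_floor) - demand(price_floor)
print(surplus)  # 80 - 40 = 40 surplus bushels at the floor price
```

In this toy model the floor guarantees growers a higher price, but the excess 40 bushels have to go somewhere, which mirrors the cheap-corn glut described above.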
In the long run, paying farmers to grow and store all of this corn creates a loop that results in greater physical health risks to humans in the U.S. More food doesn't necessarily mean better outcomes. The surplus of this commodity crop allows unhealthy foods to dominate the market at the cheapest prices, while nutritious specialty food items, like fruits and vegetables, are less appealing to consumers because of their greater price tag. As described in the Scientific American article, this is reflected in Americans' diets: it's clear that malnutrition and obesity are prevailing. Would we be 'better off' (health-wise) if we had stuck to being hunter-gatherers? At this point, it's difficult to imagine what current human lifestyles would be like if this were our way of obtaining food. While I assume we might be eating more nutritious foods, it would be highly challenging to feed a massive population this way. Realistically, in order to instill a truer form of food security, I think the U.S. government should aim not only to avert economic risks, but ideally to reduce starvation, malnutrition, and obesity by refocusing our agricultural production on crops better suited to maintaining good health in our society.
"For a Healthier Country, Overhaul Farm Subsidies." Scientific American, 19 Apr. 2012. Web. http://www.scientificamerican.com/article.cfm?id=fresh-fruit-hold-the-insulin
Carr, Donald. "Corn Subsidies Make Unhealthy Food Choices the Rational Ones." Grist, n.d. Web. http://grist.org/article/food-2010-09-21-op-ed-corn-subsidies-make-unhealthy-food-choices/
Diamond, Jared. Guns, Germs, and Steel: The Fates of Human Societies. New York: W.W. Norton, 1998. Print.
Agercak 21:41, 10 March 2013 (UTC)
- You point out something interesting: being malnourished is a different issue from being undernourished. A person could be getting enough calories but not enough nutrients, and vice versa. I wonder which one tended to be the issue during the transition from hunting and gathering to agriculture?
Cuden 17:50, 13 March 2013 (UTC)
- I'd imagine developing humans suffered from both undernourishment and malnutrition during that transition, and that when agriculture intensified and more cereals were grown, calorie intake increased but malnutrition probably lasted longer (so I would assume). Agercak 02:23, 14 March 2013 (UTC)
- Not only does subsidizing corn lead to obesity in the US, but it also leads to poverty in developing countries, where there are no farm subsidies and farmers can't compete with prices. This then leads to more malnourishment. So, I completely agree that these risks seem to overshadow any benefits. I wonder how subsidizing more nutritious fruits and vegetables in the U.S. would affect developing countries.
Lbaker 19:56, 13 March 2013 (UTC)
- To be honest, the resulting poverty in other nations due to American corn subsidies didn't even pop into my mind while writing this, so I'm very glad you brought it up! I'd have to do more research into how much developing nations export 'healthier' fruits and veggies to get a better idea of what American subsidies for those crops would do to their economic situation as well. Great question, though. Agercak 02:23, 14 March 2013 (UTC)
- Alaina, your connection that risk-averse actions (such as the promotion and production of surplus corn) can lead to even greater risk (if such a dominant crop fails) is striking. I'm curious how newer systems such as today's CSAs, in which all participants share the risk, play into this discussion. Do you prefer a system like this? Do you think it's possible for something like it to be implemented nationally? What about globally, between developed and developing nations? Rkelleher 20:38, 13 March 2013 (UTC)
- I'm definitely a fan of CSAs, though they didn't come to mind at all while I was thinking about risk-averse actions (but I'm glad you brought them up, because they're absolutely relevant!). What really attracts me to CSAs is that the system is based on planting and harvesting a variety of crops at particular times throughout the season, rather than focusing on only one or a few crops (which is riskier if those crops fail). Farmers and shareholders mutually share the risk, but even if a certain crop is bad one week, there are usually plenty of other crops offered in the 'share,' so shareholders aren't losing out on food, and the farmers don't have to worry about losing money (since they're usually paid one sum ahead of time). Plus, the farming practices are typically organic and sustainable, and members receive wholesome, fresh, nutritious foods. I don't know that we'll ever see a national CSA, and I'm unsure of how the economics would work, but ideally I think it's a wonderful system that unites farmers and non-farmers with simply good (and good-for-you) food. Globally it's an even bigger venture, but again, I do think it makes more sense to invest in a variety of crops based on seasons; if we could sustainably provide more nourishing foods at an affordable price that benefits both consumers and farmers across the globe, I'm all for it. The question is how. Agercak 02:23, 14 March 2013 (UTC)
- The question of whether modern ag practice is, in some part, an expression of risk aversion is, in fact, quite interesting (and something that is folded into some economic analyses). It's complicated, though, because of two things: 1) WHOSE risk is determining things (e.g., consumer, grower, supplier -- each of whom might make a different assessment of risk/benefit... probably simpler in a hunter-gatherer situation?), and 2) Is the assessment of risks reasonably accurate or biased (i.e., do we put resources into averting risks that aren't actually as big as others that we're not addressing)? The latter question is, of course, at the root of LOTS of risk/benefit analyses, and it's clear that we're really quite bad at accurate risk assessment in many situations... Kwoods 20:57, 13 March 2013 (UTC)
- The question of 'who' definitely came to mind while writing this, and you're right: just thinking about the cost-benefit analyses involved is mind-boggling (at least to me!), but definitely something to keep in mind when looking at the choices made regarding agricultural practices both now and in the past. Agercak 02:23, 14 March 2013 (UTC)
A few days ago, while reading "Empires in the Dust" by Karen Wright, a particular assumption kept coming up throughout the scientific narrative discussing the potential causes of the collapse of some Mesopotamian city-states: that human societies of this time weren't largely affected by natural forces or disasters. What? That can't possibly be right, I thought while reading. Yet this wasn't a notion lightly mentioned by Wright only once; she brings it up multiple times throughout the article. It begins with "Prehistoric societies, simple agriculturists—they can be blown out by natural forces, says Weiss. But the early civilizations of the Old World? It's not supposed to happen."
Well, why not? It’s clear through archaeological discoveries and other research that humans of this time period were more technologically advanced, with intensified agriculture and improved infrastructure, weapons, tools, etc. However, these advancements pale in comparison to the resources and developments we hold today, and we’re still greatly affected by such disasters (look at Katrina, Sandy, etc.). While current disasters like super storms have not completely obliterated a city to the point of driving out every resident, they (along with other natural forces like droughts, floods, tsunamis, earthquakes, etc.) have certainly done some damage.
Perhaps that is what Weiss, whom Wright was referencing, meant: the humans of the Old World had reached a point where their advancements and structural achievements typically prevented natural disasters from destroying a city-state or forcing it to pick up and leave entirely. Still, the closing sentence, "And the notion that civilizations are immune to natural disaster may soon be ancient history," bothered me. What made this 'notion' so mainstream? Where's the research showing these humans were so resilient against environmental forces? I tried using Google Scholar to find articles that support this apparently widely believed assumption, but my search came up empty.
Bibliography: Wright, Karen. "Empires in the Dust." Discover Magazine, 01 Mar. 1998. Web. 26 Mar. 2013.
Agercak 23:22, 26 March 2013 (UTC)
- I had similar thoughts while reading that article. Maybe it's just that it's hard for me to distinguish between my own conceptions and those of the "mainstream," but I would never have identified a notion that civilization is immune to natural disasters. It seems like, at most, a lot of people have a notion of civilization being immune to, or able to control, nature to a degree (food production, modern medicine, urban development, etc.). But when it comes to natural disasters like floods and droughts, it just seems like there are too many examples of such things kicking people's butts for there to be a serious notion of immunity to them.
- This gets to an important question -- or warning? -- about how we're prone to think. It's possible that what was generally accepted, in both popular mind and even scholarly circles, as baseline wisdom as recently as a couple of decades ago just doesn't seem so obvious anymore! Worth thinking about in terms of how what we think obvious might seem in another couple of decades. Of course, it's also possible that Wright is using this rhetoric as a journalistic device -- we're more inclined to think what she/Weiss have to say is really interesting if we're convinced it overturns conventional wisdom... Kwoods 13:55, 13 April 2013 (UTC)
Before taking the time to read Brian Donahue's "Another Look from Sanderson's Farm," I wanted to give my own initial critique and analysis of some of the points made in the 1966 article "The View from John Sanderson's Farm," written by Hugh Raup. The author insists that land does not determine prosperity; human ideas do. In the case of the rise and fall of farming in Petersham, Massachusetts, nothing about the land itself changed in a way that led to agriculture's demise; spurred by the Industrial Revolution, expanded transportation systems, and advances in farming implements, Midwesterners began farming more, creating stiff competition for southeastern New Englanders. Raup claimed that the decision made by farm families like the Sandersons was not driven by any change in their land, but rather by these other conditions of the time. This is interesting to think about: the cause and effect of land use does not always derive from any factor pertaining to the land at hand, but from the use of land elsewhere, along with other socioeconomic changes.
While such a point is intriguing to toy around with, other arguments within the article seem clearly attached to an agenda or particular viewpoint. For example, Raup claims that the "future" a conservationist speaks of is not visible to any farmer, manufacturer, or businessman. This was in 1966, a time of much less governmental regulation than we see today, and yet such an attitude can still be seen. Many Americans argue that conservation efforts and regulations, though protective of precious natural resources and organisms, hinder the progress of humans through technology and business. To me, Raup seems to identify with this viewpoint (though perhaps I'm misreading him? It would seem odd to me for an ecologist to take an anti-conservationist stance. Is that wrong to assume?), which I suppose is understandable for such a time.
Yet, to me, this mindset is inexcusably pessimistic. While I understand that the "horizon" is never completely foreseeable, it seems narcissistic for the human race to shrug away potential externalities on the natural environment (which may later affect our own well-being) for the sake of current advancement (whatever that is). Raup does bring up the fact that it is difficult to properly plan and allocate the use of natural resources and land (is this, in essence, sustainability?) when the 'future' is unidentifiable; still (and this may sound biased), I don't think it makes sense for humans to simply discard possible future implications when deciding how to meet the needs of both themselves and nature today.
Bibliography: Raup, Hugh M. "The View from John Sanderson's Farm: A Perspective for the Use of Land." Forest History, 1966. Print.
Agercak 00:19, 18 April 2013 (UTC)
- I've always wondered whether Raup was really as cynical as he comes across here, or just trying to get people to think about the tension between conservation agendas and day-to-day economic decisions. But it is interesting that he was more or less a contemporary of Aldo Leopold; how does this contrast with Leopold's writings? Kwoods 00:26, 6 May 2013 (UTC)
My first takeaway from Aldo Leopold's "The Land Ethic" was that the individual is part of a whole community. "The land ethic simply enlarges the boundaries of the community to include soils, water, plants, and animals, or collectively: the land," he explains; it is a perspective that considers the well-being of natural beings and elements other than ourselves. I appreciate that he points out the contradiction between our supposed 'love' of the land and our overall treatment and use of it, and emphasizes the way we utilize natural resources for our own economic benefit with little thought to how that excessive use and consumption can ultimately damage us: "we have learned (I hope) that the conqueror role is self-defeating." Before this sentence, he establishes that the land ethic "implies respect for his fellow-members, and also respect for the community as such."
My thought: if we do not even respect our fellow humans, I'm not surprised that we cannot respect our land! Does environmental degradation differ across countries with different levels of human rights? I would assume a negative correlation between the well-being of the environment (forests, water, soil, air quality) and the degree of a country's oppression. In my logic, if a country fails to provide many of the protections stated in the United Nations' Universal Declaration of Human Rights, such as protections from slavery and torture and the right to equal treatment regardless of gender, age, religion, etc., then I don't see why it would even think to champion the right to a clean and protected environment. Often, countries that fall toward the lower end of the human rights spectrum are countries that struggle financially. In an extreme case like North Korea, for example, the closed-off authoritarian nation has suffered both environmental and economic struggles since the 1950s, during and following the Korean War. According to a 2012 article on the New York Times' Green Blog by Joanna Foster, North Korea was making progress with reforestation efforts until widespread famine hit in the 1990s, resulting in highly intensified agricultural production and deforestation.
These actions were attempts to create better food and fuel security for this nation isolated from the rest of the world, yet they left it with "desertification, soil erosion, nutrient depletion, and epidemics of pests that further contribute to the country's food insecurity." Its panicked reaping of the land has also contributed to greater-than-expected warming; during a conference hosted in the country, involving scientists from North Korea, China, and the United States discussing the state of North Korea's environment, it was stated that "the average temperature in North Korea had increased by almost 2 degrees Celsius, while globally the average increase over the same period was only about 0.7 degrees. The problem was also clearly worse in the north, where deforestation has been the most severe" (Foster).
By contrast, the United States sits quite far from North Korea on a human rights scale, yet environmental degradation still exists here through deforestation, pollution, intensified agriculture, etc. While I wasn't able to find any study or data on a correlation between human rights and environmental conditions, I would argue that comprehensive research on this hypothesis could provide an interesting look at, and possible affirmation of, Aldo Leopold's notion that the land ethic stems from humans who respect all members of the community - from fellow man to the land.
Foster, Joanna. "Q. and A.: North Korea’s Choked Environment." Green Blog. The New York Times, 30 Mar. 2012. Web. 03 May 2013.
Leopold, Aldo. "The Land Ethic." A Sand County Almanac. New York: Oxford University Press, 1949. Print.
Agercak 21:06, 3 May 2013 (UTC)
In this week's reading by Navin Ramankutty and Jeanine Rhemtulla, the authors briefly explore a hypothesis that aligns with the title of their article: "Can Intensive Farming Save Nature?" They explain the notion that land sparing may be the more environmentally friendly practice in that it spares some habitats from being disturbed at all, preserving their biodiversity, while intensifying production only on lands already disturbed. Their following antithesis claims that there is a lack of empirical evidence to prove that greater intensification of land already converted to agricultural use actually promotes the sparing of other undisturbed lands. In U.S. history, could it be said that we have intensified AND expanded land for farming? For a while that seems to have been the case, in the 19th and even 20th centuries, with the westward expansion of the yeomen; but today, is farmland still expanding, or is it dwindling? Even if it is decreasing, is the land necessarily being preserved, or is it merely being disturbed in another fashion, such as urban sprawl?
After doing some research on these questions, I found myself surprised by the results. The Economic Research Service (ERS) of the USDA conducts a study every five years that measures major uses of land in the United States. The National Association of Home Builders summarized the ERS data from 2002, initially making clear that yes, the percentage of urban or developed land in the United States has increased; but they caution that this alone does not tell us whether the amount of land used for agriculture or preserved for conservation consequently decreased. Interestingly enough, I later read that while urban land quadrupled between 1945 and 2002, alongside a doubling of the U.S. population, it makes up only 2.6 percent of total land usage. Forest-use takes up the most land at 28.7 percent, and grassland pasture/range comes in second with 25.9 percent. Cropland is third at 19.5 percent, which didn't particularly surprise me. What did shock me was the rate of increase not only in urban sprawl, but in special-use areas, which include wildlife areas, national/state parks, and wilderness areas. From 1959 to 1997, for example, land for urban use increased by 144 percent, but wilderness areas increased by 176 percent, national and state parks increased by 230 percent, and land designated for wildlife skyrocketed with an increase of 476 percent. After looking at the data more carefully, it made sense to see when these stark increases in land devoted to conservation took place: the late 1970s and early 1980s, a time when environmentalism and its associated policies and regulations were arguably at their strongest, or were at least taking off faster and more successfully than they seem to today. They then tapered off in the later 1980s and show growth similar to that of urban land.
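One implication of these figures is worth making explicit. This is just simple arithmetic on the growth multiples and shares as quoted from the ERS/NAHB summary, not new data: if urban land grew fourfold while population only doubled, then urban land per person roughly doubled over the same period, even though urban land remains the smallest share listed.

```python
# Growth multiples, 1945-2002, as quoted above:
urban_land_multiple = 4.0   # urban land quadrupled
population_multiple = 2.0   # U.S. population doubled

# Urban land per person therefore roughly doubled over the period.
per_capita_urban_multiple = urban_land_multiple / population_multiple
print(per_capita_urban_multiple)  # 2.0

# Shares of total U.S. land use (percent, 2002, as quoted above):
shares = {
    "forest-use": 28.7,
    "grassland pasture/range": 25.9,
    "cropland": 19.5,
    "urban": 2.6,
}
# Despite its rapid growth, urban land is the smallest listed share.
smallest = min(shares, key=shares.get)
print(smallest)  # urban
```

So sprawl is real on a per-capita basis, yet it still occupies a tiny slice of the total land base, which is the tension the paragraph above describes.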
So what does this all mean? On a personal level, it means my own assumption that urban sprawl dominates American land is quite incorrect. Could our nation handle an expansion of land use for sustainable farming of livestock and crops? It seems plausible, considering the country isn't as great a concrete jungle as I had imagined. The suggestion made by Ramankutty and Rhemtulla, to practice land sparing or land sharing based on the type of biome a particular region lies in, seems to make sense; a one-size-fits-all solution to the growing demand for food, paired with the conservation of our environment, is hardly realistic. While some may be quick to judge the role of agriculture in environmental degradation, I would hypothesize that agriculture itself isn't the problem: it's a mismatch of approach to region that can result in greater externalities. As the authors mention, it would seem more beneficial overall for a land-sparing method to be implemented in areas where pristine tropical forests hold a rare treasure of undisturbed biodiversity; intensifying the lands already devoted to agriculture could then feed more malnourished people of these regions without destroying the rainforests' ecological services. Land sharing, or expanding agricultural production across a greater area through low-intensity, 'wildlife-friendly' means, also makes sense in temperate regions such as the U.S., where the population is not as susceptible to food shortages and the land is not home to such undisturbed, biologically diverse resources that must remain completely untouched. Based on this article and the further research I've looked into, I wonder what arguments exist against such region-based approaches to solving hunger and environmental sustainability.
Emrath, Paul, PhD. "NAHB: Residential Land Use in the U.S." National Association of Home Builders, 5 Dec. 2006. Web. 12 May 2013.
Ramankutty, N., and J. Rhemtulla. 2012. Can intensive farming save nature? Frontiers in Ecology and the Environment 10:455–455.
Agercak 01:00, 13 May 2013 (UTC)
- Thanks for those numbers, Alaina! One thing that came to mind while reading your abstract was the idea of urban gardens. While the land used for food production within suburban and urban areas is probably minute compared to the numbers you researched, a movement to use what was once considered marginal space in the city to grow food is certainly present. It seems to me that urban, rooftop, and vertical gardens might exist within both land-sharing and land-sparing practices. It's interesting to think of sharing/sparing in terms of populated areas rather than just "pristine" natural areas. So, how much food would we need to grow in cities to create meaningful numbers within a land-usage assessment like this one? Are those numbers possible to achieve? Rkelleher