
Professors take on commonly held, but not quite correct, assumptions

Illustrations by Richard Mia

One of the perks of membership in the Hamilton College community is sharing that space with 200 or so faculty members who can challenge and inspire us with their knowledge. They can also, should the need arise, set us straight on a thing or two. We thought it would be fun to ask a smattering of them to do a bit of academic myth busting by addressing an oft-held belief in their field that just isn’t true.

And we were right: It was fun. Here’s what the professors had to say.

Jackson Pollock was a master of chaotic motion

Katherine Brown, assistant professor of physics

FRACTAL: any of various extremely irregular curves or shapes for which any suitably chosen part is similar in shape to a given larger or smaller part when magnified or reduced to the same size. (www.merriam-webster.com)

In the 1970s and 1980s, a branch of physics known as chaos theory gained widespread popularity. Books like Benoit Mandelbrot’s The Fractal Geometry of Nature and James Gleick’s Chaos described extraordinary systems that were highly sensitive to initial conditions, demonstrated random evolution and possessed fractional dimension. Ordinary systems, on the other hand, don’t change much for a slightly different set of initial conditions, evolve in a predictable, linear fashion and have integer dimension (e.g., a line is 1-dimensional, a square is 2-dimensional, etc.). Filled with copious examples and beautiful pictures, these books argued that such extraordinary systems and processes were everywhere, from cauliflower to coastlines to the stock market.

In 1999, a group of physicists announced a fascinating interdisciplinary result — the famous drip paintings of the late abstract expressionist painter Jackson Pollock were yet another example of chaotic motion and fractal geometry. They claimed Pollock had mastered the language of nature (chaotic motion) as he danced around the canvas splattering paint, and that when subjected to the techniques of fractal analysis, his paintings revealed a unique mathematical signature, one so distinctive that it could be used to distinguish authentic Pollocks from fakes.

It turned out that this claim was entirely false. The fractal characteristics present in Pollock’s work are neither robust nor unique. They can be found in both mundane scribbles made with ordinary, non-chaotic motion and fraudulent drip paintings made by artists emulating Pollock’s style. The problem with the “hypothesis of fractal expressionism” is one of range — Pollock’s paintings, though some are quite large, are still too small to reliably infer fractal characteristics. When one views a system over this limited range, it is bound to look fractal even if it is rigorously known to be non-fractal. Pollock himself would probably not have been surprised; as he famously wrote in a telegram to Time magazine: “No chaos, damn it.”
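The range problem is easy to see in a quick numerical experiment. Below is a minimal box-counting sketch in Python (my own illustration, not the specific analysis the 1999 group performed; the data and box sizes are invented): over a narrow range of box sizes, the fit happily assigns a “fractal dimension” even to a plain straight line, a shape that is rigorously non-fractal.

```python
import numpy as np

def box_count_dimension(points, sizes):
    """Estimate a box-counting dimension for a 2-D point set.

    For each box size, count the boxes the points occupy, then fit a
    line to log(count) vs. log(1/size); the slope is the estimate.
    """
    counts = []
    for s in sizes:
        # Map each point to the integer index of the box containing it.
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# A plain straight line: rigorously 1-dimensional, nothing chaotic.
t = np.linspace(0.0, 1.0, 20000)
line = np.column_stack([t, t])

# Over only a narrow range of box sizes, the fit still returns a
# "dimension" — close to 1 here, the line's true dimension. The same
# narrow-range fit would hand a plausible-looking fractional number
# to ordinary scribbles, which is exactly the range objection.
sizes = np.array([0.1, 0.05, 0.025, 0.0125])
d = box_count_dimension(line, sizes)
print(f"estimated dimension: {d:.2f}")
```

The point is not that the arithmetic is wrong, but that a good straight-line fit over one decade of scales cannot certify genuinely fractal structure.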

ADHD is found only in children

Tara McKee, associate professor of psychology

As a clinical psychologist who studies Attention-Deficit/Hyper­activity Disorder (ADHD) symptomatology in adults, I am often confronted with the common misconception that ADHD is a disorder that only occurs in childhood. For decades people believed that children with ADHD simply outgrew the disorder. What we now know from a variety of studies that have followed children with ADHD over time is that the majority of children with ADHD will display some continuation of symptoms and functional impairment into adolescence and adulthood.

So, why is there confusion about the persistence of this disorder? One explanation is that the symptoms of ADHD can change over time. Whereas problems with attention seem to be relatively stable, other more obvious signs of hyperactivity/impulsivity can decline with age and may be replaced by less overt symptoms such as feelings of restlessness or difficulty relaxing. For 45 years the diagnostic criteria for ADHD remained relatively unchanged in their focus on symptom presentation in childhood. It wasn’t until 2013, with the publication of the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders, that the diagnostic criteria caught up with the research literature, reducing the number of symptoms an adult needs to display in order to be diagnosed (as compared to a child) and including symptom descriptions that apply to a broader range of ages.

Now that the diagnostic criteria have been updated, will the myth persist? My guess is that it won’t be researchers or clinicians who help change this public misconception; instead, it will be pharmaceutical companies who advertise medications for adult ADHD in magazines and on TV that help change the belief that ADHD is only a childhood disorder.

The Middle Ages were extraordinarily cruel and violent times

John Eldevik, associate professor of history

When people think of the Middle Ages, the first things that come to mind are often images of chaos, violence and wanton cruelty. Think of how the medieval fantasy setting of the Game of Thrones series facilitates the show’s infamous graphic violence. How often have violent extremist groups like Al Qaeda and ISIS been described as “medieval”? Popular entertainment continues to promote phenomena like the Crusades and witch burnings as typical of medieval culture, and even eminent intellectuals reinforce this idea. Harvard psychologist Steven Pinker recently asserted in his best-selling book The Better Angels of Our Nature that murder rates in the medieval period were roughly double what they were in later periods when Europeans became more “civilized” and learned to discipline their violent emotions.

As medievalists like Oren Falk at Cornell have shown, however, such assertions are problematic, and Pinker fundamentally misunderstands the evidence for actual violence in the Middle Ages. In a period when there was no state that could claim a monopoly on legitimate violence and police crime, people often had to take justice into their own hands. Vendettas were still governed by norms and rules, however, which had to be observed if the killing were to be considered just. Moreover, images of extreme cruelty, just as today, were often used propagandistically to smear the reputation of an opponent. Medieval people were just as horrified as we are by accounts of terroristic violence and unjustified killing. Modern readers cannot take depictions of medieval violence at face value. Some of the most infamous instances of “medieval” cruelty, such as torture and witch burnings, actually took place in the post-medieval period. It was Michelangelo, Galileo and Newton’s world that was filled with panic over witches, not Aquinas’.

Algebra is just solving for x

Courtney Gibbons, assistant professor of mathematics

830 A.D.: al-Khwarizmi finishes his masterpiece. The punch line? Solving for x in a polynomial equation like 2x² + x - 5 = 0 using al-jabr or “the restoration of broken parts.” This treatise is the first known instance of generalizing fixed quantities (numbers) using unknown quantities (variables). [Note: This isn’t technically true; using variables for numbers first occurred in 1591 A.D. Before that, al-Khwarizmi would have written something like, “Two times the square plus its root is five units.” In this sentence, “the square” stands for the square of the unknown quantity x. Yikes!]

1800 A.D.: Algebra becomes “modern” (as in Hamilton’s Modern Algebra course) when Gauss starts generalizing properties of entire number systems. Forget the lonely polynomial equation above. Instead, look at the collection of all ax² + bx + c. This collection includes 0x² + 0x + 5 (a.k.a. 5), and 13x² - 5x + 0, and any other combination you can dream up by putting in numbers for a, b and c. When you add three of these polynomials together, it doesn’t matter which two you add first; this is called the associative property of addition. If we take all the properties that polynomial addition and multiplication have and write them down, we get an algebraic object, A. Careful, though. Too many axioms, and the only number system that satisfies them has only one number: zero! Defining this to be the zero object means that now we can study the equation A = 0.
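The associative property described above can be checked concretely. Here is a tiny Python sketch (the coefficient-tuple encoding of ax² + bx + c is my own illustration, not notation from the course):

```python
# Encode the quadratic ax^2 + bx + c as the coefficient tuple (a, b, c).
def add(p, q):
    """Add two quadratics coefficient by coefficient."""
    return tuple(x + y for x, y in zip(p, q))

p = (0, 0, 5)    # 0x^2 + 0x + 5, a.k.a. the number 5
q = (13, -5, 0)  # 13x^2 - 5x + 0
r = (2, 1, -5)   # 2x^2 + x - 5

# Associativity: grouping does not change the sum.
assert add(add(p, q), r) == add(p, add(q, r))
print(add(add(p, q), r))  # (15, -4, 0), i.e. 15x^2 - 4x + 0
```

Writing down which such properties hold, independent of any particular polynomial, is exactly the move from solving one equation to studying the whole algebraic object.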

Today: Variables allow us to use equations like y = 2x + 17 to describe relationships. Algebraic objects let us describe relationships, too. For example, CGI-animators use them to encode how things move through space. In fact, to get the gist of what today’s algebraists do, imagine the movie Inception. Now replace the movie stars with mathematicians and the action with quiet scribbling: the next step is to generalize entire collections of algebraic objects and their properties! By diving into al-Khwarizmi’s dreams, algebraists occasionally catch a glimpse of a perfect world — restored by algebra.

Hamilton was one of the least important of the founders

Carl Rubino, the Winslow Professor of Classics

Alexander Hamilton was a dynamo whose intelligence and industry played a central role in creating our nation. He was born out of wedlock in the Caribbean — John Adams labeled him a “Creole bastard.” In 1765, when he was 8 (or perhaps 10, depending on which birth date you accept), his father abandoned the family. Three years later his mother died, leaving him orphaned and penniless. He was put to work for Nicholas Cruger, a prosperous St. Croix merchant. What happened thereafter seems nothing short of miraculous. The boy proved precocious beyond words, so much so that Cruger, during a bout of ill health, put him in charge of the business. In 1772, a ferocious hurricane swept over the island. The teenaged Hamilton published a brilliant description of the catastrophe, thus securing his future. Money was raised, and he was sent to a preparatory school in New Jersey. In 1774, he entered King’s College, now Columbia University.

In 1776, he left King’s to join the fight against the British and a year later began his long and immensely fruitful association with George Washington. The rest, as they say, is history. Alexander Hamilton went on to become one of America’s greatest lawyers, at the same time dedicating himself to the founding of the nation, where he became, in his early 30s, the first secretary of the treasury.

His accomplishments, far too numerous to list here, include writing the overwhelming majority of the Federalist Papers, establishing the constitutionality of a national bank (a forerunner of today’s Federal Reserve) and founding the Coast Guard and the Bank of New York. He was also a lifelong opponent of slavery and a champion of the underdog.

Far from being one of the least important of the founders, Alexander Hamilton, along with George Washington, was perhaps the most important.

A film professor must have a favorite film

Scott MacDonald, visiting professor of art history

As a professor of film history, I am often asked, “What’s your favorite film?” I’m usually dumbfounded by this question and have no idea how to answer it in a way that won’t sound too pretentious.

Of course, there are films that are “favorites” of mine, though over the years I’ve seen some favorites get lost in the shuffle and others take their place. So, my favorite film now? My “favorite film” for what? I assume you don’t mean simply for entertainment — because in general, entertainment depends on my mood at the time. I was bored at my first screening of The Silence of the Lambs — bored! I was tired that Friday evening, I guess, because Silence is now a “favorite.”

I’m a teacher and some films are particularly my “favorites” as teaching experiences. I’d include Hitchcock’s Psycho, Stan Brakhage’s Window Water Baby Moving, D.W. Griffith’s The Birth of a Nation, Fritz Lang’s M, Kurosawa’s Rashomon, Spike Lee’s Do the Right Thing and Sharon Lockhart’s Double Tide.

But the problem with the “what’s-your-favorite” question is that it’s badly directed. As a student of cinema, what interests me is the experiential and intellectual interplay between motion pictures of various kinds. The continually changing meta-narrative of motion picture history is fascinating, far more fascinating than any single motion picture — though for that history to be exciting, individual films need to be both innovative and accomplished. An exquisite pleasure is discovering the new accomplishment and seeing how it reframes the history that has led up to it.

Films that have given me that pleasure this year include Linklater’s Boyhood, Marie-Hélène Cousineau and Madeline Piujuq Ivalu’s Before Tomorrow and James Benning’s BNSF (a 3½-hour single-shot film of a desert landscape!). I’m thinking about how I’ll use these films to energize my students’ sense of what film history has been and what it should be — creating that energy is my favorite motion picture experience.

Cleopatra was ugly

Shelley Haley, professor of Africana studies and classics

This is a curious myth because there are no definitive representations of Cleopatra VII’s physical looks in the visual arts. Any representations contemporary with the last pharaoh of ancient Egypt are in very stylized media. You may have read of or seen marble or basalt busts and statuettes purporting to be of Cleopatra VII, but none of these has any attributes that definitively identify them as representing Cleopatra VII. It is true that there are coins, but as we know from current coinage, the images, whether of Lincoln, Jefferson, FDR or Washington, are highly stylized and follow certain numismatic conventions.

Our main sources for Cleopatra’s appearance are various historians and poets from Greece and Rome. The only contemporary source for Cleopatra, the Roman poet Horace, does not mention her physical appearance, choosing to focus rather on her seemingly depraved morality and ending up with begrudging praise for her method of suicide. Of the other sources, the one who is given more credibility than the others is Plutarch. Plutarch, who lived 46-127 C.E., was a Greek who lived in the Roman Empire and wrote his history 140-145 years after Cleopatra died.

So what does Plutarch say? Most of Plutarch’s description (which comes in his Life of Antony) concentrates on the seductive quality of her voice, but the first line of the description is, “For indeed her own [Cleopatra’s] beauty, as they say, was not, in and of itself, completely incomparable, nor was it the sort that would astound those who saw her…” What does that even mean? If you find the English translation contorted and convoluted, believe me, the ancient Greek is even worse!

Many scholars and the general public alike have interpreted Plutarch’s hedging to mean Cleopatra was physically unattractive and frankly downright ugly. I find it particularly interesting that this misreading coincides with the momentum and currency that the African-ness of Cleopatra has gained. The result is that you get this exchange between two white students, cited by Beverly Daniel Tatum in her article “Defining Racism: Can We Talk?”

“Yeah, I just found out that Cleopatra was actually a Black woman.”

“What?”

The first student went on to explain her newly learned information. The second student exclaimed in disbelief, “That can’t be true. Cleopatra was beautiful!”

Most recently, in April 2009, People magazine posted the results of a survey about “historical hotties” on people.com. Under the subtitle “Would Cleopatra be a babe today?” we learn that 62 percent of people.com readers rated the Image Foundry Studio image of Cleopatra as a woman of African descent not a hottie.

So the basis for this myth seems to be the racism of modern societies and has nothing to do with the historical Cleopatra VII. Certainly, Julius Caesar and Marc Antony thought Cleopatra was a “hottie,” and they admired more than just her physical appearance. After all, she was a linguist, chemist, economist and military strategist. And that is beautiful!

Ignorance is bliss

Marianne Janack, professor of philosophy

Maybe this old adage is not meant seriously, but many of my students seem to think it expresses a fundamental truth. The claim that ignorance is bliss implies that knowledge brings unhappiness — that knowledge is the opposite of bliss — and that if you want to be happy, you should cultivate ignorance.

Now there are some obvious counterexamples to this: not knowing how to administer the antidote for a poison, or not knowing that a huge tornado is barreling toward your house, probably brings misery rather than bliss in most cases. Yet Kant recognized that human beings have this tendency. In his Groundwork of the Metaphysic of Morals, he warned against the tendency toward misology: a hatred of reasoning.

What makes this a myth is that it is based, first, on a false dichotomy: that between bliss and unhappiness. It is possible that ignorance gives us a simple kind of pleasure or happiness, but that knowledge gives us a more refined, or higher, form of happiness.

Second, it assumes that all goods are to be compared on the basis of how happy they make us, and so it assumes that only happiness is good for its own sake. It’s possible that knowledge is also to be valued for its own sake, and so cannot be traded for the sake of an increase in happiness. The cuckold may be happy, but it seems a happiness bought at too high a price. The person who goes through life ignorant of injustices suffered by others may well be happy, but it’s not clear that that’s a state one ought to value.

Even if we agree that ignorance is bliss, the question remains open: should we have blissfulness as a goal if the best way to pursue it is to cultivate ignorance?

We could run out of water

Todd Rayne, the Joel W. Johnson Family Professor of Environmental Studies

Several times each year I talk to groups of visitors about water. One question that often comes up is whether we will run out of water. Maybe this question isn’t surprising, given headlines like this one from The Washington Post: “New NASA data show how the world is running out of water” (June 16, 2015).

The simple answer to the question is that no, we will not run out of water. The quantity of water on Earth is more or less fixed at about 1.4 × 10⁹ cubic kilometers — an enormous amount. On human time scales, we don’t gain or lose water at all, and recent studies show that much of our water has been around since Earth was a proto-planet more than 4.6 billion years ago. The water circulates through the different reservoirs, including the atmosphere, oceans, ice caps and glaciers, fresh surface water like lakes and streams, groundwater, soil and living creatures. Water occurs in these reservoirs in solid (ice), liquid or gaseous form, and the processes that move water between reservoirs are those that we all know: evaporation, condensation, precipitation, infiltration, runoff, etc. This is the water cycle.

What matters more, and what most people mean when they ask about running out of water, is the amount of fresh liquid water (which makes up less than 1 percent of the total). But even this water won’t run out because of the processes that move water around the water cycle. However, the distribution of easily accessible, fresh liquid water is highly uneven, and many people live with a shortage of water — just ask people in California. This problem is made worse by several interrelated issues: 1) in areas with water shortages, humans are withdrawing already scarce water from surface water (lakes and rivers) and groundwater at rates far greater than the water is replenished, 2) the effects of climate change are impacting the rates of fresh water replenishment by precipitation and runoff, 3) an increasing percentage of fresh water is affected by pollution and is not suitable for human use, and 4) in many areas where water is scarce, outdated laws (based partly on outdated science) and pricing encourage overconsumption and wasteful use.

As with so many environmental problems, an overarching issue is population growth. About 70 percent of the world’s fresh water is used for agriculture, mostly for irrigation. As our population grows, we have become reliant on modern agriculture, whose greatly increased yield per acre depends on a greatly increased amount of water per acre. The increasing competition for water between irrigated agriculture and ecosystems will only worsen water scarcity in many places.

So the answer to the question is that we’ll never run out of water. But the uneven distribution of fresh water, the effects of climate change on the supply and the increasing demand for it have created severe water shortages in many parts of the world. Conservation efforts and new irrigation technology can help, but the fundamental problems of supply and demand, coupled with a changing climate, make this one of the great problems facing us.

Animation is for kids only

Kyoko Omori, associate professor of Japanese

From Disney’s legendary Fantasia (1940) to the more recent hit Inside Out (2015), we in the United States commonly associate animation with younger audiences and wholesome content. By contrast, Japanese animé leans toward more mature themes for adult audiences, including sex, madness and violence. Of course, animé is by no means monolithic. For example, Spirited Away (2001) captivated the attention of both young and older audiences around the world with its fantastic visuals and dream-like story. But among the darker animé, cyberpunk (a subgenre of science fiction) has intrigued adult audiences by posing questions that explore the border regions of humanity. What makes us different from machines, especially as we become more dependent upon them for our very lives, from pacemakers and artificial limbs to computers and even smart devices?

Cyberpunk may be set in the future, but it often reflects current society. Recently, for example, the Japanese company SoftBank Robotics premiered its cutting-edge robot, Pepper the humanoid. Pepper purportedly reads human emotions and responds empathetically, thereby blurring the boundaries between computation and intuition. As CNN reporter Will Ripley exclaimed in a news report, “This is the future! You’re looking at it right now.”

If a robot can read human emotions, then what does that mean for the idea of humanity? These issues came to the fore this summer as I worked with Hung Hoang ’17, a biochemistry concentrator, on an Emerson project about post-humanism in cyberpunk animé. Hung’s initial encounter with sci-fi animé goes back to Doraemon (1973-present), a Japanese TV animé about a friendly human-sized feline robot that repeatedly saves the day by pulling futuristic gadgets from a magic pocket. Spending his childhood watching Doraemon in socialist Vietnam, still haunted by the aftermath of the war, Hung fell in love with the show’s depictions of technology that entertained and saved people in everyday life. Now studying science at Hamilton, he decided to devote a summer to exploring his long-standing questions about post-humanism through his other favorite animé, Ghost in the Shell (1995). We read and discussed German and French philosophy, psychoanalysis and feminist, gender and animation theories. It feels wonderful to know that animé (both as happy as Doraemon and as dark as Ghost in the Shell) has opened new realms of intellectual exploration for him.

Jazz is improv and requires no musical knowledge

Doctuh Michael Woods, professor of music

In the early years of jazz, at the turn of the last century, people harbored all manner of unfounded ideas about the music. Some believed that you could not play jazz unless you were from New Orleans. Some thought that you had to be high on weed, heroin or booze in order to “get loose.” Others believed that you had to be black, poor or both. Every one of these beliefs has been disproven over the years.

Louis Armstrong was considered the first great jazz soloist. His creative flights broke away from the standard harmonic backing of the rhythm section. He exhibited fantastic range, power, ideas and, above all, a sense of design in his improvisations. He learned form by listening to and performing with the parade bands in the streets of New Orleans. Shortly after Louis in New Orleans, however, came Bix Beiderbecke. Bix was white and from Davenport, Iowa. He enjoyed playing with a dark, round tone in the middle register. He did not take as many risks as Armstrong, but he made excellent note choices. Bix actually did something wonderful for American music: if White America perceives that a cultural expression is just for black people, it usually does not cross over well. Jazz is now played by every ethnicity, every age group and at every level in every country in the world. The differences between Louis and Bix started the Hot School vs. the Cool School divide. This proved that jazz was not one-dimensional and was flexible enough to allow any great soloist to impose his or her personality upon it.

What happens when a person improvises a jazz solo? Is he or she making up notes on the spot? The answer is yes. Could that person play a solo that is completely memorized? The answer is yes. Over time audiences have found that the improvised solos are far more daring, engaging and fun to follow. They call for the musician to internalize far more theory and to be able to mate ideas to the underlying harmonies much faster. Sometimes a musician creates ideas so fast that he or she has to make a save. That means that a note that could be perceived as being wrong has to quickly be spun into the next idea so that it sounds correct by comparison to the previous pattern. Many players increase their vocabulary by clicking and dragging their saves into their standard-ideas folder.

Jazz musicians usually know hundreds of small patterns. They choose ways to combine both patterns and whole phrases. These small units can be likened to menu items. When a short-order cook takes breakfast orders quickly, he does not invent bacon, eggs and toast, but he does have to choose between over-easy and white, wheat or rye. Almost all great jazz solos will fare quite well when examined after the fact: the soloists play at performance speed, a recording is made, and then people who know music theory study the notes.

Most of the players winning “best new artist” honors no longer come from the United States. This proves that jazz is now a worldwide music. We have jazz method books and conservatories. Anyone can study jazz, but the greatest virtuosos must weave something of their lived experience into their musical voice to rise to stardom.
