My book, America's Half-Blood Prince: Barack Obama's "Story of Race and Inheritance," can be ordered here.
My published articles are archived at iSteve.com -- Steve Sailer
The Indianapolis Colts football coach Tony Dungy has retired at age 53, after setting the NFL record for making the playoffs 10 years in a row with the Colts and Tampa Bay Bucs.
“Hey, dark ‘n lovely!”
Gotta love the brothers who show their affection for the dark-skinned girls, even if they are hollering out the window of a passing car.
Gotta love it even more when the brother is the president, and the object of his affection is front and center for the world to see.
It’s true: A lot of black women fell for Barack Obama the moment they saw his wife.
If a black president represents change, a dark-skinned first lady is straight-up revolutionary.
I won’t apologize for taking note of Michelle Obama’s physical appearance. Plenty has already been said about how she, with her double Ivy degrees, six-figure salaries and two adorable daughters, is crushing the image of the struggling black single mother. She is a real life Clair Huxtable! But the true breakthrough here is that sisters who look like Michelle Obama seldom become cultural icons, aesthetic trendsetters—a proxy for the all-American woman.
And don’t roll your eyes and ask why we have to go there; we haven’t completely gotten over our prejudices about skin tone and hair texture. Despite years of scholarly, literary and popular debate—from Dr. Kenneth Clark’s baby-doll tests, to Toni Morrison’s tragic characters in The Bluest Eye, to the showdown between [racial term that doesn't appear in iSteve] in Spike Lee’s School Daze—too many of us continue to accept a standard of beauty that does not favor ebony-hued skin, woolly hair and full lips (and not those surgically enhanced smackers, either).
I know from first-hand experience. I remember being taunted and shunned by some people who didn’t believe that old saying about the blacker the berry. Back when we were Negroes, the word “black” was used to describe the dark-skinned among us, usually not with affection. My mama assured me that I was a pretty black girl, but it was the brothers on the streets, cooing such compliments as dark ‘n lovely, chocolate drop, brown sugar, who convinced me.
I'm not sure how to break the news to this writer, but being a good enough-looking woman to have black guys shout stuff at you while you walk by is a pretty low hurdle...
But consider the complexions of most of the black women who smile or stare seductively at the world from the covers of celebrity and beauty magazines—cream, café au lait, golden honey. Gorgeous sisters, yes, but we come in other good flavors, too. The failure to showcase dark-skinned beauties feeds the notion that pretty black girls are an exception. Not so much dark and lovely as dark but lovely.
The light-skinned, long-hair aesthetic reigns.
I think of India.Arie’s song from a few years back.
"I’m not the average girl from your video
And I ain’t built like a supermodel
But I’ve learned to love myself unconditionally
Because I am a queen.”
An empowering anthem, indeed, but even Arie acknowledges that many of us who don’t look like Barbie dolls—even chocolate-coated Barbie dolls—are not convinced of our beauty.
“I don’t know if young women necessarily think that certain women they see on TV are beautiful, but they do see that certain women are financially rewarded by looking a certain way and therefore that image is reinforced,” Arie told me via e-mail. She thinks that Michelle Obama’s presence on the national stage will “jump-start the challenge of those long-held beliefs. Not only is she naturally and uniquely beautiful, but she demonstrates a great deal of poise, class and style, which I think has and will continue to help capture the nation’s attention in a positive way.”
As much as we’d like to think that everyone will be instantly enlightened, the truth is it might not have much of an immediate or noticeable impact.
Really? Ya think?
A huge amount of female journalism consists of demands that society must be reorganized so that the author is considered more sexually attractive. (This is hardly the most egregious specimen of this vast genre.)
I've always thought that Mrs. Obama was a handsome woman, who spends a lot on her four sessions with her personal trainer each week. Still, there are those expressions ....
You'll definitely want to check out the expression on Mrs. Obama's face in the picture that The Root chose to illustrate Michelle's loveliness. Let's just say that when a husband sees his wife staring up at him with that expression, he knows he's in for a world of trouble.
Barack Obama's autobiography ends with his wedding to Michelle. The last line in this self-pitying book is:
"And for that moment, at least, I felt like the luckiest man alive."
The Luckiest Man Alive would seem like a pretty realistic self-description, but, judging from That Look she's apparently giving him in this picture, maybe he's not ...
Basically, we're broke because for the whole decade we've been buying more stuff from abroad than they've been buying from us. So, what can a President do to help us sell more stuff overseas?
Screwing in lightbulbs and filling potholes won't do it. But what will?
Matthew Yglesias denounces the actions of class traitor Sidley Austin, Michelle Obama's old law firm, in using the environmental laws to slow down environmentalists' plans for that SWPL favorite, light rail. (Trolleys without rights-of-way are barely more efficient than buses, but SWPLs can imagine themselves taking a trolley, while they shudder at the thought of riding a bus with all those ... uh, well, you know ...)
One thing law firms do is take cases on a pro bono basis. You get some prestige for doing so, and it helps underscore the legal profession’s self-conception as serving the higher calling of the law. The general idea here, of course, is that you’re supposed to be helping out indigent clients or some kind of do-gooder causes.
Meanwhile, in DC’s Maryland suburbs we’re inching ever closer to actually starting work on the Purple Line light rail. This would connect several destinations that are already served by transit and walkable transit-oriented development, provide transit access to the University of Maryland’s main campus, and also create the possibility of new transit-oriented development at additional stops along the way. It’s a good idea that will help reduce congestion on the Beltway, reduce carbon emissions, and enhance the region’s ability to keep growing in a sustainable manner. Every environmental group in the city is for it. But a group of NIMBYs centered around the town of Chevy Chase, MD and the Columbia Country Club are trying to block it in order to keep the riffraff out and are offering some spurious environmental claims to try to block construction.
They’ve engaged the large DC firm of Sidley Austin to help them in their fight. And Sidley’s doing the work pro bono — for free — as charity. No doubt in part this is because Joseph Guerra is both a partner in the firm and the husband of the woman co-chairing the NIMBY effort. Perhaps some of the firm’s partners are members of the Country Club as well. Who knows? But this is certainly a strange definition of charitable work. They might want to ask some of the people working for the firm on the bottom rungs — the janitors and so forth — whether they really appreciate these kinds of “charitable” efforts to deny poor people any better commuting options than the bus.
Why can’t Sidley Austin figure out that environmental laws are only supposed to slow down bad people, like conservative developers, but not nice liberal people who are trying to build stuff that Matthew Yglesias wants?
Progressives didn’t spend 40 years setting up a vast web of environmental and other land use regulations that make it glacially slow to build anything on either coast in order to hurt progressives. Therefore, environmental laws should _not_ apply to progressives. Any law firm that uses environmental laws to frustrate Matthew Yglesias’s desires is a traitor and should be dealt with. As Lenin said, the eternal question is always “Who? Whom?”
Bulldoze, baby, bulldoze!
You can’t have a bunch of environmentalist red tape holding back Yglesias’s Robert Moses-like ambitions to bulldoze anything standing in the way of his vision of a better future. Now that progressives have power, they must deregulate the environment so nobody can slow down their efforts to save the environment.
Since the cost and the environmental impact of a construction project don't matter anymore as long as Barack Obama is for it, here's a green infrastructure plan expensive enough even for Paul Krugman's taste: The North American Water and Power Alliance (NAWAPA):
Let's generate huge amounts of carbon-free hydroelectric power and turn the western half of the USA green (in the literal sense) by diverting a small fraction of the water from Canada's vast but almost-useless rivers that flow north into the Arctic Ocean. While we're at it, we can turn northern Mexico green, too. (Here's a promotional video from the 1960s.) Thayer Watkins writes:
The North American Water and Power Alliance (NAWAPA) is a project for diverting to the western U.S. and northwestern Mexico water from rivers in Alaska and Canada which now flow into the Arctic Ocean. In addition to providing irrigation water to arid parts of North America NAWAPA would also generate considerable amounts of power and provide some subsidiary benefits such as stabilizing the level of the Great Lakes. The project was formulated by the Los Angeles engineering firm of Ralph M. Parsons Company and got some attention in Congress, particularly from Senator Frank Moss of Utah, but is not politically feasible.
Canadians were outraged that Americans were planning to steal their precious bodily fluids, and were angry when George W. Bush mused about buying water from Canada in 2001. But, that's POT (Pre-Obama Thinking). We don't need that kind of negativism anymore, now that we have Obamagic.
In terms of engineering the project is feasible. A series of dams on the headwaters of the Yukon, Copper, Kootenay, Fraser, Peace, and Columbia Rivers can divert their flows into reservoirs. Included among these is the 500 mile long Rocky Mountain Trench, a natural formation which has 16 times the capacity of Lake Mead on the Colorado River. From the Rocky Mountain Trench the water would flow into Montana and central Idaho. The dams would generate electrical power but not all of it would be marketable. Some of the power would be required to pump the water over some mountains in Idaho to a canal where it would flow south along the border area of Utah and Nevada. Here the water flow would be divided into two branches. One would go southwest to Nevada, California, and northwestern Mexico. The other would go east to Arizona, New Mexico, and Colorado. This is the main element of the project. A subsidiary part would take water from the Peace River by canal to the Great Lakes and thereby link the prairie provinces of Canada with the St. Lawrence Seaway. Other subsidiary elements could link the system to the Pacific Ocean at Vancouver, British Columbia and link Lake Manitoba to the Hudson Bay.
As envisioned by the R. M. Parsons Co. the system would deliver 120 million acre-feet of water annually; 78 million to the U.S., 22 million to Canada, and 20 million to Mexico. According to Parsons this would enable Mexico to triple her irrigated acreage, irrigate an additional 40 million acres in the U.S. and 7 million in Canada. NAWAPA would generate 70 million kilowatts of power; 38 million for the U.S., 30 million for Canada and 2 million for Mexico. Parsons estimates that all this would cost $100 billion in 1964 dollars. In 1989 dollars that would be about $339 billion.
Stanford computer scientist John McCarthy wrote about a decade ago:
We won't need any such grand projects for the foreseeable future, but when and if our descendants need enormous increases in water supply, they can get them, perhaps at expense comparable in relation to per capita GDP to the expense our immediate ancestors spent on water projects. Probably the expense in proportion to the GDP of the region benefitted will not be as great as the 1904 Owens Valley aqueduct was in proportion to the GDP of Los Angeles at the time.
At that time, the population of Los Angeles was 200,000 and the per capita income for the U.S. was $1100. The cost of the project was $23 million. Therefore, it corresponded to 1/10 th of a year's income for the inhabitants of the area. 1/10 th of a year's GDP for the U.S. would come to $800 billion. It doesn't look like we will have to spend that much for increased water supply in the near future, but we'll do it if we have to. ...
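McCarthy's fraction is easy to verify; the figures below are exactly the ones quoted above, and the computation is just arithmetic:

```python
# Sanity-check of the Owens Valley figures quoted above.
population = 200_000        # Los Angeles, circa 1904
per_capita_income = 1_100   # U.S. per capita income, dollars
project_cost = 23_000_000   # aqueduct cost, dollars

regional_income = population * per_capita_income  # one year's income for LA
fraction = project_cost / regional_income

print(f"Regional annual income: ${regional_income:,}")
print(f"Cost as a fraction of a year's income: {fraction:.3f}")
```

The ratio comes out a bit over one-tenth, matching McCarthy's back-of-the-envelope claim.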
Around 1900 people thought in large terms. Recently, it has become fashionable to think small.
The website for the new book by Greg Cochran and Henry Harpending, The 10,000-Year Explosion, includes "deleted scenes" -- chunks of text that were dropped for reasons of space. Some are digressions, such as Henry's encounter with the charging cape buffalo, while others consist of fairly well-known background info, but are worth reading for the level of insight that you won't find in other works. Here's one subsection of a long essay, "Prelude," on the various traits that evolved at some point since humans broke off from apes. I'll just highlight the section on how well humans get along with members of their own sex. Personally, I have two fluffy bunnies living in the backyard, two neutered male rabbits who wouldn't hurt a fly, but we have to keep them separated like the Israelis and Palestinians with a series of fences to keep them from ripping each other to shreds.
Groups of pair bonds - Some primates form durable mating male-female pairs, but the only ape to do so is the gibbon. Other arrangements like harems or troops are more common both among larger primates and among mammals in general.
In a harem there is one reproducing adult of one sex and more than one of the other sex. Gorillas and hamadryas baboons are mostly organized into one-male harems. African Cape Dogs live in one-female harems with a number of males that are related to each other. Troops contain adult reproductive individuals of both sexes, among whom there may be complex competitive games and strategies to achieve access to the other sex. Common baboons live in troops as do chimpanzees. The baboon troop, like troops of most mammals, is predominately a matrilineage, related females, while the males have entered the troop from another troop. Chimpanzee troops, on the other hand, are patrilineages, groups of related males, with females having come from elsewhere. Chimpanzee troops are ordinarily dispersed over a large territory while the more familiar usage of “troop” refers to a group that moves together.
Durable male-female pairs usually live away from other pairs, and when they do join larger groups, they are members of a flock, not involved or minimally involved in social interactions with others of the flock. Almost all the social interaction is between members of the pair. Animals in harems or troops, on the other hand, spend most of their time in same-sex interactions. Ordinarily these would be some variant of social competition for food among females and competition for females among males.
Humans, remarkably, have the ability to maintain durable pair bonds with reproductive exclusivity while living in larger social groups in which most of the day-to-day social interaction is with members of the same sex (Rodseth et al., 1991). Gibbons almost certainly could not do it: males are intolerant of the presence of other males, and females of other females. Something special and new occurred in human evolution that led to our peculiar capacity to maintain pair bonds embedded in larger social groups.
The idea of Obama having a blank check to spend billions for two years on "infrastructure" has liberals fantasizing over all the SWPL projects they'd like to build, such as solar powered magnetic levitation trains. The problem, of course, is that liberals have spent the last 40 years making building anything in the coastal regions of America an extremely slow process, so it's implausible for them to argue that their favorite projects could have non-negligible "stimulus" effect.
So, that's leading liberals to demand that we cut the red tape holding back construction. It's time to deregulate. Bulldoze, baby, bulldoze!
The AP reports:
President George W. Bush told a group of Texas reporters Friday that he regretted immigration policies were not reformed while he was in office.
"I'm very disappointed that it didn't pass," he said in an interview with correspondents from his home state. "I'm very worried about the message that said, 'Republicans are anti-immigrant.'"
Bush said he wanted a comprehensive immigration plan "not for political standing or for Latinos, but because it was best for the country," the Houston Chronicle reported in its online edition Friday.
The outgoing president said that in hindsight he should have pushed his immigration proposal soon after the 2004 election, rather than after partisan squabbling over Social Security began.
... and discovers he has the Bald Gene.
In a long article in the New York Times Magazine, "My Genome, My Self," the author of The Blank Slate recounts all that he has learned about himself from having his genome sampled, which turns out to be unsurprisingly modest.
The most prominent finding of behavioral genetics has been summarized by the psychologist Eric Turkheimer: “The nature-nurture debate is over. . . . All human behavioral traits are heritable.” By this he meant that a substantial fraction of the variation among individuals within a culture can be linked to variation in their genes. Whether you measure intelligence or personality, religiosity or political orientation, television watching or cigarette smoking, the outcome is the same. Identical twins (who share all their genes) are more similar than fraternal twins (who share half their genes that vary among people). Biological siblings (who share half those genes too) are more similar than adopted siblings (who share no more genes than do strangers). And identical twins separated at birth and raised in different adoptive homes (who share their genes but not their environments) are uncannily similar.
Behavioral geneticists like Turkheimer are quick to add that many of the differences among people cannot be attributed to their genes.
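The twin comparisons in the passage above are the raw material for a classic back-of-the-envelope estimator. As a sketch (the correlations below are illustrative placeholders, not figures from the article), Falconer's formula approximates heritability as twice the gap between identical-twin and fraternal-twin correlations:

```python
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Falconer's estimate: h^2 = 2 * (r_MZ - r_DZ).

    r_mz: trait correlation among identical (monozygotic) twins
    r_dz: trait correlation among fraternal (dizygotic) twins
    """
    return 2.0 * (r_mz - r_dz)

# Illustrative values, chosen only to show how the estimator behaves.
h2 = falconer_heritability(r_mz=0.85, r_dz=0.60)
print(f"Estimated heritability: {h2:.2f}")
```

The logic is exactly the one Turkheimer invokes: if identical twins resemble each other much more than fraternal twins do, the excess resemblance is attributed to the extra shared genes.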
But not all variation in nature arises from balancing selection. The other reason that genetic variation can persist is that rust never sleeps: new mutations creep into the genome faster than natural selection can weed them out. At any given moment, the population is laden with a portfolio of recent mutations, each of whose days are numbered. This Sisyphean struggle between selection and mutation is common with traits that depend on many genes, because there are so many things that can go wrong.
Penke, Denissen and Miller argue that a mutation-selection standoff is the explanation for why we differ in intelligence. Unlike personality, where it takes all kinds to make a world, with intelligence, smarter is simply better, so balancing selection is unlikely.
Is smarter simply better? If it takes, say, bigger brains, the answer isn't terribly clear. Analogously, Intel assumed for about 15 years that CPU chips with faster clock speeds were simply better. But the struggle to break the 4.0 gigahertz barrier proved overwhelming, so Intel gave up and went in different directions in recent years; most chips sold now seem to run between 2.0 and 3.0 gigahertz, although performance keeps improving.
Keep in mind that Intel has big advantages over natural selection in getting from one performance peak to another. For example, if Intel decides that its strategy of single chips with ever faster clockspeeds is heading toward a deadend, it can simultaneously start working on an R&D project for double core and quad-core chips with moderate clockspeeds. At first, the new type of CPUs won't be as good as the old type, but it doesn't have to sell the beta versions of the changed design. It can keep making them and throwing them away until the new style chips are as good as the competition's old style chips.
In contrast, natural selection doesn't provide you much of a laboratory in which to putter around while you're working the kinks out of your next model while your factory keeps churning out the satisfactory current model.
Similarly, bigger brains require more food. They make you more likely to tip over and hurt yourself. They require your mother to have a wider pelvis so she won't die in childbirth, which makes her a slower runner.
But intelligence depends on a large network of brain areas, and it thrives in a body that is properly nourished and free of diseases and defects. Many genes are engaged in keeping this system going, and so there are many genes that, when mutated, can make us a little bit stupider.
At the same time there aren’t many mutations that can make us a whole lot smarter. Mutations in general are far more likely to be harmful than helpful, and the large, helpful ones were low-hanging fruit that were picked long ago in our evolutionary history and entrenched in the species. One reason for this can be explained with an analogy inspired by the mathematician Ronald Fisher. A large twist of a focusing knob has some chance of bringing a microscope into better focus when it is far from the best setting. But as the barrel gets closer to the target, smaller and smaller tweaks are needed to bring any further improvement.
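Fisher's microscope analogy can be made concrete with a small Monte Carlo sketch of his geometric model; this is a generic illustration of the principle, not anything from the article. A phenotype sits some distance from an optimum in many dimensions, and a random mutation of a given size either moves it closer or farther:

```python
import math
import random

def prob_beneficial(distance, step, n=25, trials=20_000, seed=0):
    """Monte Carlo estimate of the chance that a random mutation of size
    `step` moves an n-dimensional phenotype closer to an optimum at the
    origin, starting from `distance` away (Fisher's geometric model)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Random direction on the n-sphere, scaled to length `step`.
        v = [rng.gauss(0, 1) for _ in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [step * x / norm for x in v]
        # Start at (distance, 0, ..., 0); did the move reduce the distance?
        new_sq = (distance + v[0]) ** 2 + sum(x * x for x in v[1:])
        if new_sq < distance ** 2:
            hits += 1
    return hits / trials

# Small twists of the knob help close to half the time;
# very large ones essentially never do.
print(prob_beneficial(distance=1.0, step=0.05))
print(prob_beneficial(distance=1.0, step=1.5))
```

Once the step size exceeds twice the distance to the optimum, improvement becomes geometrically impossible, which echoes the claim that the big helpful mutations were low-hanging fruit picked long ago.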
The Penke/Denissen/Miller theory, which attributes variation in personality and intelligence to different evolutionary processes, is consistent with what we have learned so far about the genes for those two kinds of traits. The search for I.Q. genes calls to mind the cartoon in which a scientist with a smoldering test tube asks a colleague, “What’s the opposite of Eureka?” Though we know that genes for intelligence must exist, each is likely to be small in effect, found in only a few people, or both. In a recent study of 6,000 children, the gene with the biggest effect accounted for less than one-quarter of an I.Q. point. The quest for genes that underlie major disorders of cognition, like autism and schizophrenia, has been almost as frustrating. Both conditions are highly heritable, yet no one has identified genes that cause either condition across a wide range of people. Perhaps this is what we should expect for a high-maintenance trait like human cognition, which is vulnerable to many mutations.
The hunt for personality genes, though not yet Nobel-worthy, has had better fortunes. Several associations have been found between personality traits and genes that govern the breakdown, recycling or detection of neurotransmitters (the molecules that seep from neuron to neuron) in the brain systems underlying mood and motivation....
Even if personal genomics someday delivers a detailed printout of psychological traits, it will probably not change everything, or even most things. It will give us deeper insight about the biological causes of individuality, and it may narrow the guesswork in assessing individual cases. But the issues about self and society that it brings into focus have always been with us. We have always known that people are liable, to varying degrees, to antisocial temptations and weakness of the will. We have always known that people should be encouraged to develop the parts of themselves that they can (“a man’s reach should exceed his grasp”) but that it’s foolish to expect that anyone can accomplish anything (“a man has got to know his limitations”). And we know that holding people responsible for their behavior will make it more likely that they behave responsibly. “My genes made me do it” is no better an excuse than “We’re depraved on account of we’re deprived.”
Many of the dystopian fears raised by personal genomics are simply out of touch with the complex and probabilistic nature of genes. Forget about the hyperparents who want to implant math genes in their unborn children, the “Gattaca” corporations that scan people’s DNA to assign them to castes, the employers or suitors who hack into your genome to find out what kind of worker or spouse you’d make. Let them try; they’d be wasting their time.
The real-life examples are almost as futile. When the connection between the ACTN3 gene and muscle type was discovered, parents and coaches started swabbing the cheeks of children so they could steer the ones with the fast-twitch variant into sprinting and football. Carl Foster, one of the scientists who uncovered the association, had a better idea: “Just line them up with their classmates for a race and see which ones are the fastest.” Good advice. The test for a gene can identify one of the contributors to a trait. A measurement of the trait itself will identify all of them: the other genes (many or few, discovered or undiscovered, understood or not understood), the way they interact, the effects of the environment and the child’s unique history of developmental quirks.
Mining my Comments section, here's Dennis Dale of Untethered's take on the critically-lauded Jonathan Demme movie starring Anne Hathaway, which ends with an interminable wedding celebration featuring various world music acts:
Dennis Dale said...
"Rachel" is an expression of the self-consciousness that is now central to the soluble identity of a certain cultural set, affluent and influential (affluential?) liberal white Americans.
Here we have the most vital expression of any given culture, the marriage ceremony, and it is imagined as a palimpsest featuring any culture but one's own. The effect is not broadening, as they imagine, but deadening.
Toward the end of this interminable mess (how much longer is bad lighting and hand-held cinematography going to pass for naturalistic authenticity with our hopeless critical class?), during the wedding party, after we've been subjected to this cavalcade of pretentious multicultural references, I think they trotted out some Mardi Gras dancers. I refuse to believe they're not joking.
Think of two classic films--the Godfather and The Deer Hunter--and the remarkable wedding scenes that anchor their first acts, defining a community in a given time and place, and compare them to this film, which expresses nothing so much as the profound lack of confidence that these people brandish as proof of their moral superiority.
Compare those exuberant scenes to the prolonged shambling of this (Robyn Hitchcock and his friggin' lute?! Are you kidding me?!).
And all the scrupulous inclusion only adds up to condescension in the end. Note the black people in the film--smiling caricatures, for all--or because of--the effort to portray them counter-stereotype. The groom is so well-mannered and mild that he's barely there. And don't get me started on Bill Irwin's interpretation of a kind father as an oozing nipple.
But Anon is wrong--Anne Hathaway rules, freakishly oversized eyes, scrawny neck and all. She superbly portrays youthful self-destruction. They should have done it right--there's no need for a dead sibling back story. People set out to self-destruct for no good reason all the time--and that would have been a fitting synecdoche for the setting of the story, a people self-destructing for no good reason.
Here's another outtake from the upcoming book The 10,000-Year Explosion: How Civilization Accelerated Human Evolution by Greg Cochran and Henry Harpending. If the stuff that wasn't good enough to make the book is this good, how good is the actual book going to be?
The first Neanderthal skeleton recognized as such was found in a limestone quarry in the Neander Valley in Germany in 1856. At first, this rather odd skeleton was thought to be that of some medieval guy crippled by arthritis (or a Celt, or a diseased Cossack): it was only identified as a representative of an extinct type of human somewhat later. This other human race was named after the site: spelling reform later changed its name to Neandertal and eventually most paleontologists followed, driven by obscure interdepartmental struggles. We’re sticking with ‘Neanderthal’, though: an archaic spelling seems only appropriate for a vanished species.
We know quite a bit about the Neanderthals, more in fact than we know about our anatomically modern African ancestors of that period. In part this is because physical conditions in Western Europe favored preservation so that there really are more fossil remains. In addition, a high level of general education meant that farmers and quarrymen were more likely to call in a professor when they found cave paintings or an odd-looking skeleton: that is, more likely in nineteenth-century Germany or France than in nineteenth-century Africa or China. There were plenty of professors close by, since Neanderthals left their remains conveniently close to famous universities and five-star restaurants. Many Neanderthal skeletons, sites, and artifacts have been found – remains from a few hundred individuals, in sharp contrast to the handful of known African human fossils from the same period.
The Neanderthals had big brains (averaging about 1500 cubic centimeters, noticeably larger than those of modern people) and a technology like that of their anatomically modern contemporaries in Africa, but were quite different in a number of ways: different physically, but also socially and ecologically. Neanderthals were cold-adapted, with relatively short arms and legs in order to reduce heat loss - something like Arctic peoples today, only much more so. Considering that the climate the Neanderthals experienced was considerably milder than the high Arctic (more like Wisconsin), their pronounced cold adaptation suggests that they may have relied more on physical than cultural changes. Of course they spent at least six times as many generations in the cold as any modern human population has, and that may have had something to do with it as well.
We don’t yet know for sure, but it seems likely that, as part of their adaptation to cold, Neanderthals were furry. Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mother’s fur as infants. Modern humans don’t have these ridges, but Neanderthals do. Moreover, we know that humans can become furry with very simple genetic changes, since there are a few people working in the circus in which such a change has already taken place.
They were chinless and had big honking noses, heavy brow ridges, a pulled-forward face, and long, low skulls that tended to bulge outwards at the sides. The body form differences are seen in children – and so they were innate, rather than being a consequence of their way of life. They were heavily built and muscular, judging from their skeletons, which had larger areas of muscle attachment than those seen in people today or in their contemporary African cousins. This means that they were stronger than us, probably much stronger. You could think of them as being born wrestlers.
Those conclusions about Neanderthal physique come from studying fossils, but modern methods have allowed researchers to draw less obvious conclusions as well. ...
Judging from the isotopic composition of their remains, Neanderthals were meat-eaters, pure top carnivores, comparable to lions or wolves. This is very different from most contemporary hunter-gatherers, who usually depend more on plant foods than meat.
It also seems that they ate almost no fish, which is somewhat surprising, considering that a number of the Neanderthal sites have been along rivers with strong salmon runs.
Like other top carnivores, Neanderthals were thin on the ground. The typical Neanderthal site has relatively more remains of cave bears than later sites occupied by modern humans, which suggests that Neanderthals were scarcer than later humans. We think that there may have been as few as 30,000 of them in all of Europe.
Neanderthal sites are generally found in caves and beneath rock overhangs, which functioned as shelters and also tended to preserve their remains. We have found signs of open-air camps as well, but any structures built seem to have been very simple. Such camps may have been common, but were far less likely to be preserved than sites in caves and rock shelters: as so often in archaeology, differences in preservation can completely obscure the original distribution of things.
Neanderthals used rather sophisticated stone tools, but used almost no bone, ivory, or shell. They had fewer types of tools than their successors. Their “Mousterian” tools (named for the southwestern French site Le Moustier) consisted of carefully shaped flake tools and small hand axes, almost always made from local materials. We find Mousterian tools over vast areas of Europe and western Asia, but there is little variation in space or time, which supports the general impression that their capacity to innovate was low. François Bordes, a famous French archaeologist, said their technology consisted of “beautiful tools made stupidly” - by rote, or perhaps by instinct.
There are indications that they often used those stone tools to make things out of wood, but only a few such wooden objects have been preserved. We find awls that could pierce animal hides, but no bone needles with eyes, which their successors used to make tailored clothing. There is evidence that they processed hides, presumably for clothing, but those hides must have been used as blankets or ponchos rather than coats or parkas. That kind of clothing was good enough to let them survive in Ice Age Europe, but evidently not good enough to allow the Neanderthals to settle the high Arctic: ultimately this also kept them out of the Americas. Their front teeth show an unusual pattern of wear – apparently they were used as a third hand or vise, or perhaps to prepare animal skins. Somewhat similar patterns of wear have been observed in peoples like Eskimos that prepare hides by chewing.
Neanderthals had fire, and probably could not have settled ice-age Europe without it. However, they didn’t do anything fancy with it: there were no specialized hearths and there is no evidence that they used lamps.
Neanderthals are the first humans known to have buried their dead, but there is no clear evidence of ceremony or ritual in those burials. We don’t find weapons or decorative objects associated with those graves as we often do with the graves of modern humans. It may be that burial was for them more a way of disposing of unpleasant remains than a ritual occasion. It may have been more like flushing a goldfish down the toilet.
We know that Neanderthals hunted big game (red deer, European bison, sometimes mammoths and rhinos) and took big risks in the process, judging from their many healed fractures. The injury pattern is like that of bronco riders, as documented by Eric Trinkaus. At root, these high risks were a consequence of their lack of projectile weapons, which allow hunters to bring down big game without getting dangerously close – that and a lack of any other safe way of making a living. Neanderthals used stabbing spears and were probably ambush hunters. Using this strategy, they had to get up close and personal with desperate animals that outweighed them several fold, a good way to get hurt.
Since they were so often injured, they had to help each other. That's the only way in which they could have survived while recovering from those serious injuries. You see a similar pattern in some other cooperative-hunting species such as lions: injured members of the pride manage to feed off the kills of others while they recover. We see lots of healed injuries in saber-tooth tigers as well, whose hunting pattern of stabbing from ambush was rather similar to that of Neanderthals. Come to think of it, their heavy, almost bear-like build (compared to that of the other big cats) was also similar. In some cases, Neanderthals carried this cooperation very far, providing care that allowed permanently crippled individuals to reach advanced ages. The most famous example of this is the skeleton of a forty-something man found in Shanidar, in northern Iraq. His right arm was withered and had suffered multiple fractures, while the lower arm and hand had been lost. He had a crippled and withered right leg. In addition, he had suffered a crushing blow to the face that likely left him blind in one eye. All of these were long-healed injuries. Clearly, this guy had been around the block – twice, on his face.
Groups of Neanderthals were able to kill big game, but that would have been far harder for individuals. As members of the group, individuals also received the Paleolithic equivalent of health insurance, a necessity in their kind of high-risk hunting. Because of group efficiency in hunting and the high degree of within-group cooperation, membership in a Neanderthal band or tribe was valuable. It would have been almost impossible to survive outside such a group. As Bill Hamilton observed, such arrangements are vulnerable to free-riders, individuals that take advantage of the benefits but don’t pull their weight – which in this case would have meant avoiding hunting and its risks. A high degree of within-band relatedness would have mitigated this tendency – due to kin selection, the principle that behaviors that cost the individual but help close relatives can be favored by selection. This suggests that Neanderthal bands may have been reluctant to accept outsiders, especially males, if we assume that males had the primary responsibility for hunting. Unrelated newbies would have had the most to gain by shirking dangerous duties.
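The kin-selection logic here is just Hamilton's rule: a costly behavior can spread when relatedness times benefit exceeds the cost (rb > c). A minimal sketch, with purely illustrative fitness numbers of our own choosing:

```python
def hamilton_favored(r, b, c):
    """Hamilton's rule: a costly act is favored by kin selection
    when relatedness * benefit to kin exceeds the actor's cost."""
    return r * b > c

# Illustrative numbers only: suppose a risky hunt costs the hunter
# c = 1 unit of expected fitness and delivers b = 4 units to the
# beneficiary who eats the meat.
print(hamilton_favored(0.5, 4, 1))    # full sibling (r = 0.5): True
print(hamilton_favored(0.125, 4, 1))  # first cousin (r = 0.125): False
```

On these toy numbers, taking the risk for a band of close kin pays off, while doing the same for distant relatives or strangers does not – which is exactly why an unrelated newcomer is the natural free-rider.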
Hamilton pointed out that, among social carnivores, the would-be immigrant often has to go through a difficult probationary period without necessarily succeeding – we see a similar pattern in some recent hunter-gatherers. There would have to have been a way in which new individuals could join the tribe, in order to avoid dangerous levels of inbreeding. It may be that only females changed bands, which is apparently the case in chimpanzees and is the most common pattern in humans.
Trends of this sort may have existed in anatomically modern humans as well, but the high risks associated with Neanderthals’ specialized big-game hunting may have taken things much further. We expect that they were more cooperative than our African ancestors and more clannish. This might have interfered with inter-band relations and made the development of trade and other inter-band social interactions more difficult.
If you take too many chances in the process of making a living, you'll get yourself killed before you manage to raise a family. Therefore there is a maximum sustainable risk per calorie acquired from hunting. If the average member of the species incurs too much risk, more than that sustainable maximum, the species goes extinct. The Neanderthals must have come closer to that red line than anatomically modern humans in Africa. Risks were particularly high because the Neanderthals seem to have had no way of storing food – they had no drying racks or storage pits in frozen ground like those used by their successors. Think of it this way: storage allows more complete usage of a large carcass such as a bison that might weigh over a thousand pounds – it wouldn’t be easy to eat all of that before it went bad. Higher utilization - using all of the buffalo - drops the risk per calorie.
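The storage argument is simple arithmetic. A sketch, where the carcass calories and utilization fractions are rough assumptions of ours, not figures from the text:

```python
def risk_per_calorie(risk_per_hunt, carcass_kcal, fraction_used):
    """Hunting risk amortized over the calories actually consumed."""
    return risk_per_hunt / (carcass_kcal * fraction_used)

# Assume a thousand-pound bison yields very roughly 1.5 million kcal.
# Without storage, suppose a band eats a third before the rest spoils;
# with drying racks or frozen-ground pits, suppose nearly all is used.
no_storage = risk_per_calorie(1.0, 1_500_000, 1 / 3)
with_storage = risk_per_calorie(1.0, 1_500_000, 0.95)

# Fuller utilization spreads the same per-hunt risk over more calories.
print(no_storage / with_storage)  # ~2.85x riskier per calorie without storage
```

Under these made-up numbers, a band without storage pays nearly three times the risk per calorie for the same kill, pushing it that much closer to the unsustainable red line.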
Since women in Africa were probably gathering vegetable foods, men there didn't have to produce as much food or produce it as steadily, which meant that they could choose game animals that were safer but less abundant. And that's what they did in Africa's Middle Stone Age (MSA). They went after relatively uncommon but mild-mannered eland, rather than abundant, deadly dangerous Cape buffalo. And as a corollary, anatomically modern humans in Africa didn't have as heavy a build as Neanderthals - they didn't need it.
Although Neanderthals must have had very high levels of within-group cooperation, they were no angels: they had a weakness for long pork. As we said before, they may have experienced evolutionary pressures favoring clannishness or even hostility to outsiders. That's natural: reduced competition at one level allows more competition at a higher level. If the members of some ethny could 'all just get along', they could conquer the world, and likely would. We have found clear-cut evidence of cannibalism at several Neanderthal sites. At Krapina, every long Neanderthal bone found had been split open for marrow.
Like other early humans, Neanderthals were relatively uncreative; their tools changed very slowly and they show no signs of art, symbolism, or trade. Their brains were large and had grown larger over time, in parallel with humans in Africa, but we really have no idea what they did with them. Since brains are metabolically expensive, natural selection wouldn't have favored an increase in brain size unless it increased fitness, but we don't know what function those big brains served. Usually people explain that those big brains are not as impressive as they seem, since the brain-to-body weight ratio is what’s really important, and Neanderthals were heavier than modern humans of the same height.
You may wonder why we normalize brain size by body weight. We wonder as well.
Among less intelligent creatures, such as amphibians and reptiles, most of the brain is busy dealing with a flood of sensory data. You’d expect that brain size would have to increase with body size in some way in order to keep up. If you assume that the key is how much surface the animal has, in order to monitor what’s causing that nagging itch and control all the muscles needed for movement, brain size should scale as the 2/3rds power of weight. If an animal has a brain that’s bigger than predicted by that 2/3rds power scaling law, then maybe it’s smarter than average. That argument works reasonably well for a wide range of species, but it can’t make sense for animals with big brains. In particular it can’t make sense for primates, since in that case we know that most of the brain is used for purposes other than muscle control and immediate reaction to sensation. Look at it this way - if dividing brain volume by weight is a valid approach, Nero Wolfe must be really, really stupid.
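The scaling argument can be put in a few lines of code. The constant of proportionality and the body weights below are arbitrary illustrations, not empirical fits:

```python
def expected_brain_cc(body_kg, k=60.0):
    """Expected brain volume under a 2/3-power scaling law: k * W^(2/3).
    The constant k is purely illustrative."""
    return k * body_kg ** (2.0 / 3.0)

def encephalization_ratio(brain_cc, body_kg, k=60.0):
    """Actual over expected brain size; > 1 means 'bigger-brained than
    the scaling law predicts' under this (admittedly crude) argument."""
    return brain_cc / expected_brain_cc(body_kg, k)

# Unlike simple brain/weight division, a 2/3-power law doesn't
# automatically penalize a heavily built species: doubling body weight
# raises the expected brain size by only about 59% (2^(2/3) ≈ 1.587).
print(expected_brain_cc(80) / expected_brain_cc(40))
```

This is why the choice of normalization matters for the Neanderthal comparison: the heavier the build, the more a straight brain-to-body-weight ratio understates the brain.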
We think that Neanderthal brains really were large, definitely larger than those of people today. This doesn’t necessarily mean that they were smarter, at least not as a culture. The archaeological record certainly indicates that they were not, since their material culture was definitely simpler than that of their successors. In fact, they may have been relatively unintelligent, even with their big brains. Although brain size certainly is correlated with intelligence in modern humans, it is not the only factor that affects intelligence. By the way, you may have read somewhere (The Mismeasure of Man) that brain volume has no relationship to intelligence, but that’s just a lie.
One paradoxical possibility is that Neanderthals lacked complex language and so had to be smart as individuals in order to learn their culture and technology, while that same lack severely limited their societal achievements. Complex language of the type we see in modern humans makes learning a lot easier: without it, learning to create even Mousterian tools may have been difficult. In that case, individuals would have to repeatedly re-invent the wheel (so to speak) while there would have been little societal progress.
It could also be that Neanderthal brains were less powerful than you’d expect because there just weren’t enough Neanderthals. That may sound obscure, but bear with us. The problem is that evolution is less efficient in small populations, in the same way that any statistical survey – polls, for example – becomes less accurate with fewer samples. Natural selection is pretty good at eliminating a defective gene when its disadvantage is significantly greater than the inverse of the population size. When the disadvantage is smaller than that, the defective gene has a reasonable probability of reaching high frequency by drift. It can even become universal in that population. This tendency is insignificant in large populations, but it can lead to problems in small ones, as more and more slightly deleterious mutations accumulate. There is a countervailing tendency – the generation of favorable mutations, which are likely to spread – but that tendency becomes weaker and weaker as the population becomes smaller. Thus, over the long term, a population that is too small is likely to go extinct for purely genetic reasons, if some other disaster doesn’t strike first. This is an issue that concerns conservationists who are trying to maintain endangered species such as the whooping crane or Florida panther.
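The drift-versus-selection threshold can be demonstrated with a toy Wright-Fisher simulation. The population sizes, selection coefficient, and starting frequency below are arbitrary choices for illustration:

```python
import random

def fixation_fraction(pop_size, s, p0=0.1, reps=150, seed=42):
    """Fraction of replicate populations in which a mildly deleterious
    allele (fitness 1 - s) drifts all the way to fixation."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(reps):
        p = p0
        while 0.0 < p < 1.0:
            # Selection nudges the expected frequency downward...
            p_sel = p * (1 - s) / (p * (1 - s) + (1 - p))
            # ...then binomial sampling of 2N gene copies adds drift.
            copies = sum(rng.random() < p_sel for _ in range(2 * pop_size))
            p = copies / (2 * pop_size)
        fixed += (p == 1.0)
    return fixed / reps

# When the disadvantage s is small relative to 1/(2N), drift dominates
# and the bad allele fixes fairly often; in the larger population the
# same allele is almost always purged.
small = fixation_fraction(pop_size=20, s=0.01)   # 2Ns = 0.4
large = fixation_fraction(pop_size=150, s=0.01)  # 2Ns = 3.0
print(small, large)
```

The same selection coefficient that is effectively invisible to selection in the small population is reliably eliminated in the larger one, which is the sense in which evolution is "less efficient" when numbers are low.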
Neanderthals were not so rare as to risk extinction by genetic load. But the same argument has other implications. Even if a population is big enough for long-term survival, it may still suffer some genetic load. This would matter most for extremely complicated adaptations that relied on precise action of many genes: the more complicated the adaptation, the more vulnerable it would be to this kind of genetic sand in the gears. As it happens, the most complicated human adaptation is the brain, and one might expect that it would show the greatest vulnerability to such problems. There is some direct evidence of this, concerning a different kind of mutation load. Children whose parents are closely related, first cousins or closer, are significantly more likely to have two copies of deleterious recessive mutations – and their IQ is affected to a greater extent than other traits such as height. We think that the long-term effective population size of Neanderthals was less than that of anatomically modern humans, since Africa was less affected by the ice ages, which at their worst made most of Europe uninhabitable – so Neanderthals may have had more problems with genetic load. Because of this, Neanderthals may have had less efficient brains than their anatomically modern contemporaries or humans today.
Our favorite hypothesis is that Neanderthals and other archaic humans had a fundamentally different kind of learning than moderns. One of the enduring puzzles is the near-stasis of tool kits in early humans - as we have said before, the Acheulean hand-axe tradition lasted for almost a million years and extended from the Cape of Good Hope to Germany, while the Mousterian lasted for a quarter of a million years. Somehow these early humans were capable of transmitting a simple material culture for hundreds of thousands of years with little change. More information was transmitted to the next generation than in chimpanzees, but not as much as in modern humans. At the same time, that information was transmitted with surprisingly high accuracy. This must be the case, since random errors in transmission would have caused changes in those tool traditions, resulting in noticeable variation over space and time – which we do not see.
It looks to us as if toolmaking in those populations was, to some extent, innate: genetically determined. Just as song birds are born with a rough genetic template that constrains what songs are learned, early humans may have been born with genetically determined behavioral tendencies that resulted in certain kinds of tools. Genetic transmission of that information has the characteristics required to explain this pattern of simple, near-static technology, since only a limited amount of information can be acquired through natural selection, while the information that is acquired is transmitted with very high accuracy.
A movie likely to be overlooked in the Oscar nominations is the fall cowboy flick, "Appaloosa," but Viggo Mortensen deserves serious Best Supporting Actor consideration. Here's my review in The American Conservative:
The bald and square-jawed actor Ed Harris has played American heroes and psycho killers since first drawing notice as astronaut John Glenn in 1983's "The Right Stuff." He's now written and directed "Appaloosa," an amiable Western about masculine camaraderie and honor adapted from the book by Robert B. Parker, the genre novelist who created Spenser, the Boston private eye. "Appaloosa" furnishes Harris and Viggo Mortensen (the King in "The Return of the King") with plenty of wry lines for their portrayals of itinerant lawmen in the New Mexico of the 1880s.
Fish do not feel wet, we are told (although on what authority, I cannot say), and cowboys and Indians movies once felt no more awkward than cops and robbers films do today. Westerns were then less a genre than a natural, default mode. In the early 1970s, however, urban crime dramas, such as "The French Connection" and "The Godfather" replaced Westerns as the norm. The Western has since become a highly self-conscious genre, one almost immobilized by the weight of its pre-1970 cinema history.
As an actor, however, Harris appears unburdened by all the film school baggage the genre has accumulated. The straightforward "Appaloosa" provides two outstanding roles and sundry old-fashioned pleasures.
By churning out countless cowboy movies, Hollywood had helped enshrine the idea that America was built by frontier settlers. The decline of the Western coincided with the rise in self-consciousness of the descendants of Ellis Island immigrants. By 1970, the grandchildren of Ellis Island wished to assert a new vision. America, their movies implied, was built not by pioneers, but by Catholic and Jewish immigrants, especially the gangsters and policemen of the big cities.
Thus, Martin Scorsese spent over 30 years and more than $100 million to film the 1928 book Gangs of New York to push his mobocentric theory of American history back into the mid-19th Century. The tagline for his movie was "America Was Born in the Streets."
In "Appaloosa," Harris portrays the marshal, a man whose gun hand gets steadier the more the adrenalin flows. He's honest, courageous, professional (he always reloads his six-shooter instantly after killing a bad guy -- you never know when you might need to shoot another one), and perhaps not quite right in the head. Mortensen is his deputy, better educated (a West Point grad), but content to follow his boss's lead because the marshal's slightly demented heroism provides him with a moral compass.
Both Westerns and Urbans offer promising plots for movies because they depict a Hobbesian world where life is full of interest. Modern crime movies are about the grim business of maintaining order. Westerns, in contrast, tend to be sunnier because they are about establishing order, forging a legitimate monopoly on violence.
The burghers of Appaloosa hire the pair to bring the law to their dusty town terrorized by the gang of a rich rancher turned brigand (played by Jeremy Irons, using Daniel Day-Lewis's I-drink-your-milkshake Mid-Atlantic accent). Harris and Mortensen pin on their silver stars, ask a few hotheads to come quietly, shoot those who won't, and soon order is instituted.
Then, disorder arrives on the train in the comely form of a tightly corseted widow, Renée Zellweger of "Chicago." The actress endures a lot of flak for her scrunched-up facial features, but she's well cast here as a seemingly refined lady. The widow is looking for a Wild West town with such a high male-female ratio that nobody will notice she's not Lillie Langtry while she's on the prowl for the reigning alpha male.
The deputy (Viggo) is immediately smitten by her, but she doesn't notice him because he's only a beta. When the marshal (Ed) briefly goes off his rocker and brutally beats a harmless barfly for using vulgar language in front of a lady, he wins her heart and they quickly marry. To her disappointment, that savage moment proves anomalous. Mostly, the marshal is good at keeping the peace. Bored, the missus starts looking for trouble, which, quickly enough, finds her.
Westerns usually have happy, yet bittersweet, endings. The law-enforcing man of violence triumphs, making the settlement finally safe for children and schoolmarms. The tamed town no longer needs a hero, so he rides off into the sunset, obsolete but majestic.
Ed Harris isn't the most expert of directors, but his chemistry with Mortensen overcomes the occasionally off-kilter editing and inadequate score, making "Appaloosa" the best traditional Western since Kevin Costner's "Open Range."
Rated a soft R for some violence and language.
Hollywood likes to squeeze a little more milk out of the DVD cow by occasionally re-releasing an old movie as an (inevitably longer) "Director's Cut." Sadly, we never get to buy a shorter "Editor's Cut." With luck, director Jonathan Demme's "Rachel Getting Married" will be the first. Buried under more than an hour of Demme's Sixties noodling is a nifty sixty-minute family drama.
Demme, who was born in 1944 (in between George Harrison and Keith Moon), was a sort of idiot savant music video genius, who in 1984 made the best ever rock concert movie, Talking Heads' "Stop Making Sense." His 1986 masterpiece "Something Wild" incorporated the nascent "world music" trend delightfully. Unfortunately, the title "Stop Making Sense" proved prophetic. Demme's shambolic 1992 Academy Award acceptance speech for "Silence of the Lambs" may be the most incomprehensible yet.
As Demme's musical-visual gifts dimmed, he turned to "liberal humanist" (i.e., boring) message movies such as "Philadelphia," in which Tom Hanks proves that homophobia caused the AIDS epidemic (rather than, say, industrial-scale gay promiscuity). After Demme's useless remakes of "Charade" in 2002 and "The Manchurian Candidate" in 2004, the industry seems to have concluded that he doesn't have enough brain cells left to handle a big production. Thus, the low budget "Rachel Getting Married" looks like an amateur wedding video. Half the film consists of Demme's not-as-hip-as-they-used-to-be friends improvising tedious toasts and mediocre music.
The movie's better half stars a charismatic Anne Hathaway (a heretofore-bland leading lady whose dark eyebrows made most of the impression in "The Devil Wears Prada") as Kym, an attentionaholic part-time model turned full-time drug addict who is furloughed from a posh rehab clinic for her sister's wedding. Exactly as her levelheaded sister Rachel dreads, Kym's self-destructive antics enthrall the multicultural throngs crowding the grounds of their father's Connecticut estate to prepare for Rachel's big day on which the Reform rabbi is to marry her to a tall, gentlemanly black man from Hawaii.
The highlight of the ceremony is the groom singing his bride a Neil Young ballad. White liberal critics have gone nuts over "Rachel" because the interracial marriage reminds them of a certain black Hawaiian's promise that promoting "mutual understanding" is "in my DNA." I fear, though, that even electing Obama President won't get many black guys to understand the appeal of whiny Canadian folk rockers from the Sixties.
First-time screenwriter Jenny Lumet named the groom "Sidney." She is presumably referencing both Sidney Poitier in Stanley Kramer's "Guess Who's Coming to Dinner," and her father, Sidney Lumet, director of 1957's "Twelve Angry Men," one of Kramer's successors as a liberal warhorse.
Various shocking revelations about Kym's culpability in the death, a decade before, of their little brother ensue, culminating in a confrontation with her mother (1980s legend Debra Winger of "An Officer and a Gentleman" making one of her myriad, but still welcome, comebacks). "Rachel Getting Married" has a decent little plot if you like upscale suburban family tragedies in the tradition of "Ordinary People." Lumet handles the disclosures about the death of the child realistically and effectively. Rather than build up to stagey moments, jagged shards of information are blurted out before you can prepare your emotional defenses.
Still, a more entertaining screenplay could be written about the star's off-screen misadventures. Hathaway was in the news in June when the FBI hauled away her suave Italian boyfriend, Raffaello Follieri. Outfitted with clerical cassocks and a claim to be the Vatican's chief financial officer, Follieri had wormed his way into a $100 million deal with Bill Clinton and Ron Burkle to sell off Roman Catholic churches in America to pay for sex scandal settlements. On a rented yacht in Montenegro, the bipartisan cute couple also hosted the 70th birthday party of John McCain.
An equally entertaining movie could be made about the real-life Lumet sisters (who are granddaughters of famed jazz vocalist and beauty Lena Horne). When their dad received his Lifetime Achievement Oscar in 2005, screenwriter Jenny, the sensibly dressed old-fashioned leftist, had the global television spotlight stolen from her by the startling new cleavage of her sister Amy, a would-be model and 1992 National Review contributor ("Baby Cons of America, unite: You have nothing to lose but your parents' guilt.") Interestingly, Amy Lumet's marriage to hard-partying conservative satirist P.J. O'Rourke broke up about when she is said to have worked for John McCain.
Now, Jenny / Rachel has taken sibling rivalry to a new level.
Rated R for language and brief sexuality.
Greg Cochran and Henry Harpending now have a website up for their new book The 10,000 Year Explosion: How Civilization Accelerated Human Evolution.
They've posted four outtakes from the book that didn't make the final draft for reasons of length. Here's part of a section intended to help readers understand what it must have been like for early humans to hunt big game with just spears:
Probably most of our readers don’t have personal experience with old-fashioned, Pleistocene-style big game hunting. The only place in which it is still possible – not for much longer, at that – is Africa, where the big game had a chance to adapt as mankind gradually became formidable hunters and thus managed to survive until today. Without that experience, it’s hard to realize how remarkable Neanderthals were, how difficult hunting bison and elk with thrusting spears must have been. It’s not easy to appreciate the risks stone-age hunters had to take when they went after mammoths, rhinos, or Cape buffalo: it’s not exactly safe today, even with modern weapons. One of us, however (Henry Harpending) does have that experience, and the following note gives a flavor of what it’s like – particularly when you don’t have the faintest idea what you’re doing.
Encounter with a Buffalo
When I (HCH) was a graduate student in the 1960’s I spent a year and a half in the northern Kalahari desert doing fieldwork with !Kung Bushmen, foragers who lived by gathering wild foodstuffs and hunting game animals. With several other graduate students we had a base camp near the border with Southwest Africa (now Namibia), about 100 miles south of the Caprivi Strip on the northern border of Botswana. The nearest source of supplies was a two-day trip from our camp by four-wheel-drive truck.
Several weeks after the rainy season ended there were reports in the neighborhood of a Cape buffalo that was harassing people and animals. Often older males lose rank and leave the herd to wander by themselves, angry and uncomfortable. They are a threat to people and stock, especially horses.
We were out of meat in our camp, and so with the confidence and foolishness of youth we decided to hunt down the buffalo. We had visions of steaks and chops as well as many pounds of dried meat for travel rations and dog food. At that time permits for buffalo were only a few dollars from the Botswana game department, and we had several. Although there were stories of buffalo being aggressive and dangerous to hunt, to my eye they were simply large cattle. Bushmen never hunted them with their poison-arrow and spear technology, but they too were naïve and had great faith in our high-powered rifle.
One morning we set off to where the animal had last been reported. The party was a colleague, several young Bushman males, and myself. We soon picked up its tracks and for several hours followed its wanderings through the low thorny scrub. To me the tracks looked exactly like those of a cow, but the Bushmen never hesitated. When it was apparent at one point that there were no tracks at all in view, I asked, and the Bushmen told me that there was no point in following the tracks since they knew exactly where it was going. We often saw this while hunting with Bushmen – they used the actual tracks as a guide but knew the habits of the animals so well that they often proceeded on their own, picking the tracks up again later on.
This went on for hours until, suddenly, a young man grabbed my shoulder and said “there it is.” I looked long and hard until I saw it, well camouflaged behind several yards of thick brush, sideways, staring hard at us with its bright pig eyes. It was about forty yards away.
As I brought the rifle up I was dismayed to realize that it still had a powerful telescopic sight. I should have removed it and used open iron sights in thick bush, but I had forgotten. With the magnification of the scope I saw a black mass surrounded by brush. It took a moment to locate the front legs, then the chest. Oriented, I aimed and fired. “Bang-whump”, the bang from the rifle and the whump as the bullet struck the buffalo. He jerked a little, then simply stood there staring at me. “Bang-whump, bang-whump” as I fired two more rounds.
Now he tossed his head and snorted, then started running toward us. Buffalo charge with their nose high, only lowering their head to use their horns on contact. I fired one more round at the charging animal, head on, simply pointing at him because he was so close, then turned and ran. We discovered later that the bullet had struck his shoulder, ricocheted off his scapula, and exited through the skin on his side. It certainly didn’t slow him down at all: I might as well have been shooting at a railroad locomotive.
There were three of us running away now from the charging animal: my colleague, our camp dog, and myself.
You can find out what happened here.
You may have wondered what is this movie "Happy-Go-Lucky" that keeps winning year-end awards from critics -- e.g.:
British comedy "Happy-Go-Lucky" almost swept the 43rd Annual National Society of Film Critics Awards on Saturday, taking home four trophies including Best Director for Mike Leigh.
Here's my review from a couple of months ago in The American Conservative:
“Happy-Go-Lucky,” five-time Oscar nominee Mike Leigh’s “quirky” and “offbeat” comedy about a young London schoolteacher who is, yes, happy-go-lucky, has enjoyed the most unanimous critical acclaim of any film this year. All 31 “Top Critics” on the Rotten Tomatoes website have given “Happy-Go-Lucky” their personal thumbs up. Indeed, star Sally Hawkins has a shot at an Oscar nomination because Academy members like to vote for obscure British actresses in low budget movies nobody has seen, such as Imelda Staunton’s Best Actress nod for Leigh’s last film, “Vera Drake.”
Leigh, a Best Director nominee for 1996’s “Secrets and Lies,” prides himself on improvising slice-of-life leftwing movies about the English working class, which this Royal Academy of Dramatic Art graduate knows all about because his physician father had proletarian patients.
Since he doesn’t work from a script, investors are wary of backing Leigh’s vague ideas. "My tragedy as a filmmaker now," he declaims, "is that there is a very limited ceiling on the amount of money anyone will give me to make a film.” So, the British National Lottery obligingly kicked in some of “Happy-Go-Lucky’s” budget.
Lotteries are notoriously a tax on stupidity; evidently, they are also a subsidy for vapidity because “Happy-Go-Lucky” is the worst movie by a prominent director since M. Night Shyamalan’s allergy allegory “The Happening.” Leigh’s film is smug, boring, plotless, and pointless, the perfect embodiment of the Obama Era of liberal self-congratulation.
To Leigh, Hawkins’s character “Poppy” is as adorable as the two Audreys: Hepburn in “Breakfast at Tiffany’s" and Tatou in “AmĂ©lie.” To me, Hawkins is insufferable. Imagine a “Star Wars” prequel in which a female Jar-Jar Binks hogs the screen for the entire two hours. Poppy smirks, snickers, and sniggers, mugging like Jim Varney in those old “Hey Vern” movies, an overgrown class clown laughing relentlessly at her own jokes, which are never, ever funny.
There’s nothing more excruciating than watching people onscreen laugh, especially when they crack themselves up. (What’s really funny is seeing characters mortified with embarrassment.) In general, happy people aren’t very funny and funny people aren’t very happy. A friend had dinner in the 1990s with the famous comic Jackie Mason, and reported that it was a grim ordeal. Mason spent the evening complaining about how Ed Sullivan had “ruined his career” in 1964.
And how exactly did Poppy, a North Londoner, acquire her quasi-Australian accent? Her youngest sister, a drunken law student, talks like Sid Vicious, but Poppy sounds like the Crocodile Hunter. In a male actor, a working class Australian accent sounds manly yet affable (that’s why the U.S.-born Mel Gibson normally plays his American roles with an unexplained hint of Down Under in his voice), but on a woman it just sounds tomboyish and goofy.
Most of Leigh’s movies have been about the oppression of the proletariat, but by 2008 their values are apparently ascendant in London. Any character who thinks about the future—such as Poppy’s one married, home-owning sister—is scorned as a buzz-kill.
Most people in “Happy-Go-Lucky” have pleasant government jobs. Judging from this movie, the British welfare state exists mostly so people with soft college degrees can have some place to hang out together while making plans for which pub or disco to go to after work.
The only plot device consists of Poppy’s weekly driving lessons with a tightly wound little fundamentalist Christian with bad teeth, played by Eddie Marsan. I initially assumed these two equally unattractive single people would wind up settling for each other, but when he insists she lock the car doors when two black youths bicycle by, he demonstrates (in Leigh’s mental universe) that he is morally unworthy of her, and probably a dangerous psycho to boot.
Instead, Leigh hooks her up with a school social worker, who is played by a ludicrously handsome young actor who looks like one of those towering Olympic swimming medalists with massively masculine jawlines molded by years of Human Growth Hormone abuse.
One vignette of this momentum-free movie unwittingly exemplifies the female cluelessness that has made Britain’s schools a dystopia of juvenile male thuggishness. When one of her students starts punching other children, does Poppy punish him? No, she signs the bully up for counseling, which consists of three adults—the headmistress, Poppy, and her future boyfriend—sitting around praising the little lout and asking him what’s the real reason he hits people. (Actual answer: it’s fun.)
Rated R for language.
Karl Rove writes in the Wall Street Journal that the Mortgage Meltdown wasn't Bush's fault:
President Bush Tried to Rein In Fan and Fred:
Democrats and the media have the housing story wrong.... Some critics blame Mr. Bush because he supported broadening homeownership. But Mr. Bush’s goal was for people to own homes they could afford, not ones made accessible by reckless lenders who off-loaded their risk to GSEs [Government-Sponsored Enterprises]....
As one of those critics, let me point out to Mr. Rove that Mr. Bush didn't attempt to rein in Fannie Mae and Freddie Mac when it came to minority lending. In fact, he egged them on. From Bush's June 17, 2002 speech to St. Paul's African Methodist Episcopalian church:
Now, we've got a problem here in America that we have to address. Too many American families, too many minorities do not own a home. There is a home ownership gap in America. The difference between Anglo America and African American and Hispanic home ownership is too big. (Applause.) And we've got to focus the attention on this nation to address this.
And it starts with setting a goal. And so by the year 2010, we must increase minority home owners by at least 5.5 million. In order to close the homeownership gap, we've got to set a big goal for America, and focus our attention and resources on that goal. (Applause.)...
I want to thank Franklin Raines, of Fannie Mae and Leland Brendsel of Freddie Mac. Thank you all for coming. (Applause.)...
Three-quarters of white America owns their homes. Less than 50 percent of African Americans are part of the homeownership in America. And less than 50 percent of the Hispanics who live here in this country own their home. And that has got to change for the good of the country. It just does. (Applause.) And so here are some of the ways to address the issue. First, the single greatest barrier to first time homeownership is a high downpayment. It is really hard for many, many, low income families to make the high downpayment. ...
And let me talk about some of the progress which we have made to date, as an example for others to follow. First of all, government sponsored corporations that help create our mortgage system -- I introduced two of the leaders here today -- they call those people Fannie May and Freddie Mac, as well as the federal home loan banks, will increase their commitment to minority markets by more than $440 billion. (Applause.) I want to thank Leland and Franklin for that commitment. It's a commitment that conforms to their charters, as well, and also conforms to their hearts.
This means they will purchase more loans made by banks to African Americans, Hispanics and other minorities, which will encourage homeownership. Freddie Mac will launch 25 initiatives to eliminate homeownership barriers. Under one of these, consumers with poor credit will be able to get a mortgage with an interest rate that automatically goes down after a period of consistent payments. (Applause.)
Fannie Mae will establish 100 partnerships with faith-based organizations that will provide home buyer education and help increase homeownership for their congregations. I love the partnership. (Applause.)
Second: You can make a tax deductible contribution via VDARE by clicking here. (Paypal and credit cards accepted, including recurring "subscription" donations.) UPDATE: Don't try this at the moment.
Third: send money via the Paypal-like Google Wallet to my Gmail address (that's isteveslrATgmail.com -- replace the AT with a @). (Non-tax deductible.)
Here's the Google Wallet FAQ. From it: "You will need to have (or sign up for) Google Wallet to send or receive money. If you have ever purchased anything on Google Play, then you most likely already have a Google Wallet. If you do not yet have a Google Wallet, don’t worry, the process is simple: go to wallet.google.com and follow the steps." You probably already have a Google ID and password, which Google Wallet uses, so signing up for Wallet is pretty painless.
You can put money into your Google Wallet Balance from your bank account and send it with no service fee.
Or you can send money via credit card (Visa, MasterCard, AmEx, Discover) with the industry-standard 2.9% fee. (You don't need to put money into your Google Wallet Balance to do this.)
Google Wallet works from both a website and a smartphone app (Android and iPhone -- the Google Wallet app is currently available only in the U.S., but the Google Wallet website can be used in 160 countries).
Or, once you sign up with Google Wallet, you can simply send money via credit card, bank transfer, or Wallet Balance as an attachment from Google's free Gmail email service. Here's how to do it.
(Non-tax deductible.)
Fourth: if you have a Wells Fargo bank account, you can transfer money to me (with no fees) via Wells Fargo SurePay. Just tell WF SurePay to send the money to my ancient AOL email address (steveslrATaol.com -- replace the AT with the usual @). (Non-tax deductible.)
Fifth: if you have a Chase bank account (or, theoretically, other bank accounts), you can transfer money to me (with no fees) via Chase QuickPay (FAQ). Just tell Chase QuickPay to send the money to my ancient AOL email address (steveslrATaol.com -- replace the AT with the usual @). If Chase asks for the name on my account, it's Steven Sailer with an n at the end of Steven. (Non-tax deductible.)