January 20, 2014

Ian McEwan: Don't throw out the baby with the bathwater

I've been pointing out essays by the Edge consortium on the topic of "What Scientific Concept Should Be Retired?" that I have disagreements with, so here's one I like by the science-minded British novelist Ian McEwan, author of Atonement.
Ian McEwan 
Novelist; Author, Sweet Tooth; Solar; On Chesil Beach; Amsterdam 
[Questioning the Question:] Beware of arrogance! Retire nothing! 
... A great and rich scientific tradition should hang onto everything it has. Truth is not the only measure. There are ways of being wrong that help others to be right. Some are wrong, but brilliantly so. Some are wrong but contribute to method. Some are wrong but help found a discipline.  
Aristotle ranged over the whole of human knowledge and was wrong about much. But his invention of zoology alone was priceless. Would you cast him aside? You never know when you might need an old idea. It could rise again one day to enhance a perspective the present cannot imagine. It would not be available to us if it were fully retired. Even Darwin in the early 20th century experienced some neglect, until the Modern Synthesis. [Darwin's] The Expression of Emotion [in Animals] took longer to be current. William James also languished, as did psychology, once consciousness as a subject was retired from it. Look at the revived fortunes of Thomas Bayes and Adam Smith (especially 'The Theory of Moral Sentiments'). We may need to take another look at the long-maligned Descartes. Epigenetics might even restore the reputation of Lamarck. Freud may yet have something to tell us about the unconscious. 
 
In general, the big boys of the past were extremely smart. Their best ideas were often dreamed up to deal with current situations, which, after a while, stopped being current. But, sometimes, situations cycle back around until there is renewed relevance between what a past thinker was writing in response to and what we face.

For example, in the middle of the last decade, I was developing my theory of affordable family formation, which struck some observers as fairly novel. Yet, much of the basis for it turned out to have been anticipated by Benjamin Franklin in his 1754 essay Observations Concerning the Increase of Mankind, a short essay that I had only heard mentioned derisively, for its immigration restrictionism and for Franklin's clumsy attempt to define the object of his concern.

Franklin's goal was to obtain for his people a higher standard of living, defined as higher wages and lower prices, especially for land, via having more land per person.

Now Ben Franklin is not a forgotten figure in history. He's on the $100 bill, the most closely examined form of money in the world, since it's the most tempting target of counterfeiters.

Moreover, this essay played a key role in intellectual history, since it preceded much of what Malthus had to say by almost a half century (as Franklin's admirers forced Malthus to concede in his second edition), and Malthus inspired both Darwin and Wallace to separately conceive of the theory of natural selection. (Wallace famously came up with it during a malarial dream following an evening reading Malthus.)

Still, the import of Observations Concerning the Increase of Mankind has largely been forgotten. In part, this is because Franklin himself lost interest in the subject of maintaining a larger amount of land per person through immigration restriction in 1756.

Why? Because obtaining more land per capita through war suddenly became feasible two years later. The advent of the French & Indian War in 1756 brought up the possibility of Franklin's people spreading into the middle of North America, and Franklin became obsessed with keeping the British government from botching this opportunity by trading back North American conquests to the French for sugar islands. Franklin argued in 1760 that whichever power settled the Great Lakes and the Mississippi River Valley would dominate the world in the 20th Century.

Then, London's Proclamation of 1763 restricting colonists from settling west of the Appalachians set the previously highly loyal Franklin on the road to the Revolutionary War and to America obtaining vast lands by defeating the Indians and Mexicans. So, Franklin lost interest in arguing against Invite the World and started arguing for Invade the World (or at least the Western three-fourths of the temperate zone of North America).

Personally, here in 2014, I don't want more war. So, the peaceful, let's-mind-our-own-business Franklin of 1754 strikes me as more appealing and more intellectually relevant to America's current situation than the militarily aggressive Franklin of 1756 onward.

But then I'm some sort of weirdo extremist who is skeptical of the Invade-the-World / Invite-the-World conventional wisdom, so I would think that, wouldn't I?
 

The Flynn Effect of NFL Passing

Over the years, NFL football teams have gotten visibly better at passing the ball. This graph shows NFL passing stats from 1950 to 2013. For example, the percentage of forward passes completed (red line) has improved from under 50% in the early 1950s to a record-tying 61.3% in 2013.

Or compare touchdown passes to interceptions. The average number of touchdown passes per team per game (green line) hadn't changed all that much until shooting up in the last few years, tying the record of 1.6 in 2013. Most notably, the number of interceptions surrendered per team per game (purple line) has dropped from over 2 to about 1 in recent years. 

There are many reasons for this, such as rule changes to protect receivers and quarterbacks and improvements in playing conditions (such as domed stadiums and better turf). It would appear, however, that NFL teams are simply better at passing the ball than they used to be. Although defenses are evolving too, offenses make fewer unforced errors than they used to. Thus the big improvement in the touchdown-to-interception ratio since the Broadway Joe Namath days, when advanced offensive strategic thinking called for the QB to heave it deep and see what happened.

In the legendary season in which Namath led the upstart New York Jets to the AFL's first Super Bowl win in January 1969, the future Hall of Famer averaged 16.8 yards per pass completion, compared to Peyton Manning's mere 12.2 yards per completion this season. But Namath threw only 15 touchdown passes against 17 interceptions (in 14 games), compared to Manning's 55 touchdowns against 10 interceptions (in 16 games) in 2013.
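To make the comparison concrete, here's a minimal sketch in Python (the season totals are the ones quoted above; the function and its names are mine, purely illustrative):

```python
# Toy comparison of two quarterback seasons, using the totals quoted above.
# The season figures are from the post; the function and names are illustrative.

def passing_summary(name, tds, ints, games):
    """Return per-game touchdown and interception rates plus the TD:INT ratio."""
    return {
        "qb": name,
        "td_per_game": tds / games,
        "int_per_game": ints / games,
        "td_int_ratio": tds / ints,
    }

namath_1968 = passing_summary("Joe Namath (1968)", tds=15, ints=17, games=14)
manning_2013 = passing_summary("Peyton Manning (2013)", tds=55, ints=10, games=16)

for season in (namath_1968, manning_2013):
    print(f"{season['qb']}: {season['td_per_game']:.2f} TD/game, "
          f"{season['int_per_game']:.2f} INT/game, "
          f"TD:INT ratio {season['td_int_ratio']:.2f}")
# Joe Namath (1968): 1.07 TD/game, 1.21 INT/game, TD:INT ratio 0.88
# Peyton Manning (2013): 3.44 TD/game, 0.62 INT/game, TD:INT ratio 5.50
```

By these crude per-game rates, Namath's glamorous championship season had a touchdown-to-interception ratio under 1, while Manning's 2013 came in around 5.5.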

When you see "Flynn Effects" like that, it's interesting to try to determine whether the improvements come more from change in personnel or from change in how they perform. 

For example, Bill Simmons mentions in his big book of basketball a 1950s NBA center, a white guy listed at 6'8" whose name slips my mind (Neil Johnston?). He was a major scorer using a sidearm hook shot. Then Bill Russell, a black guy listed at 6'11", arrived in the league and tried really hard on defense. (The NBA has always had a professional wrestling side to it, so trying hard to win on defense was a novelty in the pro game.) Russell blocked so many of Johnston's sidearm hooks that Johnston was soon out of the league. 

In general, though, it's not easy to find many clear examples in sports history of players' careers being shortened by the overall quality of competition in the league going up. One inflection point might be when Maury Wills stole 104 bases for the Los Angeles Dodgers in 1962, which is said to have had a big impact on the careers of catchers in the National League: the good hit / no throw ones were gone pretty fast.

But you also get a lot of examples of star players who just keep racking up bigger numbers as the game evolves. Manning, for instance, is 37 and can't throw the ball very hard anymore. He has to throw many of his passes on an arc, which ought to give the defense more time to break them up. Since entering the NFL in the last century, he has heard quite a few other quarterbacks proclaimed The Future of Football.

But this season Manning set records for most yards passing and most touchdowns in one season, and threw for 400 yards on Sunday to go to the Super Bowl. 

It's curious.
  

January 19, 2014

Paul Krugman notices he's in the One Percent

Paul Krugman writes in "The Undeserving Rich:"
The Occupy movement popularized the concept of the “1 percent,” which is a good shorthand for the rising elite, but if anything includes too many people: most of the gains of the top 1 percent have in fact gone to an even tinier elite, the top 0.1 percent.

I've never quite understood the political appeal of defining the Class Enemy so broadly as to include over 3 million people. I would think that you'd want to define it much more narrowly: The 400, say, the number who could fit into Mrs. Astor's ballroom, a figure Forbes appropriated a third of a century ago.

Perhaps one problem is that if you start narrowing the number of Class Enemies down to the point where it gets personalized, then you run into the problem that lots of people like famous rich individuals. Billionaires can afford appealing public relations strategies.

But I'm probably overthinking this. I doubt if all that many people denouncing the One Percent have ever even calculated what one percent of the population of the country actually is (what would that be, a bit over three million?). One Percent is just a numeric-looking shorthand for a Really Small Number.
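Not that anyone needs a computer for it, but here's the arithmetic as a sketch, assuming a mid-2013 U.S. population of roughly 316 million (my ballpark figure, not from the Krugman piece):

```python
# Rough head count of "the One Percent," assuming a mid-2013 U.S. population
# of about 316 million (ballpark figure, not from the post).
us_population = 316_000_000
print(f"{us_population * 0.01:,.0f}")  # -> 3,160,000: the "over 3 million" above
```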
   

The triumph of hope over experience: $17-$20 million for Obama's 3rd autobiography

David Remnick writes in The New Yorker:
When Obama leaves the White House, on January 20, 2017, he will write a memoir. “Now, that’s a slam dunk,” the former Obama adviser David Axelrod told me. Andrew Wylie, a leading literary agent, said he thought that publishers would pay between seventeen and twenty million dollars for the book—the most ever for a work of nonfiction—and around twelve million for Michelle Obama’s memoirs. (The First Lady has already started work on hers.) 

Which one do you think will have the lower ratio of copies finished to copies purchased?

Remnick's long article features extensive analyses from Obama about how, on the one hand, the President should never say anything interesting, but, on the other hand, interesting things are those the President should never say. The article can't have set literary agents on fire with anticipation.

One possibility is that, having never said anything interesting in decades in anticipation of becoming President, Obama has been saving up the huge number of brilliant insights and hilarious zingers he's dreamed up, and will unload them all in his $20 million book. An alternative is that he hasn't actually thought up anything all that interesting, and that's what the people who will pay for this book want more of.
   

Diversity Before Diversity: Oklahoma's 1907 senators

In Gore Vidal's historical novel Empire, President Theodore Roosevelt exclaims in 1907 about the new state of Oklahoma and its two new Democratic senators:
"They have also, in their infinite Western wisdom, sent us a blind boy for one senator, and an Indian -- an Indian! -- for another."

The blind senator, Thomas Gore, was Vidal's maternal grandfather (but only a distant relation of Al Gore). Senator Gore had gone blind in a couple of childhood accidents. 

My general impression is that the number of blind people is declining. When I was young, blind musicians were prominent (e.g., Ray Charles, Stevie Wonder, Jose Feliciano, and Ronnie Milsap were all famous around 1970), but that seems to be less true these days. The Lt. Governor of New York recently was blind, but he ran into trouble when he moved up to the top job after Gov. Spitzer's resignation. It's hard being blind. 

The other 1907 Oklahoma senator, Robert L. Owen, is identified on the Cherokee rolls as 1/16th Cherokee, although that might have been an exaggeration. 

Still, he appears to have inherited some good hair genes from his Indian ancestor. Wikipedia reports: "He was a tall man of erect bearing, who kept a full head of hair to the end of his life." He looked like the kind of slightly Indian guy whom 1930s Hollywood directors would have been happy to cast as an Indian chief.

He publicly identified as part-Cherokee and was thus employed in various Indian affairs posts for years in the late 19th century. From 1900 to 1906 he represented the Eastern Cherokee in their famous case over having their lands stolen from them by Andrew Jackson, winning almost $5 million in a 1906 Supreme Court decision. 

As Thomas Babington Macaulay pointed out in regard to English attitudes toward Scottish Highlanders, fear and loathing rapidly changed to thinking Scots were glamorous once they were subdued and no longer likely to go on the warpath. So, by the early 20th Century, being a little bit Indian was cool.

This was very different from the attitudes toward blacks in the U.S. in 1907. Latins who were somewhat black, like the Dumas novelists, father and son, were okay as long as there wasn't much talk about it (e.g., some partly black Cubans broke the major league baseball color line playing for the Washington Senators in the late 1930s and early 1940s, long before Jackie Robinson, but almost nobody mentioned it, so it wasn't a Thing). A few politicians were a little bit black (various Southern legislatures drawing up Jim Crow laws in the late 19th Century avoided "one drop" definitions so as not to disqualify some of their own members). But it wasn't to be talked about, while Senator Owen or Vice President Charles Curtis being partly Indian was celebrated.

The causal mechanism was the reverse of what we are supposed to believe now. We are always being told that white bigotry against blacks was driven by "hate," but 19th Century whites had hated Indians far more than they had hated blacks. Just about everybody who came into contact with Indians on the frontier hated them, with only a few exceptions (e.g., Sam Houston), while most whites who came into contact with black slaves liked them. Mark Twain is extremely representative: compare Jim in Huckleberry Finn to Injun Joe in Tom Sawyer.

Both senators were highly successful, Owen serving straight through from 1907 until 1925 (he co-sponsored the Glass-Owen Act setting up the Federal Reserve System in 1913), while Gore was in and out of the Senate until 1937.
    

NYT: "Russians: Still the Go-To Bad Guys"

Movies these days tend to be extremely accurate visually about what the past looked like or would have looked like if people back then had more money and time to work on their looks (e.g., all the work put into, say, American Hustle to look like 1978-80).

But the kind of people who write about movies are generally pretty clueless about how people thought or behaved in the past. Cultural pundits today mostly absorb some generally acceptable lessons about the evil attitudes of the past and don't look for nuance.

Thus, from the New York Times:
Russians: Still the Go-To Bad Guys 
By STEVEN KURUTZ
THE movie “Jack Ryan: Shadow Recruit,” which opens in theaters this weekend, revolves around an American C.I.A. analyst first introduced in Tom Clancy’s 1984 novel “The Hunt for Red October.” The source material isn’t the only thing that’s a little creaky. Ryan’s destination is Moscow, his target a Russian businessman plotting to crash the American economy through a terrorist attack. 
In portraying the diabolical oligarch Viktor Cherevin, Kenneth Branagh delivers his lines in the thick, menacingly slow accent that defines Eastern European baddies on screen: “You think this is game, Jack?” 
Nearly 25 years after the Berlin Wall fell and marked the end of the Cold War, Hollywood’s go-to villains remain Russians.

That assumes that Russians were the movies' go-to villains during the Cold War, which was hardly true. An obvious example is the James Bond movies, in which the novels' original bad guys, the Soviet SMERSH agency, were replaced by the nonideological for-profit multinational SPECTRE.

In general, Cold War movie bad guys were far more often the CIA, the oil companies, the military-industrial complex, the rich, and so forth and so on. (Among powerful American institutions, the Marine Corps and the FBI spent a lot of money and effort schmoozing Hollywood to keep from being portrayed badly.) Overall, the Soviets didn't figure much in American entertainment, and when they did, were usually seen as not the real problem.

The use of the Soviets as bad guys tended to be a 1980s idiosyncrasy of a handful of out-of-the-closet conservative action stars (e.g., Sylvester Stallone) or writer-directors (John Milius -- Red Dawn). Their anti-Communist movies were extremely controversial at the time since they were much more popular with the public than with the culturesphere.

In general, Hollywood saw the Nazis as having agency, while the Soviets did not. They were a mere unfortunate reaction to our own agency. That's not a wholly unreasonable interpretation of history, but you can see why it wasn't very stimulating for making movies, so there were few anti-Soviet movies.

Even then, it is difficult to recognize any sort of negative ethnic stereotyping of Russians of the type we see today. Throughout the Cold War, American culture producers tended to view Russians as a cultured people as seen in all the great Russian novels and plays of the pre-Communist era. Many of the Russians in the West during the Cold War were refugee aristocrats of impeccable manners: some became head waiters, others novelists (Nabokov). For example, Ensign Chekov on Star Trek was named to call to mind the great Russian playwright.

The current notion of Russians as flatheaded goons didn't exist in America before the Berlin Wall came down.

While Soviet government officials were seen as boring and badly dressed, Americans during the Cold War tended, if anything, to overrate Russian culture as more elegant than American culture. For example, the Bolshoi Ballet was hugely famous during the Cold War. Similarly, young Texan pianist Van Cliburn winning the Tchaikovsky Competition in Moscow in 1958 was gigantically famous at the time. Russian figure skaters and gymnasts were highly admired. Even the U.S. victory over the Soviet ice hockey team at the 1980 Olympics was seen as bumptious upstarts somehow knocking off a team whose style was far more elegant than the brutal North American game.

At the upper end of Hollywood prestige, note Dr. Zhivago, the second biggest grossing movie of the 1960s, which depicts Russians as soulful, literary, and romantic but sadly victimized by Stalin. The movie ends with a travelog of a giant new Soviet hydroelectric dam showing the bad times are over and progress is being made.

The current stereotype associating Russians with organized crime simply didn't exist during the Cold War.
The last few years alone have seen a sadistic ex-K.G.B. agent (“The Avengers”), crooked Russian officials (“A Good Day to Die Hard”), Russian hit men (“The Tourist”), a Russian spy (“Tinker Tailor Soldier Spy”), a Russian-American loan shark (“Limitless”) and so many Russian gangsters they have displaced Italians as film’s favored thugs (“Jack Reacher,” “Safe,” “A Very Harold & Kumar 3D Christmas,” among others). 
I suspect screenwriters and studio executives have deemed Russians to be politically safe villains. No advocacy group will protest.

E.g., Steven Spielberg thought Hindus were a safe set of villains in Indiana Jones and the Temple of Doom 30 years ago, but quickly discovered that he'd better go back to Nazis.
No foreign distribution deal will be nixed. Russian moviegoers here and abroad are probably inured to seeing themselves portrayed as Boris Badenovs on screen.

Russia is second only to China as a growth market for Hollywood movies. The movie industry is very concerned about Chinese sensitivities, so it would be interesting to know why it doesn't seem concerned about Russian sensitivities.
Why make a TV show about modern-day surveillance and wiretapping when you can do a Red-scare period piece and offend or provoke no one?
Still, it doesn’t make for as powerful drama as it once did.

If you grew up during the Cold War, you viewed Russians with a potent mix of hatred and fear, and felt in your gut that a nuclear war between our countries could erupt any second, obliterating everybody and everything. That’s why movies like “The Day After” and “Threads” were so visceral.

No, these were basically movies about how Ronald Reagan was going to blow up the world. It's funny how the gigantic Nuclear Freeze movement of the early 1980s is so forgotten that it doesn't even have its own Wikipedia page.
I doubt today’s teenage moviegoers are walking around with the same mixed-up feelings about the Russians. Ivan Drago, the Soviet-bred fighting machine who battled Rocky Balboa in 1985, may have been absurd but he was a fall guy for his time. Has our pop culture not moved beyond “Rocky IV”?  

Once again, Stallone was notoriously out of step with the rest of Hollywood in the 1980s by making anti-Communist movies, from which he made lots and lots of money, much as Mel Gibson made lots of money off the Mexican-American market with The Passion of the Christ. And don't forget that Rocky IV still ends with Rocky Balboa negotiating world peace with the young new Soviet premier.

Now, in defense of post-Berlin Wall Hollywood screenwriters, let me point out that they were faster at sniffing out that something funny was going on in Yeltsin's new free market democracy than was, say, Stanley Fischer.
 

NYT, 1986: Boyhood Effeminacy and Later Homosexuality

Here is a very interesting New York Times article from over 27 years ago on a classic long-term study. I'm not aware of this type of study ever being done again. 

The article is quite good on the complexities of nature and nurture, of glasses that are part full and part empty. The NYT science section remains quite strong (despite the complaints of psychometricians who ranked the overall institution far behind my blog in accuracy), but there are simply so many more minefields these days that it's hard to imagine reading a single article this dense with information and analysis without long stretches of moralizing filler. In other words, you can still learn a lot reading the NYT, but you now have to home in on the nuggets of intelligence deposited here and there amongst the rubble and dust.
BOYHOOD EFFEMINACY AND LATER HOMOSEXUALITY 
By JANE E. BRODY 
Published: December 16, 1986

MOST young boys who persistently act like girls grow up to be homosexuals or bisexuals, a 15-year study of ''sissy boys'' has shown. According to the findings, neither therapy designed to discourage the extremely feminine behavior nor ideal child rearing could guarantee that the boys would develop as heterosexuals, although parental discouragement of the boys' girlish behavior tended to result in a more heterosexual orientation. 
Three-fourths of 44 extremely feminine boys followed from early childhood to adolescence or young adulthood matured as homosexuals or bisexuals, as against only one bisexual among a comparison group of more typically masculine boys. 
In many cases parents either overtly or subtly encouraged the feminine behavior. But when parents actively discouraged it and took other steps to enhance a male self-concept, homosexual tendencies of the feminine boys were lessened, although not necessarily reversed. Neither did professional counseling divert a tendency toward homosexuality, although it resulted in more conventional masculine behavior and enhanced the boys' social and psychological adjustment and comfort with being male. 
The study was conducted by Dr. Richard Green, a noted sex researcher who is professor of psychiatry at the University of California, Los Angeles and director of its Program in Psychiatry, Law and Human Sexuality. Details of the findings and implications are described in Dr. Green's new book, ''The 'Sissy Boy Syndrome' and the Development of Homosexuality,'' to be published in February by Yale University Press. 
Although the study examined extreme cases of boyhood effeminacy, Dr. Green believes the findings may have relevance to lesser degrees of feminine behavior in boys. Such boys, who may, for example, be athletically inept or prefer music to cars and trucks, often have difficulty making friends with other boys and identifying with typically male activities. Dr. Green suggested that to help the boys think of themselves as male, parents might assist them in finding boy friends who are similarly unaggressive and that the fathers might share in activities the boys enjoy, such as going to the zoo or a concert, rather than insist on taking the boys to athletic events. Counseling to guide such parents and enhance the child's masculine self-image may also be helpful, Dr. Green said. 
The study did not examine the development of homosexuality in boys whose childhoods are typically masculine. About one-third of homosexual men recall such masculine boyhoods. Nor does the study suggest that all boys with the sissy-boy syndrome are destined for homosexuality. Indeed, one-fourth of the extremely feminine boys followed to maturity developed as heterosexuals. 
According to Dr. Green and other experts familiar with his study, the findings indicate that some children may have an inborn ''receptivity'' to environmental factors that encourage a homosexual orientation. Whether such a predisposition is genetic or the result of prenatal factors, or both, is not known. Recent research in animals suggests that prenatal hormonal influences can interfere with programming the brain of the male fetus and result in the birth of males that act like females. 
The study supports a recent Kinsey Institute survey of 1,500 adults that singled out ''gender nonconformity'' in childhood as the most important predictor of homosexuality. Dr. Alan Bell of Indiana University, a director of the Kinsey study, said he was pleased and not surprised that the findings of Dr. Green's prospective study corresponded with the retrospective Kinsey data. 
''The pendulum is swinging back to biology,'' Dr. Bell remarked. ''Apparently there is a very important physiological component that plays a big role in determining one's sexual identity.'' 
As have other recent investigations, including the Kinsey study, the new research challenges long-held psychoanalytic beliefs that dominant, overprotecting mothers and ineffectual fathers are primary ''causes'' of a son's homosexuality. 
Rather, the study suggests that some boys are born with an indifference to rough-and-tumble play and other typical boyhood interests and that this indifference alienates and isolates them from their male peers and often from their fathers as well. Dr. Green believes that such boys may grow up ''starved'' for male affection, which prompts them to seek love from men in adolescence and adulthood. To Dr. Bell, however, a sense of difference and social distance from males during childhood is what leads to the romantic and erotic attraction to other males. 
Dr. Richard Isay, a New York psychoanalyst whose practice is largely homosexual men, said: ''I would agree with Dr. Green. I too see no support for the notion that binding mothers produce homosexual sons, nor do I see any consistent pattern for absent fathers that I don't also see among heterosexual men in analysis.'' Dr. Isay, who is affiliated with Columbia Psychoanalytic Institute and New York-Cornell Medical School, suggested that the common depiction among homosexual men of an absent, distant father is in fact a defense against an underlying erotic attachment to their fathers.

Ah, good old Freudians ...
Dr. Green, who is now studying the development of tomboy girls, said the issue for girls who act like boys is very different. ''There are far fewer sissy boys than tomboy girls, but many more homosexual and transsexual men than women,'' he said.

I believe Green's study of tomboys eventually came up with the common sense finding that not all that many tomboy girls grow up to be lesbians, but a lot of lesbians were tomboys as girls. The number of models, actresses, and other glamor girls who tell interviewers they were tomboys as adolescents is noticeably high. My guess would be that long legs correlate with success in modeling, and long legs correlate with delayed puberty because girls typically stop getting taller around puberty.
Asked to comment on Dr. Green's findings, Dr. Judd Marmor, professor emeritus of psychiatry at the University of Southern California and the University of California at Los Angeles, said that they ''are another indication there is a biological element involved in the genesis of homosexuality, at least for those homosexuals with effeminate qualities.'' 
He added: ''Some children really feel different from earliest childhood; they are born without the aggressive masculinity other boys have. This is not something created by an overprotective mother or an absent or ineffectual father.'' 
Although the study involved a relatively small number of boys, Dr. Marmor, who is a past president of the American Psychiatric Association and an authority on homosexuality, called the research ''most important'' in what it revealed about the development of sexual orientation. ''Society tends to treat male homosexuals as if they had a choice about their sexual orientation, when in fact they have no more choice about how they develop than heterosexuals do,'' he said. 
An innate sissiness is ''not the answer to all homosexuality,'' Dr. Marmor said, ''but it is a factor that plays a role in a substantial number of male homosexuals.'' He added that homosexuality could also develop from a seriously distorted family environment but that ''it is much harder to develop that way, without a biological predisposition.'' 
Boys who participated in Dr. Green's study were first examined in early childhood, when their parents became concerned about the boys' persistent feminine behaviors and dislike of activities boys usually like. Many of the boys also repeatedly said they wanted to be girls. At the outset, Dr. Green thought he was examining the origins of male transsexuals: boys who grow up thinking they are girls trapped in male bodies and who may later seek sex-change surgery. 
However, only one of the feminine boys became a transsexual. 
In an interview, Dr. Green pointed out that the boys he studied were notably different from other children. While many, if not most, young children - boys as well as girls - occasionally dress up in their mothers' clothes, put on makeup or jewelry, play with dolls or assume the role of the opposite sex in fantasy play, the boys in Dr. Green's study did so almost exclusively. They spurned typical boy games, rough-housing and sports and instead would play with Barbie dolls for hours, frequently don female clothing and nearly always assume a female role when playing house. Many followed their mothers around the house, mimicking the mothers' activities. 
The boys and their parents were interviewed every few years, and some were seen several times a year in therapeutic counseling aimed at intercepting the boys' feminine tendencies and encouraging more ''gender-appropriate'' activities. 
Although Dr. Green found no evidence that the parents ''created'' feminine boys (many, in fact, had other sons who were normally masculine), certain parental attitudes and actions were correlated with a stronger homosexual orientation. 
One of the earliest influences was the prenatal desire on the part of either parent, and the father in particular, that the child be a girl. After the boy was born, the parents often considered their son to be an especially beautiful infant. Even strangers who admired the baby tended to make comments like ''what a pretty little girl.'' 
One of the most important factors related to a more homosexual orientation in adolescence and adulthood was how parents responded to the boys when they dressed up as girls and pretended to be girls. Many of the parents, Dr. Green said, thought it was cute and directly or indirectly encouraged the cross-gender behavior. For example, photographs of the boys dressed as girls were found in many family albums of feminine boys but in none of the albums of the comparison group of masculine boys.
No relationship was found between later homosexuality and the amount of time a boy spent with his mother. In fact, many of the feminine boys spent less time with their mothers than did the masculine boys. Nor was there any link to a mother-dominated household. 
However, less time shared between father and young son was an important factor. In the first year of life, the fathers tended to spend somewhat less time with their effeminate sons than did the fathers of masculine boys. During the next four years, however, the differences increased. By the time the boys were 3 to 5 years old, fathers of feminine boys were spending significantly less time with their sons than were fathers of the masculine boys. 
This does not mean, however, that the father rejected the son and that this rejection turned the boy into a sissy. Rather, Dr. Green suggested that the boys' feminine behaviors and rejection of male activities contributed to the fathers' indifference. ''It's not just a question of how parents impinge on a child; the child also impinges on the parents,'' he explained. 
In an earlier developmental study of 50 effeminate boys seen at a children's psychiatric clinic in Greenwich, Conn., Dr. Bernard Zuger, a psychiatrist, reported that the boys' ''closeness to the mother and distance from the father spring from their own needs.'' He suggested that parents need not feel guilty if their effeminate sons turn out to be homosexual. 
Another factor that interfered with the father-son relationship in Dr. Green's study was that the feminine boys were likely to be sick more often and more seriously than the masculine boys.

That's pretty interesting. Having been a sickly child was a pretty common theme in 19th and early 20th Century culture: e.g., Teddy Roosevelt was famous for having overcome his childhood infirmities through force of will. Many writers and artists had been sickly lads.
In most cases, it was the boys' mothers who cared for them when they were ill, especially if a lot of time was spent in hospitals. This also encouraged a more protective parental attitude toward the feminine boy. 
The culture may also play a role, Dr. Green said, though its effects are harder to measure. ''If the culture were less condemning of cross-gender behavior, social stigmatization would be less and perhaps these boys could socialize more with other boys,'' he remarked. ''Certainly that is the case with tomboys, who are treated by society as normal girls.'' 
On the other hand, he cited studies in several different cultures by Dr. Frederick Whitam, sociologist at Arizona State University in Tempe, who found that homosexuals in these cultures were more likely to remember cross-gender behavior in childhood than were heterosexuals. Dr. Marmor pointed out that in many cultures, including certain American Indian tribes, less aggressive boys are recognized by their elders and are given institutionalized roles, usually as a priest. 
Rather than attributing homosexuality to cultural, parental or genetic factors, Dr. Green sees an interaction of the three, as evidenced in particular by a pair of identical twins in his study. One boy was clearly feminine and the other twin typically masculine. The feminine boy was sick a lot and had little to do with his father, whereas the masculine twin had a more typical relationship with his father. As adults, both boys were bisexual, but the feminine twin was far more homosexual than his brother. 
''The twins are the metaphor for this study,'' Dr. Green said. ''They are similar but not the same. The degree to which they are not the same can be explained by the early feminine behavior of one, not by genetics.''
   

NYT: Science proves Americans are still sexist

Economist Seth Stephens-Davidowitz writes in the New York Times:
This [Google search] methodology can also be used to study gender preference before birth. Every year, Americans make hundreds of thousands of searches asking how to conceive a child of a particular sex. In searches with the words “how to conceive,” Americans are slightly more likely to include the word “boy” than “girl.” Among the subset of Americans Googling for specific gender conception strategies, there is about a 10 percent preference for boys compared with girls. 
This boy preference is surprising for two reasons. First, the top websites returned for these queries are overwhelmingly geared toward women, suggesting that women are most often making gender conception searches. Yet in surveys, women express a slight preference for having girls, not boys; men say they prefer sons. Second, many Americans are willing to admit a gender preference to even out a family. About 5 percent more boys are born than girls in the United States, so evening out a family would more often require having a girl, not a boy. Are men searching for conception advice in large numbers? Are women searching on their husbands’ behalf? Or do some American women have a son preference that they are not comfortable admitting to surveys? 
Other countries exhibit very large preferences in favor of boys. In India, for each search asking how to conceive a girl, there are three-and-a-half asking how to conceive a boy. With such an overwhelming preference for boys, it is not surprising that there are millions fewer women in India than population scientists would predict. 
Clearly there is more to learn. 

Here's something more that wouldn't be hard to learn: just looking at the country-by-country search data, it's pretty obvious that the small male preference in Google searches in America for information about how to conceive a son is being driven by Asians in the United States, especially South Asians (the same is likely true of Canada). In contrast, Australia, Britain, and New Zealand have more searches for how to conceive girls.
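Just to spell out what the quoted figures imply, here's a small sketch; the 3.5-to-1 Indian search ratio and the 5-percent birth-ratio figure come from the excerpt above, while converting them into shares is my own illustration, not anything from Stephens-Davidowitz:

```python
# Back-of-the-envelope conversions of the figures quoted above. The 3.5:1
# Indian search ratio and the "5 percent more boys are born" figure come from
# the quoted article; expressing them as shares is my own illustration.

# India: 3.5 "how to conceive a boy" searches per "how to conceive a girl" search.
boy_searches, girl_searches = 3.5, 1.0
boy_share = boy_searches / (boy_searches + girl_searches)
print(f"Boy share of India's sex-specific conception searches: {boy_share:.0%}")  # 78%

# U.S. births: about 5% more boys than girls, i.e. roughly 105 boys per 100 girls,
# so "evening out a family" would more often call for a girl, as the article says.
boys_per_100_girls = 105
boy_birth_share = boys_per_100_girls / (boys_per_100_girls + 100)
print(f"Boy share of U.S. births: {boy_birth_share:.1%}")  # 51.2%
```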

For collateral evidence, just do a Google search on "sex-selective abortions."
Because this data make it easy to compare different countries over time, for example, we might examine whether these gender preferences change after a woman is elected to run a country.
 
Hillary 2016!

Last year, Washington Post columnist Dana Milbank tut-tutted at a white Republican Congressman for noticing. Milbank wrote:
Republican’s abortion bill risks alienating Asian Americans  
Republicans long ago lost African American voters. They are well on their way to losing Latinos. And if Trent Franks prevails, they may lose Asian Americans, too. 
The Arizona Republican’s latest antiabortion salvo to be taken up by the House had a benign name — the Prenatal Nondiscrimination Act — and a premise with which just about everybody agrees: that a woman shouldn’t abort a fetus simply because she wants to have a boy rather than a girl.  
The problem with Franks’s proposal is that it’s not entirely clear there is a problem. Sex-selection abortion is a huge tragedy in parts of Asia, but to the extent it’s happening in this country, it’s mostly among Asian immigrants.   ...
In an interview Wednesday afternoon, Franks didn’t dispute that Asian Americans would be targeted. “The real target in the Asian community here is the Asian women who are being coerced into aborting little girls,” he told me, adding: “When the left doesn’t want to make abortion the issue, they say you’re being against minorities.”   
Franks is a principled and consistent opponent of abortion, but his strategy has raised eyebrows before because of its racial component.

January 18, 2014

1st Nonwhite Hispanic Bachelor's "Comments on Gays Spark Uproar"

[Photo caption: Non-White Hispanic]
Last year, the Hollywood Reporter reported:
ABC Names First Non-White 'Bachelor'  
The next Bachelor has been revealed, and he represents a milestone for the ABC reality show. 
Host Chris Harrison announced Monday night that former Bachelorette suitor Juan Pablo Galavis ... will become the first non-Caucasian Bachelor or Bachelorette in franchise history.

But Hispanics, even a Non-White Hispanic like the blue-eyed Galavis, rank pretty far down in the Victimism Power Rankings. They simply lack the whip hand in today's Most Favored Victim struggles. Thus, the New York Times breathlessly reports:
‘Bachelor’ Star’s Comments on Gays Spark Uproar 
By BILL CARTER  
PASADENA, Calif. — ABC faced a potential crisis on Saturday over one of its longest-running hits when a storm erupted here over homophobic comments made by the star of the reality series “The Bachelor.” 
In an interview at a press party, Juan Pablo Galavis, the latest bachelor tasked with picking a possible mate from among a cast of 25 beautiful women, told the editor of the website The TV Page that he was opposed to the idea of ABC producing a season of the show with a gay bachelor. 
[Photo caption: White Hispanic]
Among other reasons, he said, “I don’t think it is a good example for kids to watch that on TV.” Of gay relationships, he added, “They are more pervert, in a sense.” (Mr. Galavis’s primary language is Spanish.) ...
The reaction to Mr. Galavis’s remarks was quick and intensely critical, with numerous reporters and commenters on Twitter denouncing them both as insensitive and, especially, inappropriate, given that Mr. Galavis is participating in a show that compels him to engage in romantic encounters with multiple women over a period of several weeks. 
ABC and the studio that produces “The Bachelor,” Warner Brothers, issued a statement on Saturday, saying, “Juan Pablo’s comments were careless, thoughtless and insensitive, and in no way reflect the views of the network, the show’s producers or studio.” 
Mr. Galavis posted an apology on his Facebook page on Saturday. “I want to apologize to all the people I may have offended because of my comments,” he wrote. He went on to say that “I have many gay friends, and one of my closest friends, who’s like a brother, has been a constant in my life, especially during the past five months. The word ‘pervert’ was not what I meant to say, and I am very sorry about it. Everyone knows English is my second language and my vocabulary is not as broad as it is in Spanish and, because of this, sometimes I use the wrong words to express myself.” 
There was no mention by ABC or Warner Brothers of consequences, either for Mr. Galavis or the show.

Perhaps as punishment Galavis will have his Non-White Hispanic status revoked.

Somebody needs to publish a Pokemon-style table of Power Points for all combinations of various Victimist statuses so you can instantly calculate who gets over on whom.

Oddly enough, there seems to be a pretty high correlation between how victimized your groups are perceived to be in the media and how powerful your groups are behind the scenes in the media, as Rip Torn implied on a Larry Sanders Show episode directed by Judd Apatow 16 years ago.

How could that be?
 

WSJ: "Cash for Kidneys: The Case for a Market for Organs"

From the Wall Street Journal:
Cash for Kidneys: The Case for a Market for Organs 
There is a clear remedy for the growing shortage of organ donors, say Gary S. Becker and Julio J. Elias

Why hasn't anyone invented an app with an ironic name for this? I've got dibs on:
Or-Gone 
Have you ever wanted to wake up in a motel bathtub full of ice cubes? 
    

The most giantest news story in the history of ever

Three stories from the top of the NYTimes.com homepage right now:

G.O.P. Advice: Christie, Pick a Better Team

Republicans are offering advice, sobering in its candor, for Gov. Chris Christie of New Jersey after the Fort Lee scandal. Above, Mr. Christie in Florida on Saturday.

BUSINESS DAY »

COMMON SENSE
Dangers of Giving In to Impulse for Revenge
The New Jersey traffic jam scandal offers lessons on seeking retaliation for perceived slights.



How long has this lane-closure thing been front-page news? 

Linguistic relativism, Whorf, and fire safety

From the Wikipedia bio of a short-lived contributor to a long-running debate over whether a particular glass is part full or part empty:
Benjamin Lee Whorf

Born: April 24, 1897, Winthrop, Massachusetts
Died: July 26, 1941 (aged 44), Hartford, Connecticut
Nationality: American
Fields: linguistics, anthropology, fire prevention
Institutions: Hartford Fire Insurance Company, Yale University
Benjamin Lee Whorf (April 24, 1897 – July 26, 1941) was an American linguist and fire prevention engineer. Whorf is widely known as an advocate for the idea that because of linguistic differences in grammar and usage, speakers of different languages conceptualize and experience the world differently. This principle has frequently been called the "Sapir–Whorf hypothesis", after him and his mentor Edward Sapir, but Whorf called it the principle of linguistic relativity, because he saw the idea as having implications similar to Einstein's principle of physical relativity.[2] 
Throughout his life Whorf was a chemical engineer by profession, but as a young man he took up an interest in linguistics. At first this interest drew him to the study of Biblical Hebrew, but he quickly went on to study the indigenous languages of Mesoamerica on his own. Professional scholars were impressed by his work and in 1930 he received a grant to study the Nahuatl language in Mexico; on his return home he presented several influential papers on the language at linguistic conferences. This led him to begin studying linguistics with Edward Sapir at Yale University while still maintaining his day job at the Hartford Fire Insurance Company. ...
After his death from cancer in 1941 his manuscripts were curated by his linguist friends who also worked to spread the influence of Whorf's ideas on the relation between language, culture and cognition. Many of his works were published posthumously in the first decades after his death. In the 1960s Whorf's views fell out of favor and he became the subject of harsh criticisms by scholars who considered language structure to primarily reflect cognitive universals rather than cultural differences. Critics argued that Whorf's ideas were untestable and poorly formulated and that they were based on badly analyzed or misunderstood data. In the late 20th century, interest in Whorf's ideas experienced a resurgence, and a new generation of scholars began reading Whorf's works, arguing that previous critiques had only engaged superficially with Whorf's actual ideas, or had attributed him ideas he had never expressed. 
The field of linguistic relativity studies remains an active focus of research in psycholinguistics and linguistic anthropology, and continues to generate debate and controversy between proponents of relativism and proponents of universalism. By comparison Whorf's other work in linguistics, the development of such concepts as the allophone and the cryptotype, and the formulation of "Whorf's law" in Uto-Aztecan historical linguistics, have met with broad acceptance....

Whorf graduated from the Massachusetts Institute of Technology in 1918 with a degree in chemical engineering, where his academic performance was of average quality. In 1920 he married Celia Inez Peckham, who became the mother of his three children, Raymond Ben, Robert Peckham and Celia Lee.[4] Around the same time he began work as a fire prevention engineer (an inspector) for the Hartford Fire Insurance Company. He was particularly good at the job and was highly commended by his employers. His job required him to travel to production facilities throughout New England to inspect them. In one anecdote his arrival at a chemical plant is described in which he was denied access by the director because he would not allow anyone to see the production procedure which was a trade secret. Having been told what the plant produced, Whorf wrote a chemical formula on a piece of paper, saying to the director: "I think this is what you're doing". The surprised director asked Whorf how he knew about the secret procedure, and he simply answered: "You couldn't do it in any other way."[5] Whorf helped to attract new customers to the Fire Insurance Company; they favored his thorough inspections and recommendations. 
Another famous anecdote from his job was used by Whorf to argue that language use affects habitual behavior.[6] Whorf described a workplace in which full gasoline drums were stored in one room and empty ones in another; he said that because of flammable vapor the "empty" drums were more dangerous than those that were full, although workers handled them less carefully to the point that they smoked in the room with "empty" drums, but not in the room with full ones. Whorf explained that by habitually speaking of the vapor-filled drums as empty and by extension as inert, the workers were oblivious to the risk posed by smoking near the "empty drums".

Did Whorf ever solve the "empty drums" problem?

There was an even worse fire safety problem with the English-language word "inflammable," which means "easily set on fire," but seems like it might well mean "incapable of being set on fire." You really don't want confusion over that when dealing with overturned tanker-trailers, so the word "inflammable" has largely been abandoned in America in favor of the more easily grasped neologism "flammable."
 

McWhorter: Language doesn't affect how you think

More from the Edge question about which ideas popular in science should be kicked to the curb: linguist John McWhorter attacks a theory on the Nurture side of the Nature-Nurture divide.
John McWhorter 
Professor of Linguistics and Western Civilization, Columbia University; Cultural Commentator; Author, Doing Our Own Thing 
Languages Conditioning Worldviews 
Since the 1930s when Benjamin Lee Whorf was mesmerizing audiences with the idea that the Hopi people's language channeled them into a cyclical sense of time, the media and university classrooms have been often abuzz with the idea that the way your language works gives you a particular worldview.

There are two closely related ideas here:

1. Thought is affected by language form, such as differences in grammar

2. Thought is affected by language content, most notably differences in vocabulary

Content differences include Franz Boas's famous contention that the Eskimos have a gazillion words for snow. Wikipedia says about Eskimos and snow:
The claim that Eskimo languages have an unusually large number of words for snow is a widespread idea first voiced by Franz Boas and often used as a cliché when writing about how language may keep us more or less alert to the differences of the natural world. In fact, the Eskimo–Aleut languages have about the same number of distinct word roots referring to snow as English does, but the structure of these languages tends to allow more variety as to how those roots can be modified in forming a single word.[1][2]

But English has a large number of words for snow, too, as you'll note during the upcoming Winter Olympics coverage. So it's silly to be surprised that the vocabulary of a tiny illiterate culture isn't larger than that of the huge literate culture that produced the Oxford English Dictionary. A more apples-to-apples comparison would contrast the number of words for snow in an Eskimo language with the number of words for the white stuff on top of Kilimanjaro in the language of a small African tribe.

Most languages appear to be fairly elaborate in form, but languages differ wildly in quantity of content. The Oxford English Dictionary, for instance, features 600,000 words.

Orwell depicted Newspeak in 1984 as an attempt to reduce vocabulary to conceptually impoverish the subjects of the tyranny: e.g., the Declaration of Independence translates into simply "crimethink."

For example, in the Edge essays, numerous scientists on the social engineer side of the spectrum rail against the nature-nurture conceptual framework devised by Shakespeare and Galton as something that should be permanently retired. Why? They offer lots of reasons, but a basic reason is that "nature-nurture" makes it easier for citizens to think skeptically about social engineering plans like the Obama-de Blasio notion of fighting income inequality with universal pre-K, so junking the phrase "nature and nurture" would help intellectually disarm taxpayers.

McWhorter goes on:
You just want this to be true, but it isn't—at least in a way that anyone would be interested in it outside of a psychology laboratory (or academic journal).

Or as McWhorter later points out, not in ways we're supposed to be interested in. That would seem like a very different thing, but it is increasingly the same thing to modern people as crimestop -- the predilection for feeling bored by potentially subversive trains of thought -- becomes more beaten into contemporary skulls.
It's high time thinking people let go of the idea, ever heralded as a possibility but never actually demonstrated, that different languages represent different ways of experiencing life.
Different cultures represent different ways of experiencing life, to be sure. And part of a culture is having words and expressions to express it, to be sure. Cell phone. Inshallah. Feng shui. But this isn't what Whorfianism, as it is often called, is on to. The idea is that quiet things in a language's very structural architecture—how its grammar works, how its vocabulary happens to cut up space—channel how the speaker experiences life.
And in fact, psychologists have indeed shown that such things do influence thought—in tiny ways elicitable via fascinatingly peculiar experiments. So, Russian has different words for dark and light blue and no one word that just means blue, and it has been shown that Russians are, indeed, 124 milliseconds faster at identifying grades of dark blue to other ones and grades of light blue to other ones. Or, it has been shown that people whose languages divide nouns into masculine and feminine categories are more likely, if asked, to imagine those things talking in the appropriately sexed voice if they were cartoon characters, or to associate them with gendered traits. 
This kind of thing is neat—but the question is whether the quiet background flutterings of awareness they document can be treated as a worldview. The temptation is endless to suppose that it does. Plus we are always reminded that no one has said that language prevents a speaker from thinking anything, but rather that it makes it more likely that the speaker will. 
... In the early eighties, psychologist Alfred Bloom, following the Whorfian line, did an experiment suggesting that Chinese makes its speakers somewhat less adept at processing hypothetical scenarios than English speakers. 

After all, nobody ever noticed that the Chinese tend to be pretty concrete in their thinking compared to, say, the Hindus or the Ancient Greeks. Oh, except that tends to be everybody's impression. (Whether it stems from language or not is another question, and which aspects of language would be involved is a third ...)
Whoops—nobody wanted to hear that.

Kind of a general problem with the human sciences these days: there are lots of things nobody wants to hear.
There was a long train of rebuttals, ending in an exhausted draw.

In other words, Bloom's argument apparently wasn't disproved despite strongly motivated attempts to do so. (This doesn't mean it was proven, just that it was still standing after the spirit of the age took its best shots at it.)
But there are all kinds of experiments one could do that would lead to the same kind of place. Lots of languages in New Guinea have only one word for eating, drinking, and smoking. Does that make them slightly less sensitive to the culinary than other people?

Are you implying that Papuan cuisine isn't quite as sophisticated as Italian or Cantonese? That a French sommelier may come equipped with a more sophisticated vocabulary for thinking about wine than a New Guinea highlander?

Nobody wants to hear that!
Or, Swedish doesn't have a word for wipe—you have to erase, take off, etc. But who's ready to tell the Swedes they don't wipe? 
In cases like this our natural inclination is to say that such things are just accidents, and that whatever wisp of thought difference an experimenter could elicit on their basis hardly has anything to do with what the language's speakers are like—or what their worldview is. But then, we have to admit the same thing about the wisps that happen to tickle our fancies.

No, there is an obvious difference between the two examples.
What creates a worldview is culture—i.e., a worldview. And no, it won't work to say that culture and language create a worldview together holistically. 

How do we know? For example, consider ancient Greece's transformation from barely literate in 700 BC to philosophically sophisticated in 350 BC. The Greek vocabulary developed tremendously during this period, as you might imagine. Richard Nisbett argued in The Geography of Thought that ancient Greek was peculiarly well-adapted to coining new conceptual words, a role that it continues to play today in the coining of new scientific and technological terms.

Is Nisbett right? This stuff's over my head. But I wouldn't rule it out. If McWhorter's upcoming book demolishes Nisbett's arguments that Japanese speakers seem to be better at seeing context and English speakers seem to be better at "object-oriented" thinking, with Japanese speakers raised in the U.S. falling in between, then good for him. But, so far, I'd consider Nisbett's argument viable if unproven.
Remember, that would mean that Chinese speakers are—holistically—a little dim when it comes to thinking beyond reality. 
Who wants to go there?

Phrased conversely: the Chinese tend to be particularly sharp at thinking about current reality; they seem to devote a higher proportion of their mental horsepower to the palpable here-and-now.
Especially when even starting to, decade after decade, leads us down blind alleys? Hopi, it turned out, has plenty of markers of good old-fashioned European-style time. ...

A lot of anthropological examples turn out not to be very good since it's so hard to check up on something about some small tribe. Plus, there's the simple brute fact that a lot of languages of small illiterate tribes tend to be conceptually impoverished because the tribesmen don't think abstractly very often, and their brightest intellects who do come up with abstractions can't write them down to communicate them over time to future very bright fellows who would be on their wavelengths. So, the brightest illiterate sages end up playing a game of Telephone with their abstractions, with generally depressing results.

But anthropologists frequently feel the need to gloss over this with highbrow explanations of the tribe's alternative abstractions. (I'm not saying this is the case for the Hopi-Whorf tale in particular. Benjamin Whorf, by the way, was an interesting guy: an MIT chemical engineer who was the top chemical factory fire prevention inspector for a big insurance company, and who took occasional breaks to go to Mexico to study indigenous languages. It's hard not to see his death from cancer at age 44 as a real loss to the human sciences: who knows what he would have come up with had he lived long enough to turn to linguistics full time.)
What it comes down to is this. Let's ask how English makes a worldview. Our answer requires that the worldview be one shared by Betty White, William McKinley, Amy Winehouse, Jerry Seinfeld, Kanye West, Elizabeth Cady Stanton, Gary Coleman, Virginia Woolf and Bono. 
Let's face it, what worldview would that be?

I don't know, but I suspect a lot of Frenchmen, much less Japanese, would have an opinion on the subject.

Evelyn Waugh's theory was that English had evolved not to be precise like French (whose limited vocabulary seems to encourage the French to ring dazzling changes on a fairly fixed set of themes about the Pleasures of Being French -- French radical philosophers tend to be dazzling but oddly conservative: underneath all the novelty is a core conviction that the highest form of life is to live in France, and the highest form of being French is to sit in a sidewalk cafe and philosophize), nor thorough like German (whose ability to fuse every sentence into giant words may encourage German pedantry and profundity), but to sound good -- to be a language for poets to weave their spells.
Sure, a lab test could likely tease out some infinitesimal squeak of a perceptive predilection shared by all of those people. But none of us would even begin to think of it as a way of perceiving the world or reflecting a culture. Or, if anyone would, then we are on to an entirely new academic paradigm indeed.

Perhaps we can broaden Waugh's notion to include playwrights, politicians, salesmen, celebrities, rock stars, actors, rappers, comedians, and the like. Which, now that I think of it, pretty much covers "Betty White, William McKinley, Amy Winehouse, Jerry Seinfeld, Kanye West, Elizabeth Cady Stanton, Gary Coleman, Virginia Woolf and Bono."

To take a more cynical view than Waugh's: perhaps English has turned out to be the finest language in practice for salesmen and other BS artists to use to infiltrate their ideas into the minds of others -- the ideal language for world domination?
   

January 17, 2014

Sapolsky: Nature and Nurture are obsolete

In the 2014 Edge confab of clients of science book agent John Brockman, seers and sages offer their views on: What scientific concepts should be put out to pasture? Robert Sapolsky argues against nature v. nurture as somewhat distinct concepts:
Robert Sapolsky
Neuroscientist, Stanford University; Author, Monkeyluv
[Anti-] Heights And Lengths And Areas Of Rectangles 
... But what I am focusing on [as bad] is a phrase that is right in the narrow sense, but carries very wrong connotations. This is the idea of "a gene-environment interaction." 
The notion of the effects of a particular gene and of a particular environment interacting was a critical counter to the millennia-old dichotomy of nature versus nurture. Its utility in that realm most often took the form of, "It may not be all genetic—don't forget that there may be a gene-environment interaction," rather than, "It may not be all environmental—don't forget that there may be a gene-environmental interaction."

Uh, no, the dominant mindset in the second half of the 20th Century, one which still reigns implicitly on almost all parts of campuses other than those explicitly dealing with genetics, was that the burden of proof was on advocates of any role for nature.
The concept was especially useful when expressed quantitatively, in the face of behavior geneticists' attempts to attribute percentages of variability in a trait to environment versus to genes. It also was the basis of a useful rule-of-thumb phrase for non-scientists – "but only if": you can often say that Gene A causes Effect X, although sometimes it is more correct to say that Gene A causes Effect X, but only if it is in Environment Z. In that case, you have something called a gene-environment interaction. 
What's wrong with any of that? It's an incalculably large improvement over "nature or nurture?", especially when a supposed answer to that question has gotten into the hands of policy makers or ideologues.

You notice how the phrase "nature or nurture" is now denounced as hopelessly simplistic by the ideological descendants of the simplistic and failed orthodoxy of "only nurture, never nature"? The moderates like Galton (coiner of the phrase "nature and nurture") who saw both as important turned out to be right. So now it's considered crucial to obfuscate this highly useful bit of conceptual shorthand.

For example, the Great and the Good now want to "fight inequality" through "universal pre-K public schooling." Leaving aside the overlooked issue of the enormous lag time before the nurture of 4-year-olds could -- even theoretically -- have much impact on, say, the Forbes 400 (average age 66), this is very much a nature-nurture question. The proposed policies of Barack Obama and Bill de Blasio rest on two assumptions:

- that the nature-nurture balance in 21st Century American life is tipped so far toward nurture that income inequality several decades hence can be substantially affected by a change in nurture at age 4;

- that the nurture provided by American four-year-olds' loved ones is so far inferior to the nurture that would be provided by government employees that the switch is worthwhile (an assumption, by the way, tested in boarding schools for the indigenous in Australia, Canada, and America in the early 20th Century, with unencouraging results).
My problem with the concept is with the particularist use of "a" gene-environment interaction, the notion that there can be one. This is because, at the most benign, this implies that there can be cases where there aren't gene-environment interactions. Worse, that those cases are in the majority. Worst, the notion that lurking out there is something akin to a Platonic ideal as to every gene's actions—that any given gene has an idealized effect, that it consistently "does" that, and that circumstances where that does not occur are rare and represent either pathological situations or inconsequential specialty acts. Thus, a particular gene may have a Platonically "normal" effect on intelligence unless, of course, the individual was protein malnourished as a fetus, had untreated phenylketonuria, or was raised as a wild child by meerkats. 
The problem with "a" gene-environment interaction is that there is no gene that does something. It only has a particular effect in a particular environment, and to say that a gene has a consistent effect in every environment is really only to say that it has a consistent effect in all the environments in which it has been studied to date. This has become ever more clear in studies of the genetics of behavior, as there has been increasing appreciation of environmental regulation of epigenetics, transcription factors, splicing factors, and so on. And this is most dramatically pertinent to humans, given the extraordinary range of environments—both natural and culturally constructed—in which we live.

But are the ranges of typical environments to be affected by Universal Pre-K all that enormous? At present, the handful of children discovered to be kept chained to the water heater in the basement are taken away and put in foster care for better nurture, so Universal Pre-K shouldn't be evaluated against worst case scenarios.

Reading between the lines of the countless articles about how poor children are 30 million words behind, it's clear that the picture liberals have in their minds of whom Universal Pre-K will rescue from inequality is black children being raised by a combination of tired grandmothers and surly welfare mothers with the TV on and various babydaddies and boyfriends showing up and then disappearing. That's not a good environment, but it's not being raised by meerkats, either.

The relative impacts of nature and nurture on the expected value vs. the cost of Universal Pre-K are very much empirical questions. But many prefer to obfuscate rather than to try to clarify.
The problem with "a gene-environment interaction" is the same as asking what height has to do with the area of a rectangle, and being told that in this particular case, there is a height/length interaction.

Let's look at a rectangular analogy to Universal Pre-K. You are a downtown real estate developer. The most desirable block downtown is covered with skyscrapers doing a booming business, except for one parking lot with a street frontage of 100 feet. (The real world is, unfortunately, three-dimensional while the rectangle analogy is two-dimensional, so let's just assume away the third dimension for the sake of simplicity: assume the depth of the parking lot is the same as the depth of all the neighboring skyscrapers on the block and that you would build over the entire depth.) The owner offers the parking lot for sale for $10 million. Do you buy it?

Well, the length of the lot is fixed, rather like nature in the short run, so the one thing in question in your mind is the height (nurture's stand-in in this example) of the building you would want to erect on the lot. If you can get the permits and financing to build a 75-story building, you will make a fortune. If you can only get the permits and financing to build a 5-story building, you will lose a fortune. So, your informed judgment about potential height (i.e., nurture), the one variable that is, as it were, up in the air, is absolutely central to your decision-making process. 
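To put rough numbers on that decision -- a minimal sketch, where the $10 million price and 100-foot frontage come from the example above, but the construction cost, finished value, and lot depth are placeholder assumptions of mine:

```python
# Back-of-envelope version of the parking-lot decision. Area = length x
# height is a pure "interaction," yet the fixed factor (frontage, like
# nature in the short run) doesn't make the variable factor boring:
# forecasting the variable one IS the decision.

FRONTAGE_FT = 100          # fixed, from the example
DEPTH_FT = 100             # assumed: same depth as the neighbors
LOT_PRICE = 10_000_000     # asking price, from the example
COST_PER_SQFT = 400        # placeholder construction cost
VALUE_PER_SQFT = 500       # placeholder finished value

SQFT_PER_STORY = FRONTAGE_FT * DEPTH_FT

def profit(stories: int) -> float:
    """Profit = (value - cost) per square foot, times total square
    footage, minus the price of the land."""
    return stories * SQFT_PER_STORY * (VALUE_PER_SQFT - COST_PER_SQFT) - LOT_PRICE

for stories in (5, 75):
    print(f"{stories:>2} stories: ${profit(stories):,.0f}")
# 5 stories loses $5 million; 75 stories clears $65 million. Same lot,
# same length/height "interaction" -- the whole decision turns on the
# one free variable, height (nurture's stand-in).
```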

Professor Sapolsky then calls you up to explain that it is boring and trivial to think about height/length interaction and that all the most sophisticated thinkers are far beyond that. 

You hang up on him.
  

Malthus wasn't wrong, he was late

At the annual Edge question (this year: What scientific idea should be retired?), Matt Ridley writes:
Matt Ridley 
Science Writer; Founding chairman of the International Centre for Life; Author, The Rational Optimist 
Malthusianism 
T. Robert Malthus (he used his middle name) thought population must outstrip food supply and "therefore we should facilitate, instead of foolishly and vainly endeavouring to impede," disease, hunger and war. We should "court the return of the plague" and "particularly encourage settlements in all marshy and unwholesome situations". This nasty idea—that you had to be cruel to be kind to prevent population growing too fast for food supply—directly influenced heartless policy in colonial Ireland, British India, imperial Germany, eugenic California, Nazi Europe, Lyndon Johnson's aid to India and Deng Xiaoping's China. It was encountering a Malthusian tract, The Limits to Growth, that led Song Jian to recommend a one-child policy to Deng. The Malthusian misanthropic itch is still around and far too common in science. 
Yet Malthus and his followers were wrong, wrong, wrong.

Gregory Clark's A Farewell to Alms documents, using English public records such as wills from 1200 to 1800, that the English over these 600 years were using Malthus's 1798 advice to engage in "moral restraint" avant la lettre. The chief mechanism was not exposing babies on mountainsides or whatever, but was simply delaying marriage until a couple could afford the various accoutrements appropriate for their class.

How different in that regard was Jane Austen's world from today's? Somewhat, but the similarities should be obvious.

During these centuries, the average Englishwoman first married between ages 24 and 26. In contrast, the average Chinese woman married around 18. Thus, the Chinese population would grow faster, but tended to collapse when good government broke down.
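A back-of-envelope illustration of why those seven years of delay matter -- the birth-spacing and fertility-window numbers here are stylized assumptions of mine, not Clark's data:

```python
# Stylized arithmetic only: later marriage trims years of childbearing
# exposure, and the resulting gap in growth rates compounds each
# generation.

BIRTH_SPACING_YRS = 2.5   # assumed average interval between births
FERTILITY_END = 40        # assumed end of childbearing

def births(age_at_marriage: int) -> float:
    return (FERTILITY_END - age_at_marriage) / BIRTH_SPACING_YRS

for place, age in [("England", 25), ("China", 18)]:
    print(f"{place}: marry at {age} -> ~{births(age):.1f} births")
# England ~6.0 vs. China ~8.8 -- before mortality, a large difference
# in growth rates, which is why the Chinese population could boom
# under good government and then crash when government broke down.
```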

In contrast, most of sub-Saharan Africa didn't have to worry about Malthusian traps until fairly recently. Population density outside of a few nice highland locations tended to be well below the agricultural capacity of the enormous amount of land. Diseases, competition with co-evolving wild animals (especially elephants, which consumed crops if there weren't enough people around to drive them off), and lack of fortifications meant that much of Africa tended to be underpopulated. The great African fear was not overpopulation of a region, but humans dying out altogether in an area.

Thus, while European culture tended to encourage sexual restraint, African culture tended to encourage sexual exuberance -- a pattern we can still see today in America.

Much the same system as the English had was at work in the U.S., although that has been largely forgotten due to the tendency of the still-dominant Sixties Folks to assume that the Fifties represented How Things Were Done Since Civilization Began. In reality, the very young average age of first marriage for American women in the 1950s compared to previous decades represented a zenith of mass prosperity.

The sheer numbers of the giant Baby Boom, combined with the technological failure to continue to progress to even faster personal transport such as flying cars (which would have vastly increased the supply of suburban land), quickly brought us back to a more historically common situation of delayed marriage.

Richard Dawkins contra essentialism

Every January, literary agent John Brockman gets his authors of popular science books to write short essays answering a question for his Edge website. This year's is: What scientific idea is ready for retirement?

Richard Dawkins contra Essentialism:
Essentialism—what I’ve called "the tyranny of the discontinuous mind"—stems from Plato, with his characteristically Greek geometer’s view of things. For Plato, a circle, or a right triangle, were ideal forms, definable mathematically but never realised in practice. A circle drawn in the sand was an imperfect approximation to the ideal Platonic circle hanging in some abstract space. That works for geometric shapes like circles, but essentialism has been applied to living things and Ernst Mayr blamed this for humanity’s late discovery of evolution—as late as the nineteenth century. If, like Aristotle, you treat all flesh-and-blood rabbits as imperfect approximations to an ideal Platonic rabbit, it won’t occur to you that rabbits might have evolved from a non-rabbit ancestor, and might evolve into a non-rabbit descendant. If you think, following the dictionary definition of essentialism, that the essence of rabbitness is "prior to" the existence of rabbits (whatever "prior to" might mean, and that’s a nonsense in itself) evolution is not an idea that will spring readily to your mind, and you may resist when somebody else suggests it. ...
Essentialism rears its ugly head in racial terminology. The majority of "African Americans" are of mixed race.

Sure, but the great majority are majority sub-Saharan in ancestry. The minority who weren't used to prefer the Latin-style arrangement found in New Orleans, where they considered themselves a middle group, but in the second half of the 20th Century public expression of such views became unpopular -- out of idealistic solidarity with the black masses and/or because leading the black masses was a convenient way to prosper.
Yet so entrenched is our essentialist mind-set, American official forms require everyone to tick one race/ethnicity box or another: no room for intermediates.

No, actually, since the 2000 Census, the U.S. government allows people to tick as many of the racial boxes as they want. I believe there are 63 possible combinations.
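That 63 checks out, assuming the six race categories the 2000 Census used: it's the number of ways to tick at least one of six boxes.

```python
# 63 = the number of non-empty subsets of six checkboxes: 2**6 - 1.
from itertools import combinations

BOXES = ["White", "Black", "American Indian/Alaska Native", "Asian",
         "Native Hawaiian/Pacific Islander", "Some Other Race"]

combos = [c for k in range(1, len(BOXES) + 1)
          for c in combinations(BOXES, k)]
print(len(combos))         # 63
print(2**len(BOXES) - 1)   # same count, the closed form
```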

On the 2010 Census, the President of the United States chose to ignore his mother's half of his family and tick only the "African-American" box.

The hilariously essentialist Census category is Ethnicity, where you are either "Hispanic" or "Non-Hispanic." Are you a Congregationalist minister and member of the Myopia Hunt Club? Non-Hispanic! Are you a Tamil Brahmin? Non-Hispanic! Are you a Maori character actor? Non-Hispanic! (Oh, wait, Cliff Curtis mostly plays Hispanics ... and Arabs ...)
A different but also pernicious point is that a person will be called "African American" even if only, say, one of his eight great grandparents was of African descent. As Lionel Tiger put it to me, we have here a reprehensible "contamination metaphor."

Or, these days, it's a good way to get ahead in the world, as the career of the current President shows.
But I mainly want to call attention to our society’s essentialist determination to dragoon a person into one discrete category or another. We seem ill-equipped to deal mentally with a continuous spectrum of intermediates. We are still infected with the plague of Plato’s essentialism.

Lawyers look for "bright-line" distinctions: you are either old enough to drink or you are not. You are eligible for affirmative action or you are not. The government and the culture have been rewarding certain racial groups, so it's hardly surprising that somebody who understands the modern system -- such as, to pick a random example, Barack Obama -- will officially identify solely with the race that is, for a preppie from paradise, all else being equal, the more legally and culturally privileged one.

Something else to keep in mind is that there is one irreducible essence in human affairs that in practice surprisingly resembles a Platonic archetype: the structure of your biological family tree. Every individual has one father and one mother, two grandfathers and two grandmothers, and so on. If you draw out the shape of the family tree of your ancestors, it is exactly the same shape as every other human's in the world. It's Platonic perfection.

The only thing messy about this is the inevitable inbreeding -- 40 generations back you have roughly a trillion slots to fill in your family tree, but there weren't a trillion people around to fill it, so some (many) of your ancestors fill multiple slots in your Platonic family tree.
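The arithmetic behind that "roughly a trillion": ancestor slots double every generation back, so at 40 generations you have 2^40 of them -- far more than the people alive a millennium ago, which is what forces the repeats (what genealogists call pedigree collapse).

```python
# Ancestor slots at generation n back: 2**n. Forty generations is
# roughly 1,000-1,200 years at 25-30 years per generation.

for n in (10, 20, 30, 40):
    print(f"generation {n:>2}: {2**n:,} slots")
# generation 40: 1,099,511,627,776 slots -- vs. a medieval world
# population in the low hundreds of millions, so many ancestors
# necessarily occupy multiple slots in the Platonic tree.
```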

Racial groups -- or partly inbred extended families -- emerge from this tension between the Platonic purity of the structure and the messy reality of the names filling the structure.
   

Gregory Clark: "The Son Also Rises"

An upcoming book by economic historian Gregory Clark, author of A Farewell to Alms:
The Son Also Rises: Surnames and the History of Social Mobility (The Princeton Economic History of the Western World) [Kindle Edition] 
Gregory Clark (Author) 
Print List Price: $29.95
Kindle Price: $16.17 
Publication Date: February 23, 2014 
How much of our fate is tied to the status of our parents and grandparents? How much does this influence our children? More than we wish to believe. While it has been argued that rigid class structures have eroded in favor of greater social equality, The Son Also Rises proves that movement on the social ladder has changed little over eight centuries. Using a novel technique—tracking family names over generations to measure social mobility across countries and periods—renowned economic historian Gregory Clark reveals that mobility rates are lower than conventionally estimated, do not vary across societies, and are resistant to social policies. The good news is that these patterns are driven by strong inheritance of abilities and lineage does not beget unwarranted advantage. 
The bad news is that much of our fate is predictable from lineage. Clark argues that since a greater part of our place in the world is predetermined, we must avoid creating winner-take-all societies. 
Clark examines and compares surnames in such diverse cases as modern Sweden, fourteenth-century England, and Qing Dynasty China. He demonstrates how fate is determined by ancestry and that almost all societies—as different as the modern United States, Communist China, and modern Japan—have similarly low social mobility rates. These figures are impervious to institutions, and it takes hundreds of years for descendants to shake off the advantages and disadvantages of their ancestors. For these reasons, Clark contends that societies should act to limit the disparities in rewards between those of high and low social rank. 
Challenging popular assumptions about mobility and revealing the deeply entrenched force of inherited advantage, The Son Also Rises is sure to prompt intense debate for years to come.
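Clark's central claim can be sketched as a simple first-order process: a family's latent status regresses to the mean at some persistence rate b per generation, with his surname data suggesting b is around 0.75 rather than the 0.4 or so implied by one-generation income correlations. A toy version (the parameter values are my reading of the book's argument, not Clark's code):

```python
# Toy model of surname-based mobility: the mean latent status of an
# elite surname group decays geometrically, x -> b * x, since
# idiosyncratic noise averages out across a large group.

def generations_to_fade(b: float, start: float = 2.0,
                        threshold: float = 0.1) -> int:
    """Generations until a group starting `start` standard deviations
    above the mean falls below `threshold`."""
    x, n = start, 0
    while x > threshold:
        x *= b
        n += 1
    return n

for b in (0.4, 0.75):
    print(f"b = {b}: ~{generations_to_fade(b)} generations to fade")
# b = 0.4 -> ~4 generations (about a century); b = 0.75 -> ~11
# generations (about three centuries) -- and slower still for any
# higher b, which is how a surname's status signal can linger for
# centuries.
```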

One of the things that weirded English people out about Margaret Thatcher being Prime Minister was that even after 700 years or so, "Thatcher" was still a pretty downscale name. In contrast, here is a list of Anglo-Norman names from 1066 and all that: Fitzgerald, Mandeville, Percy, Baskerville, Beaumont, Curzon, Grosvenor, Longchamp, Warren, etc.

A lot of heiresses and their mothers have plotted intensely over the centuries in England to marry guys with classy-sounding names. For example, according to the speculation of biographer William Manchester, the genetic reinvigoration of the Churchill line after the half dozen generations following the spectacular John Churchill, first Duke of Marlborough, was due to Winston's paternal grandmother and to his mother, the daughter of a self-made New York millionaire. Blenheim Palace and its grounds (by Capability Brown) make a romantic spot for wooing heiresses.

Future Gregory Clark titles to follow A Farewell to Alms and The Son Also Rises will hopefully include:

For Whom the Bell Curve Tolls
The Old Man and the g
A Provable Least
Depose? No, Kill Him Tomorrow
A Clean, Well-Lighted Race
The Big Uncharted Giver
To Have and Have Not (okay, I can't think of any more bad or nonsensical puns, but Hemingway's cheesiest novel sounds like an economics text anyway)