September 3, 2011

Ed Tech

From the NYT:
In Classroom of Future, Stagnant Scores
By MATT RICHTEL 
CHANDLER, Ariz. — Amy Furman, a seventh-grade English teacher here, roams among 31 students sitting at their desks or in clumps on the floor. They’re studying Shakespeare’s “As You Like It” — but not in any traditional way. 
In this technology-centric classroom, students are bent over laptops, some blogging or building Facebook pages from the perspective of Shakespeare’s characters. One student compiles a song list from the Internet, picking a tune by the rapper Kanye West to express the emotions of Shakespeare’s lovelorn Silvius. 
The class, and the Kyrene School District as a whole, offer what some see as a utopian vision of education’s future. Classrooms are decked out with laptops, big interactive screens and software that drills students on every basic subject. Under a ballot initiative approved in 2005, the district has invested roughly $33 million in such technologies. 
The digital push here aims to go far beyond gadgets to transform the very nature of the classroom, turning the teacher into a guide instead of a lecturer, wandering among students who learn at their own pace on Internet-connected devices. 
“This is such a dynamic class,” Ms. Furman says of her 21st-century classroom. “I really hope it works.” 
Hope and enthusiasm are soaring here. But not test scores. 
Since 2005, scores in reading and math have stagnated in Kyrene, even as statewide scores have risen. ...

First of all, 7th-graders shouldn't be reading Shakespeare. He's too hard for them. Maybe 9th-graders should read Julius Caesar, a play with a much simpler style. But Shakespeare's comedies are hard. Also, they aren't very funny anymore: King Lear is still really, really sad, but As You Like It is not really funny anymore.
Larry Cuban, an education professor emeritus at Stanford University, said the research did not justify big investments by districts. 
“There is insufficient evidence to spend that kind of money. Period, period, period,” he said. “There is no body of evidence that shows a trend line.” 
Some advocates for technology disagree. 
Karen Cator, director of the office of educational technology in the United States Department of Education, said standardized test scores were an inadequate measure of the value of technology in schools. Ms. Cator, a former executive at Apple Computer, said that better measurement tools were needed but, in the meantime, schools knew what students needed.

Okay, but why is Ms. Cator a former executive at Apple Computer? If the K-12 market is as promising as she thinks, wouldn't her former boss Steve Jobs have made sure to keep Apple in it? After all, he has a pretty good nose for the next big thing. Moreover, a quarter of a century ago, Apple was, to a large extent, the chief K-12 technology company. That was its strong suit in 1985. Since his return to Apple in the mid-1990s, Jobs has largely abandoned K-12 for the well-educated grown-up market, with vast success. 

On the other hand, I think there are opportunities to help kids learn better in K-12 with technology. Intelligent drilling is what computers can do well. And the iPad looks like a particularly good form factor. But most of the software currently available for K-12 is lousy, and most of the people buying K-12 software aren't very good either.

Interning

A headline from the LA Times:
Rug store owner showed porn to teenage intern, police say

Wait a minute ... they have interns now in the rug business? Is nothing sacred anymore? I understand that the practice of interning -- i.e., young people from affluent families working for free to get a networking leg up on the competition -- had infested many industries. But, I had assumed, the venerable rug merchant business, which has been a byword for thousands of years of being in it for the money, would not stoop so low.

Obamamania in perspective

Brent Staples reviews Randall Kennedy's book on Obama for the NYT:
Every campaign enlists its own meta-language. As Randall Kennedy reminds us in his provocative and richly insightful new book, “The Persistence of the Color Line: Racial Politics and the Obama Presidency,” the Obama forces disseminated several messages intended to soothe the racially freighted fears of the white electorate. On one channel, they reassured voters that he was not an alien, but a normal American patriot. They also made clear that he was a “safe,” conciliatory black man who would never raise his voice in anger or make common cause with people, living or dead, who used race as a platform for grievance. On yet another wavelength, the candidate proffered his bona fides as a black man to ­African-Americans who were initially wary of his unusual upbringing, his white family ties and his predominantly white political support. 
The press viewed this courtship of black voters as largely beside the point for a “post-racial” campaign that had bigger fish to fry on the white side of the street. Kennedy, who teaches law at Harvard, is having none of that. He argues with considerable force that the candidate deliberately set out to blacken himself in the public mind — while taking care not to go too far — and would have lost the election had he not done so. He sees Obama’s courtship of black voters not as tertiary, but as the main event and as the perfect vantage from which to view the campaign and the presidency. 
“The Persistence of the Color Line” consists of an introduction and eight inter­related essays that offer a fresh view of events that had prematurely taken on the cast of settled history. One essay, “The Race Card in the Campaign of 2008,” lays out an exacting standard for determining when the charge of race baiting is appropriate and applies it to several statements that were labeled as racist, or at least nearly so, during the last presidential campaign. Kennedy praises the Republican nominee, John McCain (he “imposed upon himself a code of conduct that precluded taking full advantage of his opponent’s racial vulnerability”), and redeems the former Democratic vice-presidential candidate Geraldine Ferraro, who was run out of the Clinton campaign essentially for saying what was indisputably true: Obama’s blackness mattered to his stature as a candidate. Without it, he would never have appealed so strongly “to the emotions of millions of white Americans who yearned for a moment of racial ­redemption.” 
... He sees [Rev. Jeremiah] Wright’s critique of America as excessive, but notes that it is, at bottom, more integral to the African-American worldview than was generally acknowledged during the episode. 
The messianic glow that surrounded Obama’s candidacy — Kennedy and others call it “Obamamania” — precluded closer scrutiny of his pronouncements, especially those having to do with race. The widely held notion that the now-famous race speech, “A More Perfect Union,” ranked with the Gettysburg Address or “I Have a Dream” strikes Kennedy as delusional. The speech, he writes, was little more than a carefully calibrated attempt to defuse the public relations crisis precipitated by the Wright affair. Far from frank, it understated the extent of the country’s racial divisions and sought to blame blacks and whites equally for them, when in fact, Kennedy writes, “black America and white America are not equally culpable. White America enslaved and Jim Crowed black America (not the other way around).” The speech was in keeping with the candidate’s wildly successful race strategy, which involved making white voters feel better about themselves whenever possible.
The cornerstone essay, “Obama Courts Black America,” is a breath of fresh air on many counts, not least of all because it offers a fully realized portrait of the black political opinion — left, right, center, high and low — that was brought to bear during the campaign. This is the most comprehensive document I’ve yet read on the near street fight that erupted over the question of how Obama should identify himself racially. There were those who viewed him as “too white” to be legitimately seen as black; those who had no problem with his origins; those who viewed the attempt to portray him as “mixed race” as a way of trying to “whiten” him for popular consumption; and those who accused Obama of throwing his white mother under the bus when it became clear that he regarded himself as African-American. 
Tallying votes, Kennedy reckons that it would have been political suicide for Obama to identify himself as anything other than black. This would have undermined his standing among African-Americans, whose overwhelming support he needed to win, and gained him nothing among those whites who were determined to punish him for his skin color, no matter how he described himself. 

Of course Obamamania was always all about race. Look, the guy was a state legislator as of 2004, and not even speaker, majority leader, or a whip in the Illinois legislature. He was chairman of the state senate's health and human services committee, which isn't bad, but it's not automatic Presidential Timber. Four years later, he gets elected President. If his daddy hadn't been black, he would no more have become President than if the previous guy's daddy had been Mr. Tree.

Let's say that as of 2004, Obama was about the 10th most important person in one of the 50 state legislatures in the country. With 50 governors, 100 U.S. Senators, and 435 Representatives, that means he was probably not in the top 1,000 politicians in the country four years before winning the Presidential election. Four years before getting elected President, Bush II was governor, Clinton was governor, Bush I was Veep, Reagan was ex-governor, Carter was governor, Ford was House minority leader, Nixon was ex-Veep, LBJ was Senate majority leader, JFK was U.S. Senator, Eisenhower was former supreme commander, and Truman was U.S. Senator.

September 1, 2011

Disparate Impact, Part MCXXXVII

The New York Times editorializes:
EDITORIAL
The Military and the Death Penalty 
Racism in the application of capital punishment has been well documented in the civilian justice system since the Supreme Court reinstated the penalty in 1976. Now comes evidence that racial disparity is even greater in death penalty cases in the military system. 
Minority service members are more than twice as likely as whites — after accounting for the crimes’ circumstances and the victims’ race — to be sentenced to death, according to a forthcoming study co-written by David Baldus, an eminent death-penalty scholar, who died in June. 
The analysis is so disturbing because the military has made sustained, often successful efforts to rid its ranks of discrimination. But even with this record, its failure to apply the death penalty fairly is more proof that capital punishment cannot be free of racism’s taint.

You know, if you do a study of star football running backs, after accounting for the circumstances, such as honors earned at that level, the whites will go on to be victims of racism in college, and then again in the pros. How do we know that? Because of disparate impact.

Alternatively, if one group has a bell curve shifted versus another group, the disparity between the groups tends to become more extreme the more extreme the selection criteria, whether for starring as running backs or for committing heinous crimes. But who could expect the New York Times editorial board to be familiar with, and grasp the logic of, normal probability distributions, as explained by La Griffe du Lion? Why should they? He's some pseudonymous academic.
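The tail effect is easy to verify with a few lines of arithmetic: hold the gap between two bell curves fixed and the ratio of the two groups' shares above a cutoff grows rapidly as the cutoff moves right. A minimal sketch, assuming a one-standard-deviation gap and unit variances (both are illustrative numbers, not estimates of any real-world distributions):

```python
from math import erfc, sqrt

def tail(mean, threshold):
    # P(X > threshold) for a normal distribution with the given mean and sd = 1
    return 0.5 * erfc((threshold - mean) / sqrt(2))

# Two bell curves one standard deviation apart; thresholds are in SD units
# above the higher group's mean.
for t in [0, 1, 2, 3, 4]:
    ratio = tail(0.0, t) / tail(-1.0, t)
    print(f"cutoff at {t} SD: group ratio {ratio:.1f} to 1")
```

A one-SD gap that looks modest in the middle of the curve produces roughly a 3-to-1 ratio at an average cutoff, but over 100-to-1 four standard deviations out, which is the whole point of the selection-criteria argument.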

August 31, 2011

Oak Park v. Austin

In response to my review of Bruce Norris's outstanding Chicago real estate play, Clybourne Park, James Kabala asks a good question:
What if, when the first middle-class black family or two had moved into Austin, NO ONE had panicked and sold. Wouldn't that have kept the neighborhood safe? An underclass family can't buy a house if the house isn't for sale! It seems as if the neighbors Steve commends as more practical/realistic/conservative were actually selling out the neighborhood both literally and metaphorically. If they had stayed put, there would have been no houses for anyone (white or black, middle-class or underclass, law-abiding or criminal) to buy, and the original families could have lived there indefinitely.

Norris touches upon this issue in Clybourne Park. The white family that sells out in 1959 to Lorraine "Raisin in the Sun" Hansberry's family has a personal reason to be alienated from their neighbors. Also, their real estate agent doesn't have a problem with it.  

The response suggested by James was the response of my in-laws in 1967 in the Austin neighborhood of the West Side of Chicago. They joined a liberal Catholic homeowners group founded by a priest whom my father-in-law knew from classical music circles. This priest had composed two operas about Chicago politics with librettos by Father Andrew Greeley. I had a long discussion with him about this at my father-in-law's wake. 

All the members pledged to each other not to sell out. 

This failed disastrously in Austin. My in-laws held out for three years, long after most of the members had bailed out. But after their small children had been mugged three times, they finally got out. 

The role of real estate agents should be noted: real estate agents tend to specialize in one or two neighborhoods. When prices are going up, they tend to play a constructive role. For example, in our North Lakefront neighborhood in the 1990s, the local real estate agent organized many of the parties. But, if prices are going down, real estate agents can switch to trying to cash in quick by stampeding locals into panic selling. It's an eat-the-seed-corn strategy, but egging on massive turnover can make economic sense to real estate agents (including the agents of the middle-class black families that were the first to move in). 

On the other hand, as the priest pointed out to me, a similar strategy saved Oak Park (where my father was born in 1917), just west of Austin. My father and I went to see Oak Park in 1982; he hadn't seen it since 1929. Having just read Theodore White's dismal account of visiting his old neighborhood in Boston, I expected it to be a horrible experience. Instead, Oak Park turned out to be full of architecture fans touring the lovely Frank Lloyd Wright neighborhood where my father had grown up. 

One difference between Austin and Oak Park was that Oak Park had the most historically significant domestic architecture in America. It was worth fighting to save. In contrast, while segregated Austin was a terrific place for a modest-income family to raise a bunch of kids in the early 1960s -- the population density during the Baby Boom was so high that kids just played on the sidewalks in huge groups, with lots of adults around to keep an eye on them, and little girls walked to school -- there was nothing special about it. It was just a bunch of two- and three-flats. There were a million neighborhoods like it in Chicago.

People like Matthew Yglesias who are into urbanism, public transportation, and high density should study the destruction of working class urban neighborhoods in Chicago by integration. 

The other big difference was that the conspiracy in Oak Park went all the way to the top. Real estate agents were pressured into imposing a "black a block" quota of selling to only one black family per block. That was totally illegal after the 1968 Fair Housing Act, but, apparently, it was considered in such a good cause -- preserving the home of Prairie Style architecture -- that everybody winked at it. (I compared Austin to Oak Park in more detail here.)

The general lesson from the differing fates of next-door neighbors Austin and Oak Park, as far as I can tell, is that, all things considered, it's better to live in a neighborhood full of architectural treasures inhabited by affluent and powerful people than in a neighborhood full of average buildings inhabited by average people.

There's no end to the way that nice things are nicer than not nice things.

Paul Graham

My new VDARE column is on Paul Graham, the finest essayist on what it takes to make it in Silicon Valley:
With the  school year starting up, I got to thinking about offering some avuncular advice to young people about how to make one’s way in the world. 
Fortunately, I resisted the urge. Instead, I’ll merely advise: read Paul Graham. 
For obvious reasons, I don’t offer young people much career advice. And even if I felt like it, I might not see much point in doing it myself because Graham has raised the quality bar so high over the last decade with his self-help essays.

Read the whole thing there.

My old articles are archived at iSteve.com -- Steve Sailer

August 30, 2011

"Clybourne Park" by Bruce Norris

From my new column in Taki's Magazine:
Bruce Norris’s Clybourne Park, winner of this year’s Pulitzer Prize in New York and Olivier Award in London, is the play I’ve been waiting for since the 1980s. Although Norris previously wrote six dramas for Chicago’s Steppenwolf Theatre Company, Steppenwolf will finally stage his masterpiece beginning September 8th. 
It’s a bitterly funny two-act play set in the same two-bedroom house on Chicago’s Near Northwest Side in white-flight 1959 and then in gentrifying 2009. Norris is superb at writing dialogue in How We Talk Now. While most playwrights live for eloquent speechifying, Norris’s 2009 characters converse realistically in interrupting, overlapping, and apologizing snatches. Moreover, Clybourne Park is the first work I can recall to capture precisely what urbanites talk about most obsessively (real estate); how they converse (euphemistically); and why (the 3Ls of real estate are “location, location, and location,” which in Chicago means, above all else, race).
In my 18 years in Chicago, I was involved in innumerable conversations that included the phrase, “It just needs a little tuckpointing.” Yet how many famous plays or movies center around real estate? Real estate and race?

Read the whole thing there.

August 29, 2011

Women and Silicon Valley

Jaron Lanier reminisces about the early days of Silicon Valley in the New Statesman, with an angle I'd never heard before:
There were precious few girl nerds at the time. There was one who programmed a hit arcade game called Centipede for the first video game company, Atari, and a few others. There were, however, extraordinary female figures who served as the impresarios of social networking before there was an internet. It still seems wrong to name them, because it isn't clear if I would be talking about their private lives or their public contributions: I don't know how to draw a line. 
These irresistible creatures would sometimes date alpha nerds, but mostly brought the act of socialising into a society where it probably would not have occurred otherwise. A handful of them had an extraordinary, often unpaid degree of influence over what research was done, which companies came to be, who worked at them and what products were developed. 
That they are usually undescribed in histories of Silicon Valley is just another instance of what a fiction history can be. The advent of social networking software and oceans of digital memories of bits exchanged between people has only shifted the type of fiction we accept, not the degree of infidelity.

Histories of the Enlightenment have been written from the perspective of the women who hosted the salons. 

August 28, 2011

Bush lawsuit to undermine NYC emergency services proceeds

From the NYT, more on the triumphant Vulcan disparate impact discrimination lawsuit, filed by the Bush Administration in 2007, against the Fire Department of New York.
One afternoon after the trial let out, Capt. Paul Washington, a black officer in Engine Company 234, in Brooklyn, sat in the courthouse cafeteria with Firefighter John Coombs, president of the Vulcans. An hour earlier, Captain Washington had testified about racial insults he encountered on the job: the casual flinging of the N-word and the defacement of a flier for the Vulcans’ first memorial service after 9/11. Where the guest speaker’s name was printed, someone had scribbled other names: Buckwheat, Al Sharpton, Fat Albert. 
Now, the two men explained that overt animus like that was fairly uncommon on the job. Instead, they complained of a corrosive obliviousness to race, discernible in acts as unsubtle as dinner-table condemnations of affirmative action and as seemingly innocuous as a recreation-room preference for Fox News. 
“Our experience is different,” Captain Washington said. “There’s 50 white guys in a firehouse from the same background — middle-class, Long Island, the kids play soccer together — so, yeah, they’re having a ball. But if you’re the one black guy in the house, maybe you ain’t having so much fun.”

August 26, 2011

The Inevitable

From the Washington Post in "Libyan Rebels Carry Out Reprisal Attacks:"
A few minutes’ drive from the fire station, at least 15 bodies, most of them Gaddafi’s black African supporters, lay rotting in the sun at a traffic junction outside his Bab al-Aziziyah complex. Several of the dead wore green pieces of cloth wrapped around their wrists to signal loyalty to the Gaddafi regime. 
 The men may have died during Tuesday’s battle for Bab al-Aziziyah, and several were wearing military fatigues. But not all of them looked like ordinary battlefield deaths. Two dead men lay face down on the grass, their hands bound behind their backs with plastic cuffs. 
The worst treatment of Gaddafi loyalists appeared to be reserved for anyone with black skin, whether they hailed from southern Libya or from other African countries. Darker-skinned prisoners were not getting the same level of medical care in a hospital in rebel-held Zawiyah as lighter-skinned Arab Libyans, Eltahawy said. 
Rebels say Gaddafi employed gunmen from sub-Saharan Africa to shore up his army against his own people, and those fighters have elicited intense enmity from Libyans. But many of the detainees in Zawiyah told Amnesty International they were merely migrant workers  “taken at gunpoint from their homes, workplaces and the street on account of their skin color,” Eltahawy said.

I wrote about this anti-black aspect of the Libyan uprising back on February 27. Qathaafee's pro-immigration and pan-African policies were especially unpopular with his subjects, and hence the rebels have long been taking out their ire especially on the Colonel's foreign black mercenaries. Similarly, the Bahrain uprising by the Shi'ite majority had a lot to do with the Sunni regime's policy of electing a new people by importing Sunni immigrants, typically security service types from other countries. But Bahrain is closely allied with the U.S., so its government didn't get any cruise missiles shot at it while it put down its people.

August 25, 2011

Darwin and Galton, again

Earlier, I pointed out how remarkable it is that Charles Darwin has, in recent decades, been promoted to near-divine status in our culture, while his half-cousin Francis Galton, 13 years his junior, has been demonized. 

I don't know a huge amount about the two, having merely read a few biographies. Still, the two don't strike me as polar opposites the way they seem to strike the conventional wisdom today. Instead, they seem more notable for their similarities than their differences. The two, both grandsons of Erasmus Darwin, seem like models of the Victorian gentleman amateur scientist. 

The differences between them seem fairly exiguous. Darwin wasn't as healthy as Galton, who was hugely productive from age 30 onward (after a breakdown at university), working out the math of correlation and regression in advanced middle age. James Surowiecki's book "The Wisdom of Crowds" begins with an anecdote about the discovery of his topic: an 85-year-old Galton attended a country fair where there was a contest to guess the weight of a bull. Galton collected all the guesses with the intention of demonstrating the stupidity of crowds, but discovered to his amazement that crowds could be pretty wise in situations where random errors canceled each other out. So, the octogenarian wrote up his surprising discovery and published it in Nature.
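Galton's surprise at the fair is easy to reproduce in simulation: if each guess is individually noisy but unbiased, the errors cancel and the crowd's average lands far closer to the truth than a typical guesser does. A minimal sketch, where the 75-lb guess noise is an assumption (the weight and approximate crowd size come from Galton's 1907 Nature note, "Vox Populi"):

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

TRUE_WEIGHT = 1198   # the dressed weight of the ox in Galton's account, in lb
N_GUESSERS = 800     # Galton tallied 787 legible guess cards; rounded here

# Assume each guess is unbiased but individually noisy (sd = 75 lb, an assumption).
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)
typical_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"crowd estimate: {crowd_estimate:.0f} lb (truth: {TRUE_WEIGHT} lb)")
print(f"typical individual error: {typical_error:.0f} lb")
```

The crowd's error shrinks roughly as sd/sqrt(n), so 800 independent guesses with 75 lb of individual scatter average out to within a few pounds of the truth. The cancellation fails, of course, when the guessers share a bias, which is why crowds are only wise under the right conditions.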

One difference is obvious: Galton had more ideas, while Darwin had the biggest idea of the century: natural selection. Galton always saw himself as following in Darwin's footsteps.

But, here's the thought experiment that just occurred to me: What if Galton had been born in 1809 and died in 1882, while Darwin was born in 1822 and lived until 1911? My guess would be that Galton might have eventually stumbled upon natural selection first, leaving Darwin to engage in follow-up work rather like Galton's. 

How often do fraternal twins wrongly believe themselves identical?

Here's a good article in Slate's Twins Week by writer Barry Harbaugh about getting a genetic test to see if he and his twin brother Russ, a filmmaker, are identical twins (as they've always believed) or if they might be fraternal. 

The theme of the article is something I wrote about for Taki's Magazine last year: that identical twins are more likely to incorrectly call themselves fraternal than fraternal twins are to incorrectly call themselves identical. What Freud called the narcissism of small differences operates on twins. In 2010, I concluded: "While movie twins look alike but act wildly different, real-life twins often see themselves as less similar, both in looks and personality, than they appear to strangers."

Barry Harbaugh writes:
And yet: We are not strictly identical. We have our petty discordances, which in their accumulation conspire against us. Suspicion is lured by doubt. In a giant senior-class photo of Russell and me that hangs framed in my old bedroom, we might as well be cousins. I am perched at his left shoulder, looking like my head is balancing on a drinking straw, while his own neck threatens to split his collar. We had long ago compared our fingerprints in vain. He was an All-State high-school quarterback (and an All-American in college), while I sat on the bench for a beleaguered basketball team that couldn't win even five games.

Russ was a small-college All-American at Wabash. (At first, I guessed that they were related to Jim Harbaugh, a former NFL quarterback who is now an NFL head coach and whose brother John is also an NFL head coach. But I can't find any proof of that. Still, I wouldn't be surprised if they are all part of an extended family. All these Harbaughs come from the same part of the country -- Indiana, Ohio, Michigan -- and are all motivated and talented.)
Though we've both escaped the primordial sludge of the Ohio River Valley for New York City, no one confuses us anymore. For two years, I've spent $80 a month on a pharmaceutical that will keep me from going bald. We've noticed that I'm slightly taller. That our noses have a slightly different bent. Our penmanship is at odds and his hair is falling out. ... 
We brought this upon ourselves. Russell and I tried very hard to cultivate individual interests and attitudes. Without surveying the various parenting fashions of twins in history, I might point out the school of thought that demands parents dress their identical children in matching outfits, parting their hair on opposite sides (like a mirror's reflection!), and just as well the school filled with parenting books advising the opposite.
However we came to it—whether through a mother's growing devotion to those books, or some innate desire on our part to complement each other—Russell and I have long attached ourselves to different things, and driven each other crazy with a manic desire to report in detail whatever the other missed. 

This tendency to differentiate can be fairly inevitable, especially among very ambitious twins like the Harbaughs, who are the sons of college professors. If you are named Harbaugh, the top position in sports is quarterback, and only one of you can be the starting quarterback, assuming you don't go to different high schools. If you are named, say, Barber, you can go to the NFL as a cornerback and as a running back and be roughly equally successful. But if you are named Harbaugh, well, there's quarterback, and then there's placekicker, punter, tight end, fullback, center, placekick holder, wedge-breaker-upper, waterboy, and various other football jobs that aren't anywhere near as good as being quarterback. So, it makes sense for one identical twin to break the logjam for both of them by saying, "I don't want to be quarterback. Being quarterback is stupid. I don't care about being quarterback. I want to be [something very opposite of being quarterback, like being a writer.]"

That's fairly important to grasp for thinking about twin studies of nature and nurture that compare identical and fraternal twins. Yes, it is true that other people treat identical twins more identically, on average, than fraternal twins. But identical twins have subtle stresses pushing them apart that fraternal twins are less prone to. If one fraternal twin is built like a quarterback, the odds are that the other one isn't. But if one identical twin is built like a quarterback, then the other one is pretty similar. So, they have to generate their own differentiating forces.



Back to Harbaugh:
All twins were beheld under the same banner: unusual, unexplained, and undifferentiated.
The confusion was bound with our ignorance of knowing exactly what was happening in utero. It wasn't until the latter decades of the 19th century that embryologists figured out the basic twinning process: Either two sperm fertilize two eggs, or one egg splits in two. (An earlier notion held that twins arose from two sperm that fertilized an ovum in separate places; obviously a red herring.) In 1875, the statistician Francis Galton interviewed 100 pairs of same-age, same-sex siblings, along with their close relatives, and concluded, "Twins is a vague expression." Though not quite a zygotic eureka, he found that extreme similarity among twins was just as common as a moderate resemblance, or hardly any resemblance at all. Even through the embryonic fog, it was clear something elemental divided at least the extremely similar from all the rest. By 1911, the usage of fraternal, as it relates to twins, had entered English usage, and the lexicography of mono- and dizygotic followed five years later. 
Galton's work led to the establishment of the twin method, which proved the foundation for investigations into those dubious sciences called behavioral and eugenic. It also corroborated something that would have been apparent to the era's midwives and cowboys: Not every pair of twins comes into the world trailing the same debris. If you're witness to enough twin births—among humans or cattle—you're likely to notice certain differences from one to the next; that some pairs are born with a single placenta, that others have two placentas fused together, and still more spring from the womb with one placenta each. ...

In 2002, we went off to Wabash College together, a fine all-male school devoted to forging "gentlemen" in upstate Indiana, and one that I stomached for a year before transferring home. The decision was unilateral and it stung. So too: a trip I took that summer to Europe, alone. By sheer coincidence, his football team traveled to Vienna in July, to play an exhibition, and Russell got a tattoo on his left shoulder blade from a man he met in a bar, while I watched. An R whose leg has been cleaved in half so that with a bottom curl it also conveys a B, it proved an unreciprocated mark.

I wouldn't be surprised if there were a big difference between the two twins, one that turns up more often than you'd think among identical twins, one that may have motivated their getting a genetic test to see if they really are identical. (Or I might just be reading slightly more into the various self-dramatizing accounts by the Harbaugh twins than is really there.)

Barry Harbaugh did a lot more research on this question than I did, and came to an even stronger conclusion:
Fraternal twins rarely, if ever, think themselves as alike as two peas. Far more often, misdiagnosis occurs when identical twins think themselves unalike—"of a family likeness only." (In other words, fraternal twins don't question their zygosity; it's the identical ones who get confused). You may have heard that Mary Kate and Ashley Olsen—twins so alike that they have shared professional lives since they were 6 months old, most notably by pretending to be each other without any of their millions of fans noticing—have long declared themselves fraternal. What can one say other than: Bahooey.
As I waited for my own results to come through, I contacted every lab I could find that does this sort of commercial twin testing. I wanted to find stories of identical pairs who had thought themselves fraternal, or of twins that had no idea either way. The search turned up one pair of girls who had grown up with a triplet brother, and couldn't believe the identical result they received: "I never did feel like I was looking at a reflection," one of the sisters wrote. "When the truth finally came out my mom was shocked. She was the mother, how could she get it wrong?" 
But when I asked for help finding a pair that had thought themselves identical, only to discover otherwise—and I shook the bushes for two weeks, badgering labs all over the country—I was met each and every time with silence. Affiliated Genetics, which has been testing for zygosity since 1994, didn't have a single adult twin on record who received a heterozygous result. Not one pair (remotely around my age) had ever tested as fraternal. It seemed nothing short of astonishing. Not one? Where are all the dizygotic wannabes, vainly dressing in matching overalls?

More broadly, something I've discovered over the years that applies to more than just twins is that people seldom appreciate having you point out to them that they look like some famous person. (Sure, 18-year-old girls like to be compared to 21-year-old Victoria's Secret models, but that's about as far as it goes.)

Maybe somebody could put up with being told he looks like the great quarterback Tom Brady, but if you told a guy who looks like the great quarterback Peyton Manning that he looks kind of like Peyton Manning, he'd figure you were making fun of him. Moreover, it would never have occurred to him that he looks like Peyton Manning.

For example, I doubt if Karl Rove would like having it pointed out to him that he looks like a skinnier Lou Pearlman, the now-imprisoned boy band impresario, gay molester, and Ponzi scheme runner.

But, Lou would probably also hate having you mention to him that he looks like Karl. If you told Lou Pearlman through the bullet proof glass of the visiting room at his penitentiary that he kind of reminds you of Karl Rove, it would likely be the worst thing that happened to him all day.

Everybody is a special snowflake in their own minds.

August 24, 2011

J-1 visas: Not doing jobs Americans would do

Jennifer Gordon writes in an op-ed in the NYT about that amusing strike at the Hershey factory by foreign students who thought they were signing up for fun in the sun and got stuck lifting cases of candy on the late shift in Nowheresville, PA:
The J-1 visa, also known as the exchange visitor visa, has its roots in the cold war. In 1961, Congress created a program for international students and professionals to travel here, with the goal of building good will for the United States in the fight against Communism. The program, which became the J-1 visa, thrives today — but not as Congress intended. 
Instead, it has become the country’s largest guest worker program. Its “summer work travel” component recruits well over 100,000 international students a year to do menial jobs at dairy farms, resorts and factories — a privilege for which the Hershey’s students shelled out between $3,000 and $6,000. They received $8 an hour, but after fees and deductions, including overpriced rent for crowded housing, they netted between $1 and $3.50 an hour. Hershey’s once had its own unionized workers packing its candy bars, starting at $18 to $30 an hour. Now the company outsources distribution to a non-union company that hires most of its workers from the J-1 program. 
The Pennsylvania workers are not alone. Recent exposés by journalists and advocates have found similar abuse of J-1 visa holders at fast food restaurants, amusement parks and even strip clubs. 
Though the number of J-1 visa holders admitted to the United States swelled from 28,000 in the program’s first year to more than 350,000 in 2010, and the government made minor changes to the program earlier this year, the State Department has never established a sufficient oversight system. Instead, it hands that responsibility to organizations it designates as sponsors, who profit from the arrangement and so have no incentive to report abuses.

Why, after three years of high unemployment with no end in sight, hasn't this program been suspended for the duration of the slump? 

There are a whole bunch of unemployed American citizens with two-digit IQs who wouldn't mind these jobs (although not at $1.00 to $3.50 per hour), but Two Digit Americans are almost completely unrepresented by anybody in modern American life. When I hear sophisticated folk go on and on about how they are "cosmopolitan" and thus don't care about invisible lines on the ground, or about how, as Bill Clinton said on 9/10/2001, "the world will be a better place if all borders are eliminated," I realize that, in practice, what all that high talk comes down to is somebody who is good with an Excel spreadsheet getting the better of a bunch of his fellow citizens who aren't.

Twins switched at birth

Twin expert Nancy L. Segal has a new book, Someone Else's Twin: The True Story of Twins Switched at Birth, about a pair of identical twins who were mixed up at the hospital: one went home with a lady who had just given birth to a singleton, while the mother of the twins took home one of her babies and the other lady's baby. The identical twins ran into each other in a shop 28 years later. Drama ensued.

Segal also has a book coming out in 2012 summing up the famous Minnesota Twins study of separated twins, for which she was one of the researchers.

"Double Inanity"

In Slate, Brian Palmer denounces 135 years of twin studies in "Double Inanity:"
The idea of using twins to study the heritability of traits was the brainchild of the 19th-century British intellectual Sir Francis Galton. He's not exactly the progenitor you might want for your scientific methods. Galton coined the term "eugenics" and was the inspiration for the push to manipulate human evolution through selective breeding. ...

Over the last few decades, Galton's older half-cousin Charles Darwin has been promoted from secular sainthood to his current role as the Jesus of Atheists. But the rise of Darwinism in prestige has not been an unmitigated blessing to the world, so Galton has come to play the role of scapegoat, or devil. Since Darwin, the secular redeemer, is, by definition, above sin, all sins associated with Darwinism must be the fault of the designated devil, Galton.

It's a very odd phenomenon, since the two kinsmen would otherwise seem so objectively similar, both by nature (both were grandsons of the famous Erasmus Darwin, who propagated a theory of evolution in the late 18th century) and by mutual nurture. The younger man vastly admired the older man, and the elder came to be highly impressed with the younger.
Nearly five decades after Galton published "The History of Twins"—and more than 10 years after the word "gene" entered the lexicon—researchers in the 1920s "perfected" Galton's methods by comparing identical and fraternal twins and inferring heritability from the differences between the two. The twin study today is based on the same assumptions that were made back then. (As you may be aware, a lot has changed in the field of genetics over that time.) And despite numerous indications that these assumptions are deeply flawed, researchers continue to crank out new papers, probably in response to a public demand—both insatiable and inexplicable—for evidence that we're just like our parents. (If only Freud were alive today.) ... 
Twin studies rest on two fundamental assumptions: 1) Monozygotic twins are genetically identical, and 2) the world treats monozygotic and dizygotic twins equivalently (the so-called "equal environments assumption"). The first is demonstrably and absolutely untrue, while the second has never been proven. 
That identical twins do not, in fact, have identical DNA has been known for some time. The most well-studied difference between monozygotic twins derives from a genetic phenomenon known as copy number variations. Certain, lengthy strands of nucleotides appear more than once in the genome, and the frequency of these repetitions can vary from one twin to another. By some estimates, copy number variations compose nearly 30 percent of a person's genetic code. 
These repeats matter. More than 40 percent of the known copy number variations involve genes that affect human development, and there are strong indications they explain observed differences between monozygotic twins. For example, it's often the case that one identical twin will end up victimized by a genetically based disease like Parkinson's while the other does not. This is probably the result of variations in the number of copies of a certain piece of DNA. Copy number variations are also thought to play a role in autism spectrum disorder, schizophrenia, and ADHD, all of which can appear in only one member of a monozygotic twin pair (PDF). If copy number variations can affect discrete and diagnosable disorders, then why shouldn't they influence far more complex behaviors like your inclination to head to the polls on a Tuesday night in November? 
That's just the beginning of the genetic differences between monozygotic twins. As a result of mutations during development, about one in 10 human brain cells has more or less than the typical two copies of a chromosome. Identical twins also have different mitochondrial DNA, the genetic information stored in the cellular organelle responsible for processing glucose. Research suggests that mitochondrial DNA affects brain size among a host of other neurological traits.
Twin studies also rely on the false assumption that genetics are constant throughout one's lifetime. Mutations and environmental factors cause measurable changes to the genome as life progresses. Charney cites the example of exercise, which can accelerate the formation of new neurons and potentially increase genetic variation among individual brain cells. By the time a pair of twins reaches middle age, it's very difficult to make any assumptions whatsoever about the similarity of their genes.

To his credit, Matthew Yglesias smells a rat:
That doesn’t seem to me to follow. It’s still the case that identical twins are more genetically similar than other kinds of siblings. So if we have a study showing that identical twins are systematically more similar in some respect than non-identical twins, we’re still in possession of evidence about the influence of genetic similarity on behavioral similarity. 

One of his commenters gets to the heart of the problem.
I'd go further. The case where identical twins are truly identical is a floor, not a ceiling. If identical twins are so similar despite some "copy number" and "mtDNA" differences, well, think how much more they'd be similar if you took even those additional genetic differences away. Unless the guy thinks these residual differences are anti-correlated (instead of being uncorrelated) with any differences in environmental inputs - a fairly bizarre supposition - what he's really saying is that twin studies *understate* the impact of genes.

Palmer goes on:
The equal environments assumption is similarly questionable. As anyone who's ever seen a pair of toddler twins in matching sailor suits is surely aware, monozygotic twins do get special treatment. They are more likely than their dizygotic peers to be treated as "two of a kind" by family, friends, and teachers, which must increase their chances of developing similar behaviors. There have been numerous studies showing that dizygotic twins who look similar have more personality traits in common than those who are easily distinguishable.

First, that's why the holy grail of twin research is the study of identical twins raised apart. Such pairs are hard to find, but very interesting. Second, there's the good-enough-for-government-work angle: if the question is whether we can produce equality of results by social-engineering equality of inputs, well, children of the same age raised in the same home at the same time are about as equal as we can expect any government program to get the environment.
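For readers curious about the arithmetic Palmer is attacking: the classical twin method boils down to Falconer's formula, which, under the equal-environments assumption, estimates heritability as twice the gap between identical-twin and fraternal-twin correlations. Here's a minimal sketch; the correlations in the example are hypothetical, chosen only for illustration, not taken from any study quoted above.

```python
def falconer(r_mz, r_dz):
    """Partition trait variance under the classical twin model.

    r_mz: correlation of the trait between monozygotic (identical) twins
    r_dz: correlation between dizygotic (fraternal) twins

    Returns (h2, c2, e2): additive-genetic, shared-environment, and
    unique-environment shares of variance. Assumes the equal-environments
    assumption that Palmer disputes.
    """
    h2 = 2 * (r_mz - r_dz)   # heritability: MZ twins share ~2x the genes DZ do
    c2 = 2 * r_dz - r_mz     # shared (family) environment
    e2 = 1 - r_mz            # unique environment plus measurement error
    return h2, c2, e2

# Hypothetical example: MZ twins correlate 0.85 on some trait, DZ twins 0.55.
h2, c2, e2 = falconer(0.85, 0.55)
print(round(h2, 2), round(c2, 2), round(e2, 2))  # roughly 0.6, 0.25, 0.15
```

Note that the commenter's point above drops straight out of the formula: if "identical" twins are actually slightly less than genetically identical, the measured r_mz is pulled down, so the estimated h2 is, if anything, an underestimate.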

Part 2: Is he really gay?

Last week in Taki's Magazine, in the first of a two-part series, I recounted (in slightly amped-up form) a conversation I've gotten into not infrequently over the years:
Him: Hey, you’ve heard of Mr. Big Name [a world-famous icon of masculinity], right? 
Me: Sure. Who hasn’t? 
Him: Well, he’s gay. 
Me: Really? That’s interesting. 
Him: Yeah, my buddy Al, who was a stuntman on a bunch of his early movies, told me. 
Me: But isn’t he married to that supermodel? 
Him: It’s just a front. 
Me: And don’t they have several kids who look just like him? 
Him: Everybody in Hollywood knew he was gay way back when. 
Me: And wasn’t his wife threatening to divorce him a couple of years ago because he was sleeping with his leading lady? 
Him: His publicist must have made it up. 
Me: And didn’t his ex-wife sue him to get his child-support payments raised to $100,000 per month? And didn’t he almost ruin his career by insisting that his crazy Danish girlfriend be cast opposite him in all his movies? And didn’t he hire private eye Anthony Pellicano to wiretap those script girls who had filed paternity and sexual-harassment suits against him? And doesn’t he maintain a secret second family in Bakersfield? 
Him: He’s gay as a French horn.

So, this week in Taki's, I'm back with the rest of the possible explanations for this curious state of affairs.

The secret languages of twins

Jon Lackman writes in Slate:
In rare cases, however, children do develop an entire language of their own, and amazingly, all full-blown twin languages spontaneously develop the same structure, regardless of the language spoken at home. Aarhus University linguist Peter Bakker told me that twin-language structure is unlike that of any established language, and its syntax doesn't simply reflect the usual mistakes made by children. (Deaf children not taught sign language who invent their own also use this structure, by the way.) This "gives us a potential insight into the nature of language," Bakker said, into mankind's "first language," now lost to history. 
Twin languages are simple, just as simple as necessary, one might say. For one, they freely mix subjects, verbs, and objects, putting the most important item first in any context. In an Estonian study a child said, in his private language, "Again I foyer toward write come." (Estonian grammar would have dictated, "I come again to the foyer to write.") Negation appears at the sentence's beginning or end, regardless of where it appears in the native language. Thus one Swiss child said, "Bobby, here drive no!" instead of, "Bobby, don't drive here!" Verbs aren't conjugated. There's no way to locate things or events in time and space. And finally, twin languages almost never use pronouns, just proper names. Language can get simpler than this, but not much. ...
If language originated between just two people, it might well have looked like this: The seemingly universal twin-language structure is blissfully easy to use in one-on-one conversation. However, that first language would have had to evolve quickly to be useful to a larger community. Societies need "unambiguous ways to distinguish between subject and object," Bakker says. "In the twin situation these can be dispensed with, but not in languages in which it is necessary to refer to events outside the direct situation." ... 
Linguist Bernard Comrie at UC-Santa Barbara cautions that research into the birth of language is still in its infancy. "First we were told that creole languages would provide us with insight into 'first language,' then when that didn't pan out interest shifted to deaf sign language (also with mixed results)—I guess twin language will be the next thing," he wrote me. Twin language is particularly difficult to test because children give it up quickly, except when they are very isolated. And you can't just isolate kids on purpose—not anymore, anyway. Gone are the days when the pharaoh Psammetichus I could send two infants off to be raised by goats, or the Holy Roman Emperor Frederick II could forbid children's caregivers from speaking to them.

Project Nim, the 1970s attempt to teach the chimpanzee Nim Chimpsky to use sign language to disprove Noam Chomsky's ideas about what Steven Pinker calls the "language instinct," was in the tradition of Psammetichus and Frederick. It's an interesting question.

Lackman recalls a 1981 movie in which a primitive tribe has only one phrase: "It will be mine." That's not too far off from many of the things Nim had to say for himself, such as:
“Me banana you banana me you give.”
“You me banana me banana you give.”
“Banana me me me eat."

August 23, 2011

Libya

Libyan rebels fired at forces loyal to Qaddafi during fierce fighting in downtown Tripoli on Monday. - NYT

Can you actually hit anything firing a gun from above your head? Is the fighting really that "fierce" if you can't be bothered to get behind the car right next to you and, you know, aim?
Seif al-Islam el-Qaddafi, whose capture the rebels had trumpeted since Sunday, walked as a free man to the Qaddafi-controlled luxury Rixos Hotel in the center of Tripoli early Tuesday, boasting to foreign journalists there that his father’s government was still “in control” and had lured the rebels into a trap, the BBC and news services reported. 

That's quite a strategy Col. Qatthafi has come up with -- luring the enemy into the downtown of your capital. Amazing nobody has ever thought of that ploy before.
His appearance raised significant questions about the credibility of rebel leaders.

I'm shocked to hear of doubts about the credibility of anybody involved in this.

At the moment, whatever is going on in Tripoli is a confusing mess. But my prediction all along has been that once Obama started the "no-fly zone," he'd keep dropping bombs until Col. Gaddafi is gone. For example, I wrote on March 25:
Yet, the bottom line about what will happen isn't really all that confusing. What matters most is that Obama has an election coming up in 19 months. He can't afford to go into the campaign known as The President of the United States Who Started a War with Muammar Gaddafi and Failed to Win. ... 
I'm not saying that Obama had this all figured out from the moment he agreed to start the war or that he's even figured it out after a week, but it will eventually dawn on him that his alternatives are now: 
1) Lose to Crazy America-Hating Terrorist Moamar Khadaffy, or
2) Drop More Bombs. 
So he will choose what's behind Door #2. 
Of course, after Qadafi is gone, a whole bunch more stuff will happen in Libya, but, seriously, who cares? How much does Obama care about Libya versus how much he cares about his fabulous career?