January 10, 2009

Build, baby, build!

The idea of Obama having a blank check to spend billions for two years on "infrastructure" has liberals fantasizing over all the SWPL projects they'd like to build, such as solar-powered magnetic levitation trains. The problem, of course, is that liberals have spent the last 40 years making building anything in the coastal regions of America an extremely slow process, so it's implausible for them to argue that their favorite projects could have a non-negligible "stimulus" effect.

So, that's leading liberals to demand that we cut the red tape holding back construction. It's time to deregulate. Bulldoze, baby, bulldoze!

You see, all those environmental regulations are supposed to stymie bad people. But, we liberals are good people, so the laws shouldn't apply to us.

My published articles are archived at iSteve.com -- Steve Sailer

What Bush regrets

The AP reports:

President George W. Bush told a group of Texas reporters Friday that he regretted immigration policies were not reformed while he was in office.

"I'm very disappointed that it didn't pass," he said in an interview with correspondents from his home state. "I'm very worried about the message that said, 'Republicans are anti-immigrant.'"

Bush said he wanted a comprehensive immigration plan "not for political standing or for Latinos, but because it was best for the country," the Houston Chronicle reported in its online edition Friday.

The outgoing president said that in hindsight he should have pushed his immigration proposal soon after the 2004 election, rather than after partisan squabbling over Social Security began.

Yeah, sure, that was the problem -- partisan squabbling, rather than the public hating your plan.

My published articles are archived at iSteve.com -- Steve Sailer

Steven Pinker gets his genome tested ...

... and discovers he has the Bald Gene.

In a long article in the New York Times Magazine, "My Genome, My Self," the author of The Blank Slate recounts all that he has learned about himself from having his genome sampled, which turns out to be unsurprisingly modest.

The most prominent finding of behavioral genetics has been summarized by the psychologist Eric Turkheimer: “The nature-nurture debate is over. . . . All human behavioral traits are heritable.” By this he meant that a substantial fraction of the variation among individuals within a culture can be linked to variation in their genes. Whether you measure intelligence or personality, religiosity or political orientation, television watching or cigarette smoking, the outcome is the same. Identical twins (who share all their genes) are more similar than fraternal twins (who share half their genes that vary among people). Biological siblings (who share half those genes too) are more similar than adopted siblings (who share no more genes than do strangers). And identical twins separated at birth and raised in different adoptive homes (who share their genes but not their environments) are uncannily similar.
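
To put a number on the twin logic: Falconer's classic approximation estimates heritability as twice the gap between identical-twin and fraternal-twin correlations. Here is a minimal Python sketch; the correlation values are round illustrative numbers, not from any particular study.

    # Falconer's approximation: split trait variance into additive genes
    # (h2), shared family environment (c2), and everything else (e2),
    # using the correlations of identical (MZ) and fraternal (DZ) twins.
    def falconer(r_mz, r_dz):
        h2 = 2.0 * (r_mz - r_dz)   # heritability
        c2 = r_mz - h2             # shared (family) environment
        e2 = 1.0 - r_mz            # unshared environment + measurement error
        return h2, c2, e2

    # Round illustrative correlations, not from any particular study:
    h2, c2, e2 = falconer(r_mz=0.75, r_dz=0.45)
    print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")
    # -> h2 = 0.60, c2 = 0.15, e2 = 0.25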

Behavioral geneticists like Turkheimer are quick to add that many of the differences among people cannot be attributed to their genes.

Identical twins raised apart tend to be almost as similar as identical twins raised together, although part of the reason is that identical twins raised apart don't feel the need to distinguish themselves from their twin by developing something unique about themselves. Horace Grant, the skinny power forward on Michael Jordan's first three championship teams, would probably have become a quick forward if he hadn't grown up playing on youth teams alongside his identical twin Harvey Grant, who became an NBA shooting forward, while Horace was given the role of rebounding forward.

I know two pairs of adult identical twins, the Brimelows and the Woodhills, and the personal affect varies mildly between the Brimelows and moderately between the Woodhills.

Back to Pinker, who has been explaining why natural selection hasn't scoured away genetic variation. One reason is balancing selection, in which different trait levels pay off in different circumstances; the other:

But not all variation in nature arises from balancing selection. The other reason that genetic variation can persist is that rust never sleeps: new mutations creep into the genome faster than natural selection can weed them out. At any given moment, the population is laden with a portfolio of recent mutations, each of whose days are numbered. This Sisyphean struggle between selection and mutation is common with traits that depend on many genes, because there are so many things that can go wrong.
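
Pinker's "rust never sleeps" standoff has a standard back-of-the-envelope form: at mutation-selection balance, a deleterious allele hovers around an equilibrium frequency of roughly mu/s, or the square root of mu/s if it's fully recessive. A quick sketch, with both rates invented for illustration:

    import math

    # Textbook mutation-selection balance; both rates are invented:
    mu = 1e-5   # per-generation mutation rate toward the harmful allele
    s = 0.01    # selection coefficient against it

    q_additive = mu / s              # allele visible to selection
    q_recessive = math.sqrt(mu / s)  # allele hidden in heterozygotes
    print(f"equilibrium frequency, additive:  {q_additive:.4f}")   # 0.0010
    print(f"equilibrium frequency, recessive: {q_recessive:.4f}")  # 0.0316
    # Selection never wins outright; it just holds each bad allele at a
    # low equilibrium, so the population always carries a mutation load.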

Penke, Denissen and Miller argue that a mutation-selection standoff is the explanation for why we differ in intelligence. Unlike personality, where it takes all kinds to make a world, with intelligence, smarter is simply better, so balancing selection is unlikely.

Is smarter simply better? If it takes, say, bigger brains, the answer isn't terribly clear. Analogously, Intel assumed for about 15 years that CPU chips with faster clock speeds were simply better. But the struggle to break the 4.0 gigahertz barrier proved overwhelming, so Intel gave up and went in different directions; in recent years most chips sold seem to run between 2.0 and 3.0 gigahertz, although performance keeps improving.

Keep in mind that Intel has big advantages over natural selection in getting from one performance peak to another. For example, if Intel decides that its strategy of single chips with ever faster clock speeds is heading toward a dead end, it can simultaneously start an R&D project for dual-core and quad-core chips with moderate clock speeds. At first, the new type of CPU won't be as good as the old type, but Intel doesn't have to sell the beta versions of the changed design. It can keep making them and throwing them away until the new style chips are as good as the competition's old style chips.

In contrast, natural selection doesn't provide much of a laboratory in which to putter around, working the kinks out of your next model, while your factory keeps churning out the satisfactory current model.

Similarly, bigger brains require more food. They make you more likely to tip over and hurt yourself. They require your mother to have a wider pelvis so she won't die in childbirth, which makes her a slower runner.

But intelligence depends on a large network of brain areas, and it thrives in a body that is properly nourished and free of diseases and defects. Many genes are engaged in keeping this system going, and so there are many genes that, when mutated, can make us a little bit stupider.

At the same time there aren’t many mutations that can make us a whole lot smarter. Mutations in general are far more likely to be harmful than helpful, and the large, helpful ones were low-hanging fruit that were picked long ago in our evolutionary history and entrenched in the species. One reason for this can be explained with an analogy inspired by the mathematician Ronald Fisher. A large twist of a focusing knob has some chance of bringing a microscope into better focus when it is far from the best setting. But as the barrel gets closer to the target, smaller and smaller tweaks are needed to bring any further improvement.
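
Fisher's focusing-knob analogy can be made exact. In his geometric model, a random mutation of size r, in an organism n traits away from an optimum at distance d, is beneficial with probability about 1 - Phi(r*sqrt(n)/(2d)), where Phi is the standard normal CDF. A sketch, with all parameter values invented for illustration:

    import math

    # Fisher's geometric model: probability that a random mutation of
    # size r is beneficial, for an organism n traits away from an optimum
    # at distance d. P = 1 - Phi(x), with x = r*sqrt(n)/(2*d).
    # Parameter values below are invented for illustration.
    def p_beneficial(r, n=100, d=1.0):
        x = r * math.sqrt(n) / (2.0 * d)
        return 0.5 * math.erfc(x / math.sqrt(2.0))  # = 1 - normal CDF

    for r in (0.01, 0.1, 0.5, 1.0):
        print(f"mutation size {r:4}: P(beneficial) = {p_beneficial(r):.2e}")
    # Tiny tweaks are nearly 50/50 (~0.48); a big twist of the knob
    # (r = 1.0) improves matters only about once in a few million tries.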

The Penke/Denissen/Miller theory, which attributes variation in personality and intelligence to different evolutionary processes, is consistent with what we have learned so far about the genes for those two kinds of traits. The search for I.Q. genes calls to mind the cartoon in which a scientist with a smoldering test tube asks a colleague, “What’s the opposite of Eureka?” Though we know that genes for intelligence must exist, each is likely to be small in effect, found in only a few people, or both. In a recent study of 6,000 children, the gene with the biggest effect accounted for less than one-quarter of an I.Q. point. The quest for genes that underlie major disorders of cognition, like autism and schizophrenia, has been almost as frustrating. Both conditions are highly heritable, yet no one has identified genes that cause either condition across a wide range of people. Perhaps this is what we should expect for a high-maintenance trait like human cognition, which is vulnerable to many mutations.

The hunt for personality genes, though not yet Nobel-worthy, has had better fortunes. Several associations have been found between personality traits and genes that govern the breakdown, recycling or detection of neurotransmitters (the molecules that seep from neuron to neuron) in the brain systems underlying mood and motivation....

But it seems even less plausible to say that more or less of any major psychological trait is "simply better." We may have our subjective preferences, but the major personality traits are likely to be ones on which normal variation doesn't much change average Darwinian fitness over the generations.

Even if personal genomics someday delivers a detailed printout of psychological traits, it will probably not change everything, or even most things. It will give us deeper insight about the biological causes of individuality, and it may narrow the guesswork in assessing individual cases. But the issues about self and society that it brings into focus have always been with us. We have always known that people are liable, to varying degrees, to antisocial temptations and weakness of the will. We have always known that people should be encouraged to develop the parts of themselves that they can (“a man’s reach should exceed his grasp”) but that it’s foolish to expect that anyone can accomplish anything (“a man has got to know his limitations”). And we know that holding people responsible for their behavior will make it more likely that they behave responsibly. “My genes made me do it” is no better an excuse than “We’re depraved on account of we’re deprived.”

Many of the dystopian fears raised by personal genomics are simply out of touch with the complex and probabilistic nature of genes. Forget about the hyperparents who want to implant math genes in their unborn children, the “Gattaca” corporations that scan people’s DNA to assign them to castes, the employers or suitors who hack into your genome to find out what kind of worker or spouse you’d make. Let them try; they’d be wasting their time.

The real-life examples are almost as futile. When the connection between the ACTN3 gene and muscle type was discovered, parents and coaches started swabbing the cheeks of children so they could steer the ones with the fast-twitch variant into sprinting and football. Carl Foster, one of the scientists who uncovered the association, had a better idea: “Just line them up with their classmates for a race and see which ones are the fastest.” Good advice. The test for a gene can identify one of the contributors to a trait. A measurement of the trait itself will identify all of them: the other genes (many or few, discovered or undiscovered, understood or not understood), the way they interact, the effects of the environment and the child’s unique history of developmental quirks.

Well said.

On the other hand, as futile as individual genomics is likely to prove relative to current expectations, the Law of Large Numbers suggests that racial genomics is likely to prove more fertile.

My published articles are archived at iSteve.com -- Steve Sailer

Dennis Dale on "Rachel Getting Married"

Mining my Comments section, here's Dennis Dale of Untethered's take on the critically lauded Jonathan Demme movie starring Anne Hathaway, which ends with an interminable wedding celebration featuring various world music acts:

Dennis Dale said...

"Rachel" is an expression of the self-consciousness that is now central to the soluble identity of a certain cultural set, affluent and influential (affluential?) liberal white Americans.

Here we have the most vital expression of any given culture, the marriage ceremony, and it is imagined as a palimpsest featuring any culture but one's own. The effect is not broadening, as they imagine, but deadening.

Toward the end of this interminable mess (how much longer is bad lighting and hand-held cinematography going to pass for naturalistic authenticity with our hopeless critical class?), during the wedding party, after we've been subjected to this cavalcade of pretentious multicultural references, I think they trotted out some mardi gras dancers. I refuse to believe they're not joking.

Think of two classic films--The Godfather and The Deer Hunter--and the remarkable wedding scenes that anchor their first acts, defining a community in a given time and place, and compare them to this film, which expresses nothing so much as the profound lack of confidence that these people brandish as proof of their moral superiority.

Compare those exuberant scenes to the prolonged shambling of this (Robyn Hitchcock and his friggin' lute?! Are you kidding me?!).

And all the scrupulous inclusion only adds up to condescension in the end. Note the black people in the film--smiling caricatures, for all--or because of--the effort to portray them counter-stereotype. The groom is so well-mannered and mild that he's barely there. And don't get me started on Bill Irwin's interpretation of a kind father as an oozing nipple.

But Anon is wrong--Anne Hathaway rules, freakishly oversized eyes, scrawny neck and all. She superbly portrays youthful self-destruction. They should have done it right--there's no need for a dead sibling back story. People set out to self-destruct for no good reason all the time--and that would have been a fitting synecdoche for the setting of the story, a people self-destructing for no good reason.

My published articles are archived at iSteve.com -- Steve Sailer

Neanderthals

Here's another outtake from the upcoming book The 10,000 Year Explosion: How Civilization Accelerated Human Evolution by Greg Cochran and Henry Harpending. If the stuff that wasn't good enough to make the book is this good, how good is the actual book going to be?

The first Neanderthal skeleton recognized as such was found in a limestone quarry in the Neander Valley in Germany in 1856. At first, this rather odd skeleton was thought to be that of some medieval guy crippled by arthritis (or a Celt, or a diseased Cossack): it was only identified as a representative of an extinct type of human somewhat later. This other human race was named after the site: spelling reform later changed its name to Neandertal and eventually most paleontologists followed, driven by obscure interdepartmental struggles. We’re sticking with ‘Neanderthal’, though: an archaic spelling seems only appropriate for a vanished species.

We know quite a bit about the Neanderthals, more in fact than we know about our anatomically modern African ancestors of that period. In part this is because physical conditions in Western Europe favored preservation, so that there really are more fossil remains. In addition, a high level of general education meant that farmers and quarrymen were more likely to call in a professor when they found cave paintings or an odd-looking skeleton: that is, more likely in nineteenth-century Germany or France than in nineteenth-century Africa or China. There were plenty of professors close by, since Neanderthals left their remains conveniently close to famous universities and five-star restaurants. Many Neanderthal skeletons, sites, and artifacts have been found – remains from a few hundred individuals, in sharp contrast to the handful of known African human fossils from the same period.

The Neanderthals had big brains (averaging about 1500 cubic centimeters, noticeably larger than those of modern people) and a technology like that of their anatomically modern contemporaries in Africa, but were quite different in a number of ways: different physically, but also socially and ecologically. Neanderthals were cold-adapted, with relatively short arms and legs in order to reduce heat loss - something like Arctic peoples today, only much more so. Considering that the climate the Neanderthals experienced was considerably milder than the high Arctic (more like Wisconsin), their pronounced cold adaptation suggests that they may have relied more on physical than cultural changes. Of course they spent at least six times as many generations in the cold as any modern human population has, and that may have had something to do with it as well.

We don’t yet know for sure, but it seems likely that, as part of their adaptation to cold, Neanderthals were furry. Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mother’s fur as infants. Modern humans don’t have these ridges, but Neanderthals do. Moreover, we know that humans can become furry with very simple genetic changes, since there are a few people working in the circus in whom such a change has already taken place.

They were chinless and had big honking noses, heavy brow ridges, a pulled-forward face, and long, low skulls that tended to bulge outwards at the sides. The body form differences are seen in children – and so they were innate, rather than being a consequence of their way of life. They were heavily built and muscular, judging from their skeletons, which had larger areas of muscle attachment than those seen in people today or in their contemporary African cousins. This means that they were stronger than us, probably much stronger. You could think of them as born wrestlers.

Those conclusions about Neanderthal physique come from studying fossils, but modern methods have allowed researchers to draw less obvious conclusions as well. ...

Judging from the isotopic composition of their remains, Neanderthals were meat-eaters, pure top carnivores, comparable to lions or wolves. This is very different from most contemporary hunter-gatherers, who usually depend more on plant foods than meat.

It also seems that they ate almost no fish, which is somewhat surprising, considering that a number of the Neanderthal sites have been along rivers with strong salmon runs.

Like other top carnivores, Neanderthals were thin on the ground. The typical Neanderthal site has relatively more remains of cave bears than later sites occupied by modern humans, which suggests that Neanderthals were scarcer than later humans. We think that there may have been as few as 30,000 of them in all of Europe.

Neanderthal sites are generally found in caves and beneath rock overhangs, which functioned as shelters and also tended to preserve their remains. We have found signs of open-air camps as well, but any structures built seem to have been very simple. Such camps may have been common, but were far less likely to be preserved than sites in caves and rock shelters: as so often in archaeology, differences in preservation can completely obscure the original distribution of things.

Neanderthals used rather sophisticated stone tools, but used almost no bone, ivory, or shell. They had fewer types of tools than their successors. Their “Mousterian” tools (named for the southwestern French site Le Moustier) consisted of carefully shaped flake tools and small hand axes, almost always made from local materials. We find Mousterian tools over vast areas of Europe and western Asia, but there is little variation in space or time, which supports the general impression that their capacity to innovate was low. François Bordes, a famous French archaeologist, said their technology consisted of “beautiful tools made stupidly” - by rote, or perhaps by instinct.

There are indications that they often used those stone tools to make things out of wood, but only a few such wooden objects have been preserved. We find awls that could pierce animal hides, but no bone needles with eyes, which their successors used to make tailored clothing. There is evidence that they processed hides, presumably for clothing, but those hides must have been used for blankets or ponchos rather than coats or parkas. That kind of clothing was good enough to let them survive in Ice Age Europe, but evidently not good enough to allow the Neanderthals to settle the high Arctic: ultimately this also kept them out of the Americas. Their front teeth show an unusual pattern of wear – apparently they were used as a third hand or vise, or perhaps to prepare animal skins. Somewhat similar patterns of wear have been observed in peoples like Eskimos who prepare hides by chewing.

Neanderthals had fire, and probably could not have settled ice-age Europe without it. However, they didn’t do anything fancy with it: there were no specialized hearths and there is no evidence that they used lamps.

Neanderthals are the first humans known to have buried their dead, but there is no clear evidence of ceremony or ritual in those burials. We don’t find weapons or decorative objects associated with those graves as we often do with the graves of modern humans. It may be that burial was for them more a way of disposing of unpleasant remains than a ritual occasion. It may have been more like flushing a goldfish down the toilet.

We know that Neanderthals hunted big game (red deer, European bison, sometimes mammoths and rhinos) and took big risks in the process, judging from their many healed fractures. The injury pattern is like that of bronco riders, as documented by Eric Trinkaus. At root, these high risks were a consequence of their lack of projectile weapons, which allow hunters to bring down big game without getting dangerously close – that and a lack of any other safe way of making a living. Neanderthals used stabbing spears and were probably ambush hunters. Using this strategy, they had to get up close and personal with desperate animals that outweighed them several fold, a good way to get hurt.

Since they were so often injured, they had to help each other. That's the only way in which they could have survived while recovering from those serious injuries. You see a similar pattern in some other cooperative-hunting species such as lions: injured members of the pride manage to feed off the kills of others while they recover. We see lots of healed injuries in saber-tooth tigers as well, whose hunting pattern of stabbing from ambush was rather similar to that of Neanderthals. Come to think of it, their heavy, almost bear-like build (compared to that of the other big cats) was also similar. In some cases, Neanderthals carried this cooperation very far, providing care that allowed permanently crippled individuals to reach advanced ages. The most famous example of this is the skeleton of a forty-something man found in Shanidar, in northern Iraq. His right arm was withered and had suffered multiple fractures, while the lower arm and hand had been lost. He had a crippled and withered right leg. In addition, he had suffered a crushing blow to the face that likely left him blind in one eye. All of these were long-healed injuries. Clearly, this guy had been around the block – twice, on his face.

Groups of Neanderthals were able to kill big game, but that would have been far harder for individuals. As a member of the group, they also received the Paleolithic equivalent of health insurance, a necessity in their kind of high-risk hunting. Because of group efficiency in hunting and the high degree of within-group cooperation, membership in a Neanderthal band or tribe was valuable. It would have been almost impossible to survive outside such a group. As Bill Hamilton has said, such arrangements are vulnerable to free-riders, individuals that take advantage of the benefits but don’t pull their weight – which in this case would have meant avoiding hunting and its risks. A high degree of within-band relatedness would have mitigated this tendency – due to kin selection, the principle that behaviors that cost the individual but help close relatives can be favored by selection. This suggests that Neanderthal bands may have been reluctant to accept outsiders, especially males, if we assume that males had the primary responsibility for hunting. Unrelated newbies would have had the most to gain by shirking dangerous duties.

Hamilton pointed out that, among social carnivores, the would-be immigrant often has to go through a difficult probationary period without necessarily succeeding – we see a similar pattern in some recent hunter-gatherers. There would have to have been a way in which new individuals could join the tribe, in order to avoid dangerous levels of inbreeding. It may be that only females changed bands, which is apparently the case in chimpanzees and is the most common pattern in humans.

Trends of this sort may have existed in anatomically modern humans as well, but the high risks associated with Neanderthals’ specialized big-game hunting may have taken things much further. We expect that they were more cooperative than our African ancestors and more clannish. This might have interfered with inter-band relations and made the development of trade and other inter-band social interactions more difficult.

If you take too many chances in the process of making a living, you'll get yourself killed before you manage to raise a family. Therefore there is a maximum sustainable risk per calorie acquired from hunting. If the average member of the species incurs too much risk, more than that sustainable maximum, the species goes extinct. The Neanderthals must have come closer to that red line than anatomically modern humans in Africa. Risks were particularly high because the Neanderthals seem to have had no way of storing food – they had no drying racks or storage pits in frozen ground like those used by their successors. Think of it this way: storage allows more complete usage of a large carcass such as a bison, which might weigh over a thousand pounds – it wouldn't be easy to eat all of that before it went bad. Higher utilization - using all of the buffalo - drops the risk per calorie.
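
The storage point is just arithmetic. Here's a toy calculation in which every number is invented for illustration:

    # Toy risk-per-calorie arithmetic; every number is invented:
    p_injury = 0.02           # chance of serious injury per bison hunt
    carcass_kcal = 400_000    # calories in one large carcass

    for utilization in (0.4, 1.0):   # gorge-and-waste vs. full storage
        usable = carcass_kcal * utilization
        print(f"utilization {utilization:.0%}: "
              f"{p_injury / usable * 1e6:.3f} injuries per million kcal")
    # Full utilization cuts risk per calorie 2.5-fold here; with a fixed
    # ceiling on sustainable risk, storage directly buys safety margin.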

Since women in Africa were probably gathering vegetable foods, men there didn't have to produce as much food or produce it as steadily, which meant that they could choose game animals that were safer but less abundant. And that's what they did in Africa's Middle Stone Age (MSA). They went after relatively uncommon but mild-mannered eland, rather than abundant, deadly dangerous Cape buffalo. And as a corollary, anatomically modern humans in Africa didn't have as heavy a build as Neanderthals - they didn't need it.

Although Neanderthals must have had very high levels of within-group cooperation, they were no angels: they had a weakness for long pork. As we said before, they may have experienced evolutionary pressures favoring clannishness or even hostility to outsiders. That's natural: reduced competition at one level allows more competition at a higher level. If the members of some ethny could 'all just get along', they could conquer the world, and likely would. We have found clear-cut evidence of cannibalism at several Neanderthal sites. At Krapina, every long Neanderthal bone found had been split open for marrow.

Like other early humans, Neanderthals were relatively uncreative; their tools changed very slowly and they show no signs of art, symbolism, or trade. Their brains were large and had grown larger over time, in parallel with humans in Africa, but we really have no idea what they did with them. Since brains are metabolically expensive, natural selection wouldn't have favored an increase in brain size unless it increased fitness, but we don't know what function those big brains served. Usually people explain that those big brains are not as impressive as they seem, since the brain-to-body weight ratio is what’s really important, and Neanderthals were heavier than modern humans of the same height.

You may wonder why we normalize brain size by body weight. We wonder as well.

Among less intelligent creatures, such as amphibians and reptiles, most of the brain is busy dealing with a flood of sensory data. You’d expect that brain size would have to increase with body size in some way in order to keep up. If you assume that the key is how much surface the animal has, in order to monitor what’s causing that nagging itch and control all the muscles needed for movement, brain size should scale as the 2/3rds power of weight. If an animal has a brain that’s bigger than predicted by that 2/3rds power scaling law, then maybe it’s smarter than average. That argument works reasonably well for a wide range of species, but it can’t make sense for animals with big brains. In particular it can’t make sense for primates, since in that case we know that most of the brain is used for purposes other than muscle control and immediate reaction to sensation. Look at it this way - if dividing brain volume by weight were a valid approach, Nero Wolfe would have to be really, really stupid.
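
To see what the 2/3rds-power correction does in practice, here is a sketch using Jerison's encephalization quotient (EQ), which divides brain mass by 0.12 times body mass to the 2/3rds power (masses in grams); the brain sizes are roughly as cited above, but the body masses are rough guesses, not measurements:

    # Jerison's encephalization quotient: brain mass relative to the
    # mammalian trend line of 0.12 * (body mass)**(2/3), masses in grams.
    def eq(brain_g, body_g):
        return brain_g / (0.12 * body_g ** (2.0 / 3.0))

    # Brain sizes roughly as cited above; body masses are rough guesses:
    for name, brain_g, body_g in [("modern human", 1350, 65_000),
                                  ("Neanderthal", 1500, 80_000)]:
        print(f"{name}: EQ ~ {eq(brain_g, body_g):.1f}")
    # -> roughly 7.0 vs. 6.7: under 2/3-power scaling the heavier build
    # absorbs much of the raw brain-size advantage, while naive division
    # by body weight would (absurdly) penalize it even more.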

We think that Neanderthal brains really were large, definitely larger than those of people today. This doesn’t necessarily mean that they were smarter, at least not as a culture. The archaeological record certainly indicates that they were not, since their material culture was definitely simpler than that of their successors. In fact, they may have been relatively unintelligent, even with their big brains. Although brain size certainly is correlated with intelligence in modern humans, it is not the only factor that affects intelligence. By the way, you may have read somewhere (The Mismeasure of Man) that brain volume has no relationship to intelligence, but that’s just a lie.

One paradoxical possibility is that Neanderthals lacked complex language and so had to be smart as individuals in order to learn their culture and technology, while that same lack severely limited their societal achievements. Complex language of the type we see in modern humans makes learning a lot easier: without it, learning to create even Mousterian tools may have been difficult. In that case, individuals would have to repeatedly re-invent the wheel (so to speak) while there would have been little societal progress.

It could also be that Neanderthal brains were less powerful than you’d expect because there just weren’t enough Neanderthals. That may sound obscure, but bear with us. The problem is that evolution is less efficient in small populations, in the same way that any statistical survey – polls, for example – becomes less accurate with fewer samples. Natural selection is pretty good at eliminating a defective gene when its disadvantage is significantly greater than the inverse of the population size. When the disadvantage is smaller than that, the defective gene has a reasonable probability of reaching high frequency by drift. It can even become universal in that population. This tendency is insignificant in large populations, but it can lead to problems in small ones, as more and more slightly deleterious mutations accumulate. There is a countervailing tendency – the generation of favorable mutations, which are likely to spread – but that tendency becomes weaker and weaker as the population becomes smaller. Thus, over the long term, a population that is too small is likely to go extinct for purely genetic reasons, if some other disaster doesn’t strike first. This is an issue that concerns conservationists who are trying to maintain endangered species such as the whooping crane or Florida panther.
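
That inverse-of-population-size threshold is easy to demonstrate with a toy Wright-Fisher simulation; the population sizes and selection coefficient below are arbitrary choices for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    def fixation_rate(N, s, trials=500):
        # Haploid Wright-Fisher: one new copy of an allele with fitness
        # 1 - s drifts until it is lost (k = 0) or fixed (k = N).
        fixed = 0
        for _ in range(trials):
            k = 1
            while 0 < k < N:
                q = k / N
                p = q * (1 - s) / (1 - s * q)   # frequency after selection
                k = rng.binomial(N, p)          # drift: resample N genes
            fixed += int(k == N)
        return fixed / trials

    for N in (50, 2000):
        print(f"N = {N}: deleterious allele (s = 0.005) fixed in "
              f"{fixation_rate(N, 0.005):.1%} of runs")
    # With N = 50, s is below 1/N, so the allele fixes at nearly the
    # neutral rate (~1/N = 2%); with N = 2000 it essentially never does.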

Neanderthals were not so rare as to risk extinction by genetic load. But the same argument has other implications. Even if a population is big enough for long-term survival, it may still suffer some genetic load. This would matter most for extremely complicated adaptations that relied on precise action of many genes: the more complicated the adaptation, the more vulnerable it would be to this kind of genetic sand in the gears. As it happens, the most complicated human adaptation is the brain, and one might expect that it would show the greatest vulnerability to such problems. There is some direct evidence of this, concerning a different kind of mutation load. Children whose parents are closely related, first cousins or closer, are significantly more likely to have two copies of deleterious recessive mutations – and their IQ is affected to a greater extent than other traits such as height. We think that the long-term effective population size of Neanderthals was less than that of anatomically modern humans, since Africa was less affected by the ice ages, which at their worst made most of Europe uninhabitable – so Neanderthals may have had more problems with genetic load. Because of this, Neanderthals may have had less efficient brains than their anatomically modern contemporaries or humans today.

Our favorite hypothesis is that Neanderthals and other archaic humans had a fundamentally different kind of learning than moderns. One of the enduring puzzles is the near-stasis of tool kits in early humans - as we have said before, the Acheulean hand-axe tradition lasted for almost a million years and extended from the Cape of Good Hope to Germany, while the Mousterian lasted for a quarter of a million years. Somehow these early humans were capable of transmitting a simple material culture for hundreds of thousands of years with little change. More information was transmitted to the next generation than in chimpanzees, but not as much as in modern humans. At the same time, that information was transmitted with surprisingly high accuracy. This must be the case, since random errors in transmission would have caused changes in those tool traditions, resulting in noticeable variation over space and time – which we do not see.
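
A toy copying model shows how demanding that fidelity requirement is; the copy-error rates and time span below are arbitrary:

    import random

    def copy_chain(generations, error_sd, start=15.0):
        # One lineage of toolmakers, each copying the last one's hand-axe
        # length (cm) with a small unbiased error each generation.
        length = start
        for _ in range(generations):
            length += random.gauss(0.0, error_sd)
        return length

    random.seed(1)
    gens = 10_000   # ~250,000 years of Mousterian at 25 years/generation
    for sd in (0.1, 0.001):
        ends = [copy_chain(gens, sd) for _ in range(5)]
        print(f"copy error sd = {sd} cm: "
              + ", ".join(f"{v:.1f}" for v in ends))
    # An unbiased random walk wanders by about sd * sqrt(gens): even a
    # 1 mm copying error per generation yields ~10 cm of drift, i.e.
    # hand axes of wildly different sizes across regions and millennia.
    # The observed stasis implies near-perfect fidelity, or a template
    # that isn't learned at all.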

It looks to us as if toolmaking in those populations was, to some extent, innate: genetically determined. Just as song birds are born with a rough genetic template that constrains what songs are learned, early humans may have been born with genetically determined behavioral tendencies that resulted in certain kinds of tools. Genetic transmission of that information has the characteristics required to explain this pattern of simple, near-static technology, since only a limited amount of information can be acquired through natural selection, while the information that is acquired is transmitted with very high accuracy.

My published articles are archived at iSteve.com -- Steve Sailer

"Appaloosa"

A movie likely to be overlooked in the Oscar nominations is the fall cowboy flick, "Appaloosa," but Viggo Mortensen deserves serious Best Supporting Actor consideration. Here's my review in The American Conservative:

The bald and square-jawed actor Ed Harris has played American heroes and psycho killers since first drawing notice as astronaut John Glenn in 1983's "The Right Stuff." He's now written and directed "Appaloosa," an amiable Western about masculine camaraderie and honor adapted from the book by Robert B. Parker, the genre novelist who created Spenser, the Boston private eye. "Appaloosa" furnishes Harris and Viggo Mortensen (the King in "The Return of the King") with plenty of wry lines for their portrayals of itinerant lawmen in the New Mexico of the 1880s.

Fish do not feel wet, we are told (although on what authority, I cannot say), and cowboys and Indians movies once felt no more awkward than cops and robbers films do today. Westerns were then less a genre than a natural, default mode. In the early 1970s, however, urban crime dramas, such as "The French Connection" and "The Godfather" replaced Westerns as the norm. The Western has since become a highly self-conscious genre, one almost immobilized by the weight of its pre-1970 cinema history.

As an actor, however, Harris appears unburdened by all the film school baggage the genre has accumulated. The straightforward "Appaloosa" provides two outstanding roles and sundry old-fashioned pleasures.

By churning out countless cowboy movies, Hollywood helped enshrine the idea that America was built by frontier settlers. The decline of the Western coincided with the rise in self-consciousness of the descendants of Ellis Island immigrants. By 1970, the grandchildren of Ellis Island wished to assert a new vision. America, their movies implied, was built not by pioneers, but by Catholic and Jewish immigrants, especially the gangsters and policemen of the big cities.

Thus, Martin Scorsese spent over 30 years and more than $100 million to film the 1928 book Gangs of New York to push his mobocentric theory of American history back into the mid-19th Century. The tagline for his movie was "America Was Born in the Streets."

In "Appaloosa," Harris portrays the marshal, a man whose gun hand gets steadier the more the adrenalin flows. He's honest, courageous, professional (he always reloads his six-shooter instantly after killing a bad guy -- you never know when you might need to shoot another one), and perhaps not quite right in the head. Mortensen is his deputy, better educated (a West Point grad), but content to follow his boss's lead because the marshal's slightly demented heroism provides him with a moral compass.

Both Westerns and Urbans offer promising plots for movies because they depict a Hobbesian world where life is full of interest. Modern crime movies are about the grim business of maintaining order. Westerns, in contrast, tend to be sunnier because they are about establishing order, forging a legitimate monopoly on violence.

The burghers of Appaloosa hire the pair to bring the law to their dusty town terrorized by the gang of a rich rancher turned brigand (played by Jeremy Irons, using Daniel Day-Lewis's I-drink-your-milkshake Mid-Atlantic accent). Harris and Mortensen pin on their silver stars, ask a few hotheads to come quietly, shoot those who won't, and soon order is instituted.

Then, disorder arrives on the train in the comely form of a tightly corseted widow, Renée Zellweger of "Chicago." The actress endures a lot of flak for her scrunched-up facial features, but she's well cast here as a seemingly refined lady. The widow is looking for a Wild West town with such a high male-female ratio that nobody will notice she's not Lillie Langtry while she's on the prowl for the reigning alpha male.

The deputy (Viggo) is immediately smitten by her, but she doesn't notice him because he's only a beta. When the marshal (Ed) briefly goes off his rocker and brutally beats a harmless barfly for using vulgar language in front of a lady, he wins her heart and they quickly marry. To her disappointment, that savage moment proves anomalous. Mostly, the marshal is good at keeping the peace. Bored, the missus starts looking for trouble, which, quickly enough, finds her.

Westerns usually have happy, yet bittersweet, endings. The law-enforcing man of violence triumphs, making the settlement finally safe for children and schoolmarms. The tamed town no longer needs a hero, so he rides off into the sunset, obsolete but majestic.

Ed Harris isn't the most expert of directors, but his chemistry with Mortensen overcomes the occasionally off-kilter editing and inadequate score, making "Appaloosa" the best traditional Western since Kevin Costner's "Open Range."

Rated a soft R for some violence and language.

My published articles are archived at iSteve.com -- Steve Sailer

January 9, 2009

"Rachel Getting Married"

Here's my review for The American Conservative of last fall's "Rachel Getting Married," which is still relevant because Anne Hathaway appears to be the frontrunner for the Best Actress Oscar. (This is the version as I wrote it, not as it was printed, so don't blame the magazine for my gratuitous scandal-mongering conclusion in which, based on almost no evidence, I insinuate that John McCain may have broken up P.J. O'Rourke's marriage to the real-life model for Anne Hathaway's character.)

Hollywood likes to squeeze a little more milk out of the DVD cow by occasionally re-releasing an old movie as an (inevitably longer) "Director's Cut." Sadly, we never get to buy a shorter "Editor's Cut." With luck, director Jonathan Demme's "Rachel Getting Married" will be the first. Buried under more than an hour of Demme's Sixties noodling is a nifty sixty-minute family drama.

Demme, who was born in 1944 (in between George Harrison and Keith Moon), was a sort of idiot savant music video genius, who in 1984 made the best ever rock concert movie, Talking Heads' "Stop Making Sense." His 1986 masterpiece "Something Wild" incorporated the nascent "world music" trend delightfully. Unfortunately, the title "Stop Making Sense" proved prophetic. Demme's shambolic 1992 Academy Award acceptance speech for "Silence of the Lambs" may be the most incomprehensible yet.

As Demme's musical-visual gifts dimmed, he turned to "liberal humanist" (i.e., boring) message movies such as "Philadelphia," in which Tom Hanks proves that homophobia caused the AIDS epidemic (rather than, say, industrial-scale gay promiscuity). After Demme's useless remakes of "Charade" in 2002 and "The Manchurian Candidate" in 2004, the industry seems to have concluded that he doesn't have enough brain cells left to handle a big production. Thus, the low budget "Rachel Getting Married" looks like an amateur wedding video. Half the film consists of Demme's not-as-hip-as-they-used-to-be friends improvising tedious toasts and mediocre music.

The movie's better half stars a charismatic Anne Hathaway (a heretofore-bland leading lady whose dark eyebrows made most of the impression in "The Devil Wears Prada") as Kym, an attentionaholic part-time model turned full-time drug addict who is furloughed from a posh rehab clinic for her sister's wedding. Exactly as her levelheaded sister Rachel dreads, Kym's self-destructive antics enthrall the multicultural throngs crowding the grounds of their father's Connecticut estate to prepare for Rachel's big day on which the Reform rabbi is to marry her to a tall, gentlemanly black man from Hawaii.

The highlight of the ceremony is the groom singing his bride a Neil Young ballad. White liberal critics have gone nuts over "Rachel" because the interracial marriage reminds them of a certain black Hawaiian's promise that promoting "mutual understanding" is "in my DNA." I fear, though, that even electing Obama President won't get many black guys to understand the appeal of whiny Canadian folk rockers from the Sixties.

First-time screenwriter Jenny Lumet named the groom "Sidney." She is presumably referencing both Sidney Poitier in Stanley Kramer's "Guess Who's Coming to Dinner," and her father, Sidney Lumet, director of 1957's "Twelve Angry Men," one of Kramer's successors as a liberal warhorse.

Various shocking revelations ensue about Kym's culpability in the death, a decade before, of their little brother, culminating in a confrontation with her mother (1980s legend Debra Winger of "An Officer and a Gentleman," making one of her myriad, but still welcome, comebacks). "Rachel Getting Married" has a decent little plot if you like upscale suburban family tragedies in the tradition of "Ordinary People." Lumet handles the disclosures about the death of the child realistically and effectively. Rather than build up to stagey moments, jagged shards of information are blurted out before you can prepare your emotional defenses.

Still, a more entertaining screenplay could be written about the star's off-screen misadventures. Hathaway was in the news in June when the FBI hauled away her suave Italian boyfriend, Raffaello Follieri. Outfitted with clerical cassocks and a claim to be the Vatican's chief financial officer, Follieri had wormed his way into a $100 million deal with Bill Clinton and Ron Burkle to sell off Roman Catholic churches in America to pay for sex scandal settlements. On a rented yacht in Montenegro, the bipartisan cute couple also hosted the 70th birthday party of John McCain.

An equally entertaining movie could be made about the real-life Lumet sisters (who are granddaughters of famed jazz vocalist and beauty Lena Horne). When their dad received his Lifetime Achievement Oscar in 2005, screenwriter Jenny, the sensibly dressed old-fashioned leftist, had the global television spotlight stolen from her by the startling new cleavage of her sister Amy, a would-be model and 1992 National Review contributor ("Baby Cons of America, unite: You have nothing to lose but your parents' guilt.") Interestingly, Amy Lumet's marriage to hard-partying conservative satirist P.J. O'Rourke broke up about when she is said to have worked for John McCain.

Now, Jenny / Rachel has taken sibling rivalry to a new level.

Rated R for language and brief sexuality.

Henry Harpending on how not to hunt a Cape Buffalo

Greg Cochran and Henry Harpending now have a website up for their new book The 10,000 Year Explosion: How Civilization Accelerated Human Evolution.

They've posted four outtakes from the book that didn't make the final draft for reasons of length. Here's part of a section intended to help readers understand what it must have been like for early humans to hunt big game with just spears:

Probably most of our readers don’t have personal experience with old-fashioned, Pleistocene-style big game hunting. The only place in which it is still possible – and not for much longer, at that – is Africa, where the big game had a chance to adapt as mankind gradually became formidable hunters and thus managed to survive until today. Without that experience, it’s hard to realize how remarkable Neanderthals were, how difficult hunting bison and elk with thrusting spears must have been. It’s not easy to appreciate the risks stone-age hunters had to take when they went after mammoths, rhinos, or Cape buffalo: it’s not exactly safe today, even with modern weapons. One of us (Henry Harpending), however, does have that experience, and the following note gives a flavor of what it’s like – particularly when you don’t have the faintest idea what you’re doing.

Encounter with a Buffalo

When I (HCH) was a graduate student in the 1960s I spent a year and a half in the northern Kalahari desert doing fieldwork with !Kung Bushmen, foragers who lived by gathering wild foodstuffs and hunting game animals. Along with several other graduate students, I had a base camp near the border with Southwest Africa (now Namibia), about 100 miles south of the Caprivi Strip on the northern border of Botswana. The nearest source of supplies was a two-day trip from our camp by four wheel drive truck.

Several weeks after the rainy season ended there were reports in the neighborhood of a Cape buffalo that was harassing people and animals. Often older males lose rank and leave the herd to wander by themselves, angry and uncomfortable. They are a threat to people and stock, especially horses.

We were out of meat in our camp, and so with the confidence and foolishness of youth we decided to hunt down the buffalo. We had visions of steaks and chops as well as many pounds of dried meat for travel rations and dog food. At that time permits for buffalo cost only a few dollars from the Botswana game department, and we had several. Although there were stories of buffalo being aggressive and dangerous to hunt, to my eye they were simply large cattle. Bushmen never hunted them with their poison arrow and spear technology, but they too were naïve and had great faith in our high-powered rifle.

One morning we set off to where the animal had last been reported. The party was a colleague, several young Bushman males, and myself. We soon picked up its tracks and for several hours followed its wanderings through the low thorny scrub. To me the tracks looked exactly like those of a cow, but the Bushmen never hesitated. When it was apparent at one point that there were no tracks at all in view I asked, and the Bushmen told me that there was no point in following the tracks since they knew exactly where it was going. We often saw this hunting with Bushmen – they used actual tracks as a guide but knew the habits of animals so well that they often proceeded on their own to pick up the actual tracks later on.

This went on for hours until, suddenly, a young man grabbed my shoulder and said “there it is.” I looked long and hard until I saw it, well camouflaged behind several yards of thick brush, sideways, staring hard at us with its bright pig eyes. It was about forty yards away.

As I brought the rifle up I was dismayed to realize that it still had a powerful telescopic sight. I should have removed it and used open iron sights in thick bush, but I had forgotten. With the magnification of the scope I saw a black mass surrounded by brush. It took a moment to locate the front legs, then the chest. Oriented, I aimed and fired. “Bang-whump”, the bang from the rifle and the whump as the bullet struck the buffalo. He jerked a little, then simply stood there staring at me. “Bang-whump, bang-whump” as I fired two more rounds.

Now he tossed his head and snorted, then started running toward us. Buffalo charge with their nose high, only lowering their head to use their horns on contact. I fired one more round at the charging animal, head on, simply pointing at him because he was so close, then turned and ran. We discovered later that the bullet had struck his shoulder, ricocheted off his scapula, and exited through the skin on his side. It certainly didn’t slow him down at all: I might as well have been shooting at a railroad locomotive.

There were three of us running away now from the charging animal: my colleague, our camp dog, and myself.

You can find out what happened here.

To see how tough Cape buffalo are, at the bottom of their excerpt is the now-famous "Battle at Kruger" YouTube video of a baby Cape buffalo's encounter with hungry lions and crocodiles.

My published articles are archived at iSteve.com -- Steve Sailer

"Happy-Go-Lucky"

You may have wondered what is this movie "Happy-Go-Lucky" that keeps winning year-end awards from critics -- e.g.:

British comedy "Happy-Go-Lucky" almost swept the 43rd Annual National Society of Film Critics Awards on Saturday, taking home four trophies including Best Director for Mike Leigh.

Here's my review from a couple of months ago in The American Conservative:

“Happy-Go-Lucky,” five-time Oscar nominee Mike Leigh’s “quirky” and “offbeat” comedy about a young London schoolteacher who is, yes, happy-go-lucky, has enjoyed the most unanimous critical acclaim of any film this year. All 31 “Top Critics” on the Rotten Tomatoes website have given “Happy-Go-Lucky” their personal thumbs up. Indeed, star Sally Hawkins has a shot at an Oscar nomination because Academy members like to vote for obscure British actresses in low budget movies nobody has seen, such as Imelda Staunton’s Best Actress nod for Leigh’s last film, “Vera Drake.”

Leigh, a Best Director nominee for 1996’s “Secrets and Lies,” prides himself on improvising slice-of-life leftwing movies about the English working class, which this Royal Academy of Dramatic Art graduate knows all about because his physician father had proletarian patients.

Since he doesn’t work from a script, investors are wary of bankrolling Leigh’s vague ideas. "My tragedy as a filmmaker now," he declaims, "is that there is a very limited ceiling on the amount of money anyone will give me to make a film.” So, the British National Lottery obligingly kicked in some of “Happy-Go-Lucky’s” budget.

Lotteries are notoriously a tax on stupidity; evidently, they are also a subsidy for vapidity because “Happy-Go-Lucky” is the worst movie by a prominent director since M. Night Shyamalan’s allergy allegory “The Happening.” Leigh’s film is smug, boring, plotless, and pointless, the perfect embodiment of the Obama Era of liberal self-congratulation.

To Leigh, Hawkins’s character “Poppy” is as adorable as the two Audreys: Hepburn in “Breakfast at Tiffany’s" and Tautou in “Amélie.” To me, Hawkins is insufferable. Imagine a “Star Wars” prequel in which a female Jar-Jar Binks hogs the screen for the entire two hours. Poppy smirks, snickers, and sniggers, mugging like Jim Varney in those old “Hey Vern” movies, an overgrown class clown laughing relentlessly at her own jokes, which are never, ever funny.

There’s nothing more excruciating than watching people onscreen laugh, especially when they crack themselves up. (What’s really funny is seeing characters mortified with embarrassment.) In general, happy people aren’t very funny and funny people aren’t very happy. A friend had dinner in the 1990s with the famous comic Jackie Mason, and reported that it was a grim ordeal. Mason spent the evening complaining about how Ed Sullivan had “ruined his career” in 1964.

And how exactly did Poppy, a North Londoner, acquire her quasi-Australian accent? Her youngest sister, a drunken law student, talks like Sid Vicious, but Poppy sounds like the Crocodile Hunter. In a male actor, a working class Australian accent sounds manly yet affable (that’s why the U.S.-born Mel Gibson normally plays his American roles with an unexplained hint of Down Under in his voice), but on a woman it just sounds tomboyish and goofy.

Most of Leigh’s movies have been about the oppression of the proletariat, but by 2008 their values are apparently ascendant in London. Any character who thinks about the future—such as Poppy’s one married, home-owning sister—is scorned as a buzz-kill.

Most people in “Happy-Go-Lucky” have pleasant government jobs. Judging from this movie, the British welfare state exists mostly so people with soft college degrees can have some place to hang out together while making plans for which pub or disco to go to after work.

The only plot device consists of Poppy’s weekly driving lessons with a tightly wound little fundamentalist Christian with bad teeth, played by Eddie Marsan. I initially assumed these two equally unattractive single people would wind up settling for each other, but when he insists she lock the car doors when two black youths bicycle by, he demonstrates (in Leigh’s mental universe) that he is morally unworthy of her, and probably a dangerous psycho to boot.

Instead, Leigh hooks her up with a school social worker, who is played by a ludicrously handsome young actor who looks like one of those towering Olympic swimming medalists with massively masculine jawlines molded by years of Human Growth Hormone abuse.

One vignette of this momentum-free movie unwittingly exemplifies the female cluelessness that has made Britain’s schools a dystopia of juvenile male thuggishness. When one of her students starts punching other children, does Poppy punish him? No, she signs the bully up for counseling, which consists of three adults—the headmistress, Poppy, and her future boyfriend—sitting around praising the little lout and asking him what’s the real reason he hits people. (Actual answer: it’s fun.)

Rated R for language.

The other elderly British leftwing low budget improvisatory director whose last name begins with an L is Ken Loach, who directed "Land and Freedom," about the Spanish Civil War, and "The Wind That Shakes the Barley," about the Irish Civil War. Perhaps a bit of a clunky director, but I find Loach more likable than Leigh.

My published articles are archived at iSteve.com -- Steve Sailer

January 8, 2009

Rove calling the kettle black

Karl Rove writes in the Wall Street Journal that the Mortgage Meltdown wasn't Bush's fault:

President Bush Tried to Rein In Fan and Fred:
Democrats and the media have the housing story wrong.

... Some critics blame Mr. Bush because he supported broadening homeownership. But Mr. Bush’s goal was for people to own homes they could afford, not ones made accessible by reckless lenders who off-loaded their risk to GSEs [Government-Sponsored Enterprises]....

As one of those critics, let me point out to Mr. Rove that Mr. Bush didn't attempt to rein in Fannie Mae and Freddie Mac when it came to minority lending. In fact, he egged them on. From Bush's June 17, 2002 speech to St. Paul's African Methodist Episcopalian church:

Now, we've got a problem here in America that we have to address. Too many American families, too many minorities do not own a home. There is a home ownership gap in America. The difference between Anglo America and African American and Hispanic home ownership is too big. (Applause.) And we've got to focus the attention on this nation to address this.

And it starts with setting a goal. And so by the year 2010, we must increase minority home owners by at least 5.5 million. In order to close the homeownership gap, we've got to set a big goal for America, and focus our attention and resources on that goal. (Applause.)...

I want to thank Franklin Raines, of Fannie Mae and Leland Brendsel of Freddie Mac. Thank you all for coming. (Applause.)...

Three-quarters of white America owns their homes. Less than 50 percent of African Americans are part of the homeownership in America. And less than 50 percent of the Hispanics who live here in this country own their home. And that has got to change for the good of the country. It just does. (Applause.) And so here are some of the ways to address the issue. First, the single greatest barrier to first time homeownership is a high downpayment. It is really hard for many, many, low income families to make the high downpayment. ...

And let me talk about some of the progress which we have made to date, as an example for others to follow. First of all, government sponsored corporations that help create our mortgage system -- I introduced two of the leaders here today -- they call those people Fannie Mae and Freddie Mac, as well as the federal home loan banks, will increase their commitment to minority markets by more than $440 billion. (Applause.) I want to thank Leland and Franklin for that commitment. It's a commitment that conforms to their charters, as well, and also conforms to their hearts.

This means they will purchase more loans made by banks to African Americans, Hispanics and other minorities, which will encourage homeownership. Freddie Mac will launch 25 initiatives to eliminate homeownership barriers. Under one of these, consumers with poor credit will be able to get a mortgage with an interest rate that automatically goes down after a period of consistent payments. (Applause.)

Fannie Mae will establish 100 partnerships with faith-based organizations that will provide home buyer education and help increase homeownership for their congregations. I love the partnership. (Applause.)

My published articles are archived at iSteve.com -- Steve Sailer

January 7, 2009

Genes being incorporated in federal longitudinal social studies

The federal government runs a number of gigantic multi-decade human sciences studies of Americans, the best known being the National Longitudinal Survey of Youth 1979 (NLSY79), which was featured prominently in The Bell Curve in 1994 but is still going on, with IQ scores now available on thousands of the children of the original sample.

Newer studies are including genetic data. In the Chronicle of Higher Education, Christopher Shea reports in "The Nature-Nurture Debate, Redux":

What has led to the new genetic turn in sociology, at least among a minority? In part it has to do with the availability of important new data sets. The National Longitudinal Study of Adolescent Health, aka Add Health, for example, at Chapel Hill, was designed from the start to incorporate both sociological and genetic information. It was begun, in 1994, by Bearman, J. Richard Udry, and Kathleen Mullan Harris. The idea was to capture as much information as possible about the social circumstances, friendship networks, and family conditions of 21,000 teenagers in 132 schools, from grades 7 through 12. The survey included a disproportionate number of twins, both fraternal and identical, full- and half-siblings, and adopted kids, allowing preliminary analyses of the heritability of traits. Follow-up interviews were conducted a year later.

Then, for the third wave of the study (in 2002), 2,500 siblings were asked for DNA samples (via cheek swabs). In wave four, now in progress and run by Harris, DNA is being sought from all participants (now they can just spit in a tube). Many of the papers in the AJS issue draw on the Add Health study.
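
That twin-and-sibling oversampling is what makes even crude heritability estimates possible. Here is a minimal sketch of the logic, using Falconer's classic formula with made-up correlations (my illustration, not anything from Add Health):

    # Falconer's formula: h^2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ are
    # the trait correlations for identical and fraternal twin pairs.
    def falconer_h2(r_mz, r_dz):
        """Rough heritability estimate from twin correlations."""
        return 2.0 * (r_mz - r_dz)

    # Hypothetical correlations: 0.75 for identical pairs, 0.45 for fraternal
    print(falconer_h2(0.75, 0.45))  # -> 0.6, i.e., roughly 60% heritable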

The article mentions various findings on the influence of genes, such as The Gene for Not Getting Any, but I don't like to trumpet early research in behavioral genetics, since so much of it doesn't pan out. The important point is that we are slowly developing the tools to answer nature-nurture questions fairly definitively. Of course, this raises the question of whether the results are slanted in favor of those who possess The Gene for Agreeing to Have Your Genes Sampled. (Just kidding.)

It should be possible to ask the best known tracked sample, the NLSY79 participants, for genetic samples in an upcoming re-interview, but I don't know of any plans for doing that. The cost of genetic sampling is dropping rapidly but it's still awfully high for doing full scans on thousands.

The upcoming National Children's Study will be gigantic: 100,000 kids (including 3,000 pairs of twins), tracked from before birth up through age 21, with participation of mothers and, sometimes, fathers. It will be primarily focused on environmental impacts on kids' health, but it appears that they will have to do both genetic and IQ ("cognitive") testing to answer their questions, such as whether chronic exposure to insecticides hurts cognitive function.

So, as the evidence rolls in, expect persecution of realists by Blank Slate Creationists to rise to new heights.

My published articles are archived at iSteve.com -- Steve Sailer

January 6, 2009

Black Swan Sighting

Liz Claman of Fox Business interviews investment guru David Swensen, who has guided Yale's endowment ($20 billion last summer) to (until recently) consistently (and, to my mind, suspiciously) gigantic returns (17.8% per year for ten years):

Claman: Isn't it fair to say right now we face what some call a Black Swan event? This term "Black Swan" indicates something we rarely ever see. Has it taken you by surprise?

Swensen: You're absolutely right to characterize it as the Black Swan event. By their nature, these events have to take people by surprise.

Don't blame me, it was a Black Swan!
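
By the way, compound that return out and you see how gigantic it is (a one-line check of my own):

    # 17.8% a year, compounded for ten years
    print(1.178 ** 10)   # ~5.15: each endowment dollar became about $5.15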

My published articles are archived at iSteve.com -- Steve Sailer

January 5, 2009

Black Swans and Tournaments

A point I want to make more clearly: one major reason it is so hard to accurately predict the events people are most interested in is that many of those events are the result of some kind of tournament.

We are fascinated by tournaments. (Just look at all the complaints that tonight's college football championship game only represents a quasi-tournament rather than an explicit tournament like the NCAA basketball championships).

So many of the things we most want experts to predict for us are explicit tournaments (e.g., the Super Bowl playoffs) that have been carefully designed to create maximum uncertainty in the later, more climactic rounds by matching the best contestants against each other.

For example, in about 90 or 100 tries, a #16 seed in the men's NCAA basketball tournament has never upset a #1 seed in the opening round, so basketball games are actually quite predictable when there is a fair-sized difference in quality between teams as determined by their seasonal performance. But subsequent rounds become less predictable as the quality gap narrows, so public interest builds.
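
To illustrate with a toy model (my invention, not real NCAA data): suppose the favorite's chance of winning a game rises logistically with the seed gap. Then the opening round is a near-certainty while a final between near-equals approaches a coin flip:

    # Toy model: the stronger team's win probability as a logistic
    # function of the seed gap (scale chosen arbitrarily for illustration)
    import math

    def win_prob(gap, scale=3.0):
        return 1.0 / (1.0 + math.exp(-gap / scale))

    print(win_prob(15))  # #1 vs. #16: ~0.99 -- practically a sure thing
    print(win_prob(1))   # #1 vs. #2:  ~0.58 -- nearly a coin flip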

Or, the things we are interested in can be semi-explicit tournaments (e.g., the Presidential primary/general election process).

Or, unplanned events take on some of the nature of tournaments.

For example, people in the 19th Century were utterly fascinated by the Battle of Waterloo (June 18, 1815), which determined the basic political arrangements of Europe up through 1914. It was often remarked that the next century of European dominance was determined by the events of a few minutes in the crisis of the battle in which Napoleon's hitherto-undefeated Imperial Guard nearly broke through the British lines, but were stopped just short. Then, they faltered, broke, and ran.

Waterloo -- which Wellington called "a damned nice thing -- the nearest run thing you ever saw" -- was seen as evidence against large-scale deterministic theories of history, since so much depended upon something so close.

Contributing to Waterloo's fame were its numerous tournament-like aspects. For example, Bonaparte was the old champion making a stunning comeback. Wellington was the challenger who had never faced Napoleon before, but had worked his way up to the top by defeating his best marshals.

Finally, much of what interests us is forged by vaguely tournament-like processes. For example, stock prices are, in effect, the result of competitions between those who think the price is too low and those who think it is too high.

On the other hand, the kind of phenomena that the social sciences (and much of public policy) are concerned with -- crime rates, test scores, and the like -- tend not to be very tournament-like at all, and thus tend to be fairly predictable.

My published articles are archived at iSteve.com -- Steve Sailer

The Brown Swan

My critique in VDARE.com of Nassim Nicholas Taleb's bestseller "The Black Swan: The Impact of the Highly Improbable" tries to walk the delicate line of giving the book credit while explaining some of the ways it will be misinterpreted -- especially its title phrase.

In From Dawn to Decadence, 94-year-old historian Jacques Barzun offered a dozen dictums on pp. 655-656 summarizing what he's learned from three quarters of a century of scholarship. One was:

"The potent writings that helped to reshape minds and institutions in the West have done so through a formula or two, not always consistent with the text. Partisans and scholars start to read the book with care after it has done its work."

(By the way, this is certainly true of Barack Obama's autobiography, which has "done its work" without being carefully read!)

A reader explains how Taleb's new catchphrase "Black Swan" is being rapturously greeted on Wall Street by the very people who poured billions into subprime mortgages in Compton. Hey, it's not their fault they didn't see all those defaults coming: it was a Black Swan!

I would emphasize that, in my opinion, the Black Swan is a timely rationalization for the gross incompetence seen across finance (both the more private part and their governmental overseers) regarding very predictable events. That is, it is wrong in principle because, as you state, the disaster(s) should have been expected (or rather, the "black swan" event would have been loaning to bad credit risks and having them actually pay the loans back, not the other way around).

Furthermore, you focus on residential mortgages and minority ownership (i.e., given VDare’s emphasis), but, of course it also applies to commercial mortgages, credit card debt, etc. In short, Taleb has given incompetent and/or corrupt finance types (especially “quants”) an easy out. For example, let’s say you are a risk manager at [gigantic but inept financial institution] (which I was), and you missed seeing, as you point out, that based on a normal distribution the default rate for Mexicans is on average X% (which is significantly more than for your typical founding stock American). Basic probabilities based on normal distributions would suggest you were a fool for not seeing a wave of defaults coming, but then you now have Taleb’s ‘black swan’ event to explain yourself.

I will now give you personal insight into this. I worked at [humongous Wall Street money pit] (until February of 2008) on what was called a "credit specific risk" add-on to their Value-at-Risk model. My focus was covering Credit Default Swaps ("CDSs") and related credit derivatives (CDOs, CDO-squareds, etc.) and non-derivatives. After the financial markets began their meltdown (which continues predictably to this day, and beyond) in late July/early August of 2007, the head of Market Risk asked me if I had read The Black Swan. I told him that I had not, and asked if he had read [Taleb's earlier book] Fooled by Randomness. ... By Thanksgiving almost every high-level risk manager on Wall Street had read (or said they had read) The Black Swan. In hindsight, it is all so clear: here is a book that essentially goes on and on about what is a true but normally trivial point, by definition.

In effect, Taleb gave the elites who screwed up on a monumental scale an easy out. For example: "Yes, I'm head of risk for Merrill, and some say I should have seen it coming, but surely you have read 'The Black Swan' and now understand how that would have been impossible." In short, Taleb has given all of finance the copout they need when they most needed it (i.e., everyone -- practitioners, academics, regulators, etc.).
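
To make the reader's point concrete, here's a back-of-the-envelope sketch with invented numbers: once the base default rate is high, a wave of defaults is the central expectation, not a tail event.

    # If each of n similar loans defaults independently with probability p,
    # the default count is binomial: mean n*p, std dev sqrt(n*p*(1-p)).
    # The n and p below are hypothetical.
    import math

    n, p = 100_000, 0.15
    mean = n * p
    sd = math.sqrt(n * p * (1 - p))
    print(mean, sd)          # 15000.0 112.9...
    print(mean + 3 * sd)     # ~15339: even a 3-sigma "surprise" is a wave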

Attributing the worthlessness of your mortgage-backed security full of 2006-vintage Sand State subprime loans to a "Black Swan" is, in effect, a lot like blaming it on "Sh*t Happens," but it makes you sound erudite rather than stoned.

As today's WSJ article "Housing Push for Hispanics Spawns Wave of Foreclosures" suggests, the mortgage meltdown wasn't an unpredictable Black Swan at all. That rapidly Hispanicizing regions (the great majority of defaulted mortgage dollars came from just four states -- California, Arizona, Nevada, and Florida) turned out to be full of people who couldn't pay back their giant mortgages wasn't impossible to forecast: instead, it was, to coin a term, a Brown Swan -- a predictable disaster that goes unforeseen due to pervasive political correctness.

My published articles are archived at iSteve.com -- Steve Sailer

Gaza and Barrage Balloons

When a war broke out between Israel and Hezbollah in South Lebanon in the summer of 2006, war fever in the American press reached frightening levels. For a few weeks, there seemed a very real threat that this frenzy would push America into war with Hezbollah's supporter Iran.

So, that month I spent a lot of time writing about how ridiculous this all was, how it's not 1938 again, how the Middle East is less a powder keg than a powder thimble, how America has roughly half the defense spending in the world, how Iran barely has an air force, how war is decliningly profitable, etc., etc. In the indirect way my writing works, I may have helped deflate that dangerous war bubble.

This time around, fortunately, there doesn't seem to be as much media mania in the U.S.

I wonder why?

Perhaps it's just the even more extreme one-sidedness of the conflict; or the lack of a credible Muslim sponsor country for the enthusiasts to demand that America bomb; or the sense of deja vu, the feeling that this is just depressing and boring business as usual. Weirdly, I have a vague hunch that the lack of insanity in the press is in some way connected to Bernie Madoff, ridiculous as that sounds.

All that said, Gaza is an important worst case stress test of the advantages of separatism. The Israelis built a fairly effective fence around Gaza that more or less prevents suicide bombers from getting into Israel. They've removed the Israeli settlers from Gaza. Now, their main problem is Gazans lobbing explosives over the fence into Israel. It's in everybody's interest to help them come up with an effective solution for that.

We know that the long term solution is, in the words of newspaper magnate Lord Copper in Waugh's Scoop, "the Beast stands for strong, mutually-antagonistic governments everywhere." Nobody in Jordan or Syria shoots stuff at Israel anymore because the governments of Jordan and Syria know that the Israelis will come and break the government's shiny war weapons, so the governments keep their hotheads under control. I'm not sure how they do it, and I'm not sure I want to know. But, they do it.

Unfortunately, that's a long way off in the case of Gaza, the West Bank, and Lebanon. The problem is that the political process by which strong governments will eventually emerge in these lands will no doubt be a long struggle with Israel, in which various bravos demonstrate their courage and patriotic bona fides by attacking Israel, bringing about Israeli reprisals, which in turn stoke anti-Israeli fanaticism, and so on. Presumably, somebody will eventually come out so securely on top that he can then call it off and start living above ground again, but that could be a long, long way off.

So, I've been trying to think of a technical solution to the problem of people in the Gaza Strip shooting locally made unguided missiles at Israeli towns nearby. From 2001 through 2008, 15 people were killed by Qassam rockets fired from Gaza.

This has not been a gigantic problem so far for Israelis, in part because most of the missiles are so short range that they can only reach a single Israeli town, which the government of Israel has been fortifying. The Israelis have the technology to track a rocket back to its launch site and place an explosive on that spot within a few minutes. This means that the Palestinians typically shoot and scoot, which in turn means that they can't calibrate their fire. With unguided high trajectory weapons, such as mortars, artillery, and the Gazans' rockets, to actually hit your target, you need to stay in one place and, taking guidance from forward observers, fire again and again, methodically walking the impacts up to the target. But the Gazans are terrified of dying from Israeli counter-fire, so they prop up their missile in an orchard, point it in the general direction of that Israeli town, and drive away. So, their accuracy doesn't improve.

If the Gazans were to get guided missiles (with longer ranges), that could prove a much larger problem for Israel. On the other hand, those are expensive, and the Gaza Strip doesn't currently have the industrial base to make them, so they'd have to be imported. But Israel's fear of Palestinians importing better missiles encourages Israel to keep a clampdown on Gaza imports, inflicting much economic pain, which just encourages Gazans to fire missiles at Israel.

So, an effective Israeli anti-missile defense system would be beneficial.

Israel intends to deploy the "Iron Dome" anti-missile system in the Gaza area by 2010, but there are some drawbacks. First, it won't be able to protect the Israeli town closest to the border, since it takes 15 seconds to launch an interceptor and the total flight time to that nearby target is less than that. Second, each Iron Dome anti-missile missile costs about $100,000, so it's an expensive way to counter home-made rockets.
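
A back-of-the-envelope calculation (my arithmetic, assuming drag-free ballistics and a 45-degree launch angle) shows why the closest town is out of luck:

    # For an unguided rocket launched at 45 degrees in a vacuum,
    # range R = v^2/g, so total flight time T = sqrt(2*R/g).
    import math

    g = 9.8
    for R in (1000, 2000, 5000, 10000):   # range in meters
        print(f"{R/1000:4.1f} km: ~{math.sqrt(2 * R / g):4.1f} s aloft")

    # At 1-2 km the whole flight lasts only ~14-20 seconds, so a system
    # needing 15 seconds just to launch has almost no intercept window.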

I've long wondered whether guns wouldn't be more cost-effective anti-missile weapons than missiles. The usual advantage of a rocket is that it continues to accelerate after launch, allowing it to reach higher ultimate speeds, whereas a gun's projectile is fastest as it leaves the barrel and slows from then on. When you need very rapid response, however, perhaps guns are the better technology, perhaps combined with some sort of guidance system for the projectile. One downside of guns is that they tend to have high fixed costs, while missiles have high variable costs, but this kind of chronic situation seems ideal for a few fixed high-tech guns. Also, in the Gaza area, they could be aimed so that projectiles that miss come down harmlessly in the sea.
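
The trade-off is easy to sketch with invented numbers: a gun battery's big fixed cost gets amortized over cheap shots, so past some number of engagements it beats $100,000 interceptors.

    # Toy fixed-vs-variable cost comparison; all figures are hypothetical
    # except the ~$100,000 interceptor cost cited above.
    gun_fixed, gun_per_shot = 30_000_000, 1_000
    missile_per_shot = 100_000

    # Break-even n solves: gun_fixed + n*gun_per_shot = n*missile_per_shot
    n = gun_fixed / (missile_per_shot - gun_per_shot)
    print(round(n))   # ~303 engagements; beyond that, the guns are cheaper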

Anyway, I don't know whether current gunpowder guns would work at all, or whether this kind of anti-missile gun defense would be dependent on the final development of a practical railgun, which was one of those war-winning wonder weapons the Germans tinkered with way back in WWII instead of developing a tank with the cost-quality effectiveness of the Russian T-34.

However, there's another old defensive technology that might be updatable with modern electronics into an even better solution: barrage balloons. During the Blitz in 1940, the British launched 1,400 balloons anchored by heavy cables; German airplanes flying under 5,000 feet risked colliding with the metal cables. The balloons were modestly effective against the plague of V-1 buzz bomb cruise missiles that the Germans fired at London later in the war, destroying 231 of them. The Germans, however, cleverly built wire cutters into the wings of the V-1.

My thought is that high-tech barrage balloons could defend Israeli towns against missiles in a different way than simply relying upon impact with the cable (a method that assumes the flying attacker has wings, which missiles don't). Instead, they could be used to pre-position anti-missile shrapnel charges at various altitudes. As a missile from Gaza is launched, Israeli radar could choose which of the floating charges to detonate.
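
The selection step would be simple in principle. A minimal sketch (pure speculation on my part, just to make the scheme concrete): detonate whichever tethered charge the radar-predicted trajectory passes closest to, if it comes within lethal radius.

    # trajectory: predicted (x, y, z) points in meters
    # charges: dict of name -> (x, y, z) positions of the aerial charges
    from math import dist

    def choose_charge(trajectory, charges, kill_radius=50.0):
        best_name, best_miss = None, float("inf")
        for name, pos in charges.items():
            miss = min(dist(pos, p) for p in trajectory)  # closest approach
            if miss < best_miss:
                best_name, best_miss = name, miss
        return best_name if best_miss <= kill_radius else None

    # Hypothetical rocket arc and two charges; "A" is ~30 m off the path
    path = [(i * 100.0, 0.0, 400.0 - abs(i - 10) * 20.0) for i in range(21)]
    print(choose_charge(path, {"A": (900.0, 0.0, 350.0),
                               "B": (1500.0, 250.0, 300.0)}))  # -> A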

Does that make any sense?

My published articles are archived at iSteve.com -- Steve Sailer

January 4, 2009

My review of Taleb's "The Black Swan"

Here is my new review in VDARE.com of Nassim Nicholas Taleb's influential book on risk, "The Black Swan."

My published articles are archived at iSteve.com -- Steve Sailer

WSJ: "Housing Push for Hispanics Spawns Wave of Foreclosures"

From the Wall Street Journal:

"Housing Push for Hispanics Spawns Wave of Foreclosures"

California Rep. Joe Baca has long pushed legislation he said would "open the doors to the American Dream" for first-time home buyers in his largely Hispanic district. For many of them, those doors have slammed shut, quickly and painfully.

Mortgage lenders flooded Mr. Baca's San Bernardino, Calif., district with loans that often didn't require down payments, solid credit ratings or documentation of employment. Now, many of the Hispanics who became homeowners find themselves mired in the national housing mess. Nearly 9,200 families in his district have lost their homes to foreclosure.

For years, immigrants to the U.S. have viewed buying a home as the ultimate benchmark of success. Between 2000 and 2007, as the Hispanic population increased, Hispanic homeownership grew even faster, increasing by 47%, to 6.1 million from 4.1 million, according to the U.S. Census Bureau. Over that same period, homeownership nationally grew by 8%. In 2005 alone, mortgages to Hispanics jumped by 29%, with expensive nonprime mortgages soaring 169%, according to the Federal Financial Institutions Examination Council.

An examination of that borrowing spree by The Wall Street Journal reveals that it wasn't simply the mortgage market at work. It was fueled by a campaign by low-income housing groups, Hispanic lawmakers, a congressional Hispanic housing initiative, mortgage lenders and brokers, who all were pushing to increase homeownership among Latinos.

What about President Bush and his 2002 White House Conference on Minority Homeownership, where he called for adding 5.5 million Hispanic and black homeowners by cutting back on barriers to the American Dream, such as down payments?

The network included Mr. Baca, chairman of the Congressional Hispanic Caucus, whose district is 58% Hispanic and ranks No. 5 among all congressional districts in percentage of home loans not tailored for prime borrowers. The caucus launched a housing initiative called Hogar -- Spanish for home -- to work with industry and community groups to increase mortgage lending to Latinos. Mortgage companies provided funding to that group, and to the National Association of Hispanic Real Estate Professionals, which fielded an army to make the loans.

In years past, minority borrowers seeking loans were often stopped cold by a practice called red-lining, in which lenders were reluctant to lend within particular geographical areas, often, it appeared, on the basis of race. But combined efforts to open the mortgage pipeline to Latinos proved successful.

"We saw what we refer to in the advocacy community as reverse red-lining," says Aracely Panameno, director of Latino affairs for the Center for Responsible Lending, an advocacy group. "Lenders were seeking out those borrowers and charging them through the roof," she says.

Ms. Panameno says that during the height of the housing boom she sought to present the Hispanic Caucus with data showing how many Latinos were being steered into risky and expensive subprime loans. Hogar declined her requests, she says.

A very large fraction of the people steering Hispanics into risky and expensive subprime loans were Hispanics, so it's hardly surprising that their political representatives weren't interested in hearing about predatory lending abuses. Hispanic mortgage brokers, real estate agents, and construction workers were making a killing off easy credit, so why kill the goose that laid the golden egg?

When the national housing market began unraveling, so did the fortunes of many of the new homeowners. National foreclosure statistics don't break out data by ethnicity or race. But there is evidence that Hispanic borrowers have been hard hit. In part, that's because of large Hispanic populations in areas where the housing bubble was pronounced, such as Southern California, Nevada and Florida.

And why was the Housing Bubble pronounced in those areas with large Hispanic populations? In the propaganda of the time, population growth was constantly cited as justifying rising home prices, but there was no mention of whether these new people had the earning capacity to pay back their mortgages.

In U.S. counties where Hispanics account for more than 25% of the population, banks have taken back 6.7 homes per 1,000 residents since Jan. 1, 2006, compared with 4.6 per 1,000 residents in all counties, according to a Journal analysis of U.S. Census and RealtyTrac data.

Hispanic lawmakers and community groups have blamed subprime lenders, who specialize in making loans to customers with spotty credit histories. They complain that even solid borrowers were steered to those loans, which carry higher interest rates.

In a written statement, Mr. Baca blamed the foreclosure crisis among Hispanics on borrowers' lack of "financial literacy" and on "lenders and brokers eager to make a bigger profit." He declined to be interviewed for this story.

But a close look at the network of organizations pushing for increased mortgage lending reveals a more complicated picture. Subprime-industry executives were advisers to the Hogar housing initiative, and bankrolled more than $2 million of its research. Lawmakers and advocacy groups pushed hard for the easy credit that fueled the subprime phenomenon among Latinos. Members of the Congressional Hispanic Caucus, who received donations from the lending industry and saw their constituents moving into new homes, pushed for eased lending standards, which led to problems.

Mortgage lenders appear to have regarded Latinos as a largely untapped demographic. Many were first or second-generation U.S. residents who didn't own homes. Many Hispanic families had multiple wage earners working multiple cash jobs, but had no savings or established credit history to allow them to qualify for traditional loans.

The Congressional Hispanic Caucus created Hogar in 2003 to work with industry and community groups to increase mortgage lending to Latinos. At that time, the national Latino homeownership rate was 47%, compared with 68% for the overall population. Hogar called the figure "alarming," and said a concerted effort was required to ensure that "by the end of the decade Latinos will share equally in the American Dream of homeownership."

Hogar's backers included many companies that ran into trouble in mortgage markets: Fannie Mae and Freddie Mac, both now under federal control; Countrywide Financial Corp., sold last year to Bank of America Corp.; Washington Mutual Inc., taken over by the government and sold to J.P. Morgan Chase & Co.; and New Century Financial Corp. and Ameriquest Mortgage Corp., both now defunct.

Hogar's ties to the subprime industry were substantial. A Washington Mutual vice president served as chairman of its advisory committee. Companies that donated $150,000 a year got the right to place a research fellow who would conduct Hogar's studies, which were used by industry lobbyists. For donations of $100,000 a year, Hogar offered to provide news releases from the Hispanic Caucus promoting a lender's commercial products for the Latino market, according to the group's literature.

Hogar worked with Freddie Mac on a two-year examination of Latino homeownership in 63 congressional districts. The study found Hispanic ownership on the rise thanks to "new flexible mortgage loan products" that the industry was adopting. It recommended further easing of down-payment and underwriting standards.

Representatives for Hogar declined repeated requests for comment.

The National Association of Hispanic Real Estate Professionals, one of Hogar's sponsors, advised the group, shared research data and built a large membership to market loans to Latinos. By 2005, its ranks had grown to 16,000 agents and mortgage brokers.

The association, called Nahrep, received funding from some of the same players that funded Hogar. Some 22 corporate sponsors, including Countrywide and Washington Mutual, together paid the association $2 million a year to attend conferences and forums where lenders could pitch their loan products to loan brokers.

While home prices were rising, the lending risk seemed minimal, says Tim Sandos, Nahrep's president. "We would say, 'Is he breathing? OK, we'll give him a mortgage,' " he recalls.

Nahrep's 2006 convention in Las Vegas was called "Place Your Bets on Home Ownership." Countrywide Chairman Angelo Mozilo spoke, as did former Housing and Urban Development Secretary Henry Cisneros, a force in Latino housing developments in the West.

The words "Las Vegas" constantly pop up in these kind of articles.

Countrywide and other sponsors contracted with Nahrep to set up regional events where they could present loan products to loan brokers and their customers. Mr. Sandos says his organization doesn't get paid to promote particular lenders.

At the height of the subprime lending boom, in 2005, banking and finance companies gave at least $2.3 million in campaign contributions to members of the Hispanic Caucus, according to data from the Center for Responsive Politics.

In October 2008, a charitable foundation set up by Mr. Baca received $25,000 from AmeriDream Inc., a nonprofit housing company and Hogar sponsor. Mr. Baca has long backed AmeriDream's controversial seller-financed down-payment assistance program. AmeriDream provided down-payment money to buyers, a cost that was covered by home builders in the form of donations to the nonprofit.

This is a tax evasion scam, often organized through minority charities, such as churches. The Bush Administration tended to push it as "compassionate conservatism."

New housing legislation last fall outlawed the program. Mr. Baca is cosponsoring a bill that would allow AmeriDream and similar nonprofits to resume arranging seller-financed down-payment assistance to low-income Federal Housing Administration borrowers.

Such seller-financed loans comprise one-third of the loans backed by the FHA, and have defaulted at nearly triple the rate of other FHA-insured loans, according to agency spokesman William Glavin.

[prime candidates chart]

In a news release, AmeriDream said the donation to Mr. Baca's foundation was intended to fund the purchase of gear for firefighters in his district. Local news reports say the foundation gave away $36,000 in scholarships this year.

Internal Revenue Service records indicate that Mr. Baca's son, Joe Baca Jr., has an annual salary of $51,800 as executive director of the Joe Baca Foundation, which is run out of the congressman's home. Joe Baca Jr. says he currently is taking only about half that listed salary.

Mr. Baca's office declined to comment on the AmeriDream contribution.

Mr. Baca remains opposed to strict lending rules. "We need to keep credit easily accessible to our minority communities," he said in a statement released by his office.

Mortgage lending to Hispanics took off between 2004 and 2007, powered by nonprime loans. The biggest jump occurred in 2005.

If this had been merely a cynical re-election ploy by the Bush Administration, Bush could have pulled the plug on it the day after the November 2004 election. But, instead, this was a practically universal delusion among the Great and the Good. To Karl Rove, it was a permanent good idea that would bring about long-term realignment by making Hispanics into home-owning Republicans. To Democrats, it was pork for their constituents.

The 169% increase in nonprime mortgages to Hispanics that year outpaced a 122% gain for blacks, and a 110% increase for whites, according to a Journal analysis of mortgage-industry and federal-housing data. Nonprime mortgages carry high interest rates and are tailored to borrowers with low credit scores or few assets.

Between 2004 and 2007, black borrowers were offered nonprime loans at a slightly higher rate than Hispanics, but the overall number of Hispanic borrowers was much larger. From 2004 to 2005, total nonprime home loans to Hispanics more than tripled to $69 billion from $19 billion, and peaked in 2006 at $73 billion.

Tricks of the Trade

Mortgage brokers became a key portion of the lending pipeline. Phi Nguyen, a former broker, worked at two suburban Washington-area firms that employed hundreds of loan originators, most of them Latino. Countrywide and other subprime lenders sent account representatives to brokerage offices frequently, he says. Countrywide didn't respond to calls requesting comment.

Representatives of subprime lenders passed on "little tricks of the trade" to get borrowers qualified, he says, such as adding a borrower's name to a relative's bank account, an illegal maneuver. Mr. Nguyen says he's now volunteering time to help borrowers facing foreclosure negotiate with banks.

Many loans to Hispanic borrowers were based not on actual income histories but on a borrower's "stated income." These so-called no-doc loans yielded higher commissions and involved less paperwork.

Another problem was so-called NINA -- no income, no assets -- loans. They were originally intended for self-employed people of means. But Freddie Mac executives worried about abuse, according to documents obtained by Congress. The program "appears to target borrowers who would have trouble qualifying for a mortgage if their financial position were adequately disclosed," said a staff memo to Freddie Mac Chairman Richard Syron. "It appears they are disproportionately targeted toward Hispanics."

Freddie Mac says it tightened down-payment requirements in 2004 and stopped buying NINA loans altogether in 2007.

"It's very hard to get in front of a train loaded with highly profitable activities and stop it," says Ronald Rosenfeld, chairman of the Federal Housing Finance Board, a government agency that regulates home loan banks.

Regions of the country where the housing bubble grew biggest, such as California, Nevada and Florida, are heavily populated by Latinos, many of whom worked in the construction industry during the housing boom. When these markets began to weaken, bad loans depressed the value of neighboring properties, creating a downward spiral. Neighborhoods are now dotted with vacant homes.

By late 2008, one in every nine households in San Joaquin County, Calif., was in default or foreclosure -- 24,049 of them, according to Federal Reserve data. Banks have already taken back 55 of every 1,000 homes. In Riverside, Calif., 66,838 houses are owned by banks or were headed in that direction as of October. In Prince William County, Va., a Washington suburb, 11,685 homes, or one in 11, were in default or foreclosure.

Gerardo Cadima, a Bolivian immigrant who works as an electrician, bought a home in suburban Virginia for $330,000, with no money down. "I said this is too good to be true," he recalls. "I'm 23 years old, with a family, buying my own house."

When work slowed last year, Mr. Cadima ran into trouble on his adjustable-rate mortgage. "The payments were increasing, and the price of the house was starting to drop," he says. "I started to think, is this really worth it?" He stopped making payments and his home was sold at auction for $180,000.

In the wake of the housing slump, some participants in the Hispanic lending network are expressing second thoughts about the push. Mr. Sandos, head of Nahrep, says that some of his group's past members, lured by big commissions, steered borrowers into expensive loans that they couldn't afford.

Nahrep has filed complaints with state regulators against some of those brokers, he says. Their actions go against Nahrep's mission of building "sustainable" Latino home ownership.

These days, James Scruggs of Northern Virginia Legal Services is swamped with Latino borrowers facing foreclosure. "We see loan applications that are complete fabrications," he says. Typically, he says, everything was marketed to borrowers in Spanish, right up until the closing, which was conducted in English.

"We are not talking about people working for the World Bank or the IMF," he says. "We are talking about day laborers, janitors, people who work in restaurants, people who do babysitting."

Two such borrowers work in Mr. Scruggs's office. Sandra Cardoza, a $28,000-a-year office manager, is now $30,000 in arrears on loans totaling $370,000. "Her loan documents say she makes more than me," says Mr. Scruggs.

Nahrep agents are networking on how to negotiate "short sales" to banks, where Hispanic homeowners sell their homes at a loss in order to escape onerous mortgages. The association has a new how-to guide: "The American Nightmare: Strategies for Preventing, Surviving and Overcoming Foreclosure."

—Louise Radnofsky contributed to this article.

Write to Susan Schmidt at susan.schmidt@wsj.com and Maurice Tamman at maurice.tamman@wsj.com

Well, I told you so.

My published articles are archived at iSteve.com -- Steve Sailer