Stephen W. Browne | Rants and Raves

Category: Science

December 2, 2015

The windows of the soul

I’m on a road trip in New Mexico right now, enjoying the incredible scenery, the mountain air, and occasionally NPR.

That’s how I caught a program on neuro-ophthalmology, which is something I’m growing more and more familiar with but only now have a name for.

My nine-year-old daughter is being treated with neuro-ophthalmology for amblyopia, “lazy eye.” Twice a week she attends eye therapy where she does various exercises, and every day she’s not in therapy I do four different kinds of exercises with her for an hour and fifteen minutes.

Exercises include reading letters at increasing distances with one eye, reading small print through different lenses, and watching television through colored filters.

I’ve watched her ability to read rapidly improve, and it’s improving her school work greatly. In a year or thereabouts she should have highly improved depth perception as well.

The National Public Radio program was an episode of The People’s Pharmacy called “What to do About a Ghost in Your Brain,” and if you are interested in such things I highly recommend you look it up.

A highly successful artificial intelligence researcher suffered what seemed to be a mild concussion when his car was rear-ended. For the next nine years his senses, perceptions and thoughts gave him weird and conflicting signals. He had problems with his balance, and once-easy tasks became almost impossibly difficult.

He recounted trying to figure out, by sheer force of will, what was behind a nagging feeling of something wrong. After a few hours of hard thinking with sweat pouring down his face he realized he’d put his shoes on the wrong feet!

After finding a therapy program he is almost completely recovered. And the fascinating thing is, a lot of the therapy he described sounds like the kind of thing my daughter is doing.

The eyes are an extension of the brain, and what goes in through them can alter the way the brain functions, helping it route around damage.

And by damage, they meant lesions so small they could not be detected by MRI or CAT scan.

There was much fascinating, and worrying, information about the effect of concussions from car accidents, falls, and sports injuries. For example, a minor concussion one doesn’t think much about can leave the brain more vulnerable to a later concussion, even much later.

The good news is, the damage is treatable and we’re learning more about how to treat it all the time.

And though it’s better to start treatment soon after the damage occurs, it can still be treated years later.

One of the scientists on the program tried to give an idea of the complexity of that organ where the mind resides. She suggested an order of complexity equivalent to 100 million personal computers. The truly amazing thing is that we are starting to get a handle on that complexity.

We stand at the beginning of an age of exploration that may be as important as the exploration of space.

Best of all, we have hope for those who have suffered the most feared loss of all, the loss of self.


April 26, 2015

Go see Ex Machina


Alex Garland’s Ex Machina is one of those movies that’s very hard to review without spoilers. It’s got a conclusion I kind of didn’t see coming, though I did guess some of the twists and turns before the end.

I can tell you that 1) yes it’s entertaining, and 2) yes it’s thought-provoking, if you have steeped yourself in the literature of the Singularity and the possibility of what we call “strong AI.” That is to say, artificial intelligence that can pass the Turing Test.

The Turing Test was proposed by mathematician and cybernetics pioneer Alan Turing, who is currently the subject of the movie “The Imitation Game.”

Turing suggested we’d know we’d created an intelligent being when a human could sit in a room with a teletype (this was a while ago) and exchange messages with a correspondent he couldn’t see. If at the end of a lengthy conversation he couldn’t tell whether there was a human or a machine at the other end, we’d have our answer.
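For readers who like to see the mechanics, here is a minimal sketch in Python of the blind exchange Turing described: a judge trades typed messages with an unseen correspondent and then has to guess. The class and function names, and the machine’s canned reply, are my own illustrative assumptions, not anything from Turing’s paper or from the film.

```python
# A minimal sketch of the blind exchange Turing described. Names and the
# canned machine reply are illustrative assumptions only.
import random

class Respondent:
    """Either a human at a terminal or a program; the judge never sees which."""
    def __init__(self, is_machine: bool):
        self.is_machine = is_machine

    def reply(self, message: str) -> str:
        if self.is_machine:
            # Stand-in for whatever text the program would generate.
            return "That's an interesting question. Why do you ask?"
        return input("(hidden human, type a reply) > ")

def imitation_game(questions, respondent: Respondent) -> bool:
    """Run the exchange, then ask the judge to guess. True if a machine fooled the judge."""
    for q in questions:
        print(f"JUDGE:  {q}")
        print(f"REPLY:  {respondent.reply(q)}")
    guess = input("Judge, is your correspondent 'human' or 'machine'? > ").strip().lower()
    return respondent.is_machine and guess == "human"

if __name__ == "__main__":
    # Randomly seat a human or a machine behind the "teletype."
    fooled = imitation_game(
        ["What do you do on a rainy afternoon?", "Describe your earliest memory."],
        Respondent(is_machine=random.choice([True, False])),
    )
    print("The machine passed." if fooled else "No pass this time.")
```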

At some indeterminate time in the near future Caleb (Domhnall Gleeson) is brought to the Alaskan hideaway of multi-billionaire genius Nathan (Oscar Isaac), who coded the world’s largest search engine when he was 13. It must have blown Google out of the water by then.

Nathan is experimenting with strong AI in the form of robots. Specifically one robot, Ava (Alicia Vikander), who appears partly as an attractive woman, the rest of her body made of transparent plastic that reveals the machine inside.

These are the only speaking parts in the movie. The only other character who interacts with the trio is Nathan’s beautiful mute servant/lover Kyoko (Sonoya Mizuno).

Caleb has ostensibly been brought to conduct a Turing Test on Ava. In a series of seven sessions he simply has to engage in dialog with Ava and at the end give an opinion as to whether she passes.

Of course there’s more to it than that. All the characters are engaged in various kinds of manipulation – and that’s part of the test too.

Nathan is a seriously unlikable character, and he’s also consciously aware that he may be creating humanity’s replacement. Does that make him a god of sorts – or just yesterday’s news in an unimaginable future?

How will the AIs of the future feel about us, and how much does that depend on how we treat them? (Hint: reread Frankenstein, the original by Mary Shelley not the Hammer films.)

Would an intelligent entity without glands and hormones have any kind of motivations we’d understand at all? Would the concept of gender mean anything to them? How about self-preservation?

Sam Goldwyn famously said, “If you want to send a message, use Western Union,” meaning message flicks are all too prone to become didactic, heavy-handed and boring.

Ex Machina does a pretty good job of letting the questions suggest themselves – if you’re the kind of person who thinks about that sort of thing. If not, the plot complications might still entertain.


October 29, 2013

Hubble telescope discovers oldest galaxy

z8_GND_5296 is not what you call a real exciting name, but the reality is exciting enough.

Observations by the Hubble Space Telescope and Keck Observatory in Hawaii have confirmed the galaxy with that unexciting moniker is the oldest and most distant in the universe found to date.

z8_GND_5296 formed within 700 million years after the Big Bang, which means the light we see from it has been traveling for about 95 percent of the age of the universe. It is about 13.1 billion light-years away, so we’re seeing it as it was 13.1 billion years ago, when it was young and generating new stars at a furious rate.
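Just to check the arithmetic, here is a back-of-the-envelope sketch in Python. It assumes the commonly cited figure of roughly 13.8 billion years for the age of the universe, which is my addition, not something stated in the article.

```python
# Back-of-the-envelope check of the figures above, assuming a rounded
# ~13.8-billion-year age of the universe (an assumption, not from the article).
AGE_OF_UNIVERSE_GYR = 13.8        # billions of years
FORMED_AFTER_BIG_BANG_GYR = 0.7   # galaxy formed within ~700 million years

lookback_gyr = AGE_OF_UNIVERSE_GYR - FORMED_AFTER_BIG_BANG_GYR
fraction = lookback_gyr / AGE_OF_UNIVERSE_GYR

print(f"Light travel time: ~{lookback_gyr:.1f} billion years")   # ~13.1
print(f"Fraction of the universe's age: ~{fraction:.0%}")        # ~95%
```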

The Hubble Space Telescope was launched into low Earth orbit in 1990. The idea for space-based telescopes has been around since at least 1923. The advantage of a space telescope versus ground-based astronomy is on Earth the atmosphere creates optical distortions we see as the twinkle in little stars, and absorbs much of the infrared and ultraviolet light.

After the Hubble was launched it was discovered that its mirror had been ground incorrectly and it wouldn’t perform at the optimal level expected. It still performed scientifically useful observations though, and in 1993 the first of four servicing missions conducted from the Space Shuttle repaired and upgraded the Hubble.

The things accomplished using the Hubble, though largely unheralded, are breathtaking. Because of the Hubble we now know the rate of expansion of the universe, a phenomenon first discovered by astronomer Edwin Hubble (1889-1953) for whom the telescope is named. And that’s only one of many breakthroughs in astrophysics that have come from Hubble observations.

The Hubble is expected to keep operating through at least 2014, maybe until 2020. Considering that the Mars Exploration Rover Opportunity, which landed on Mars on January 25, 2004, is still functioning under much greater environmental stress than the Hubble, that’s probably not too much to hope for.

A successor, the James Webb Space Telescope (JWST), is scheduled for launch in 2018. It will orbit far higher than the Hubble and benefit from all the advances in space hardware made since 1990. Given what the Hubble has accomplished, we can hardly imagine what will come from the JWST.

The Hubble’s cumulative costs were estimated at about $10 billion as of 2010. For the JWST they’re talking about $8 billion, but with the way the government works one might expect cost overruns.

Nevertheless I am reminded of something I once heard scientist and SciFi author Jerry Pournelle say at a presentation at Oklahoma University.

Pournelle observed that when the government gives lots of money to social scientists and educators, very often the results are either nothing or downright counterproductive. You give money to scientists and engineers and though there may be massive waste – at least you get something for it.

I wonder if future generations will look back and think we lived in the most exciting time in history, and never noticed it.

Note: Cross posted from my professional blog at the Marshall Independent.


May 20, 2013

Review: Home Run

Note: This appeared in the print-only TV Guide of the Marshall Independent.

I ran into this film quite by accident on Sunday. I was pressed for time, had to have something to review and I really didn’t want to see another iteration of “The Great Gatsby.” I’ve seen the Robert Redford version, a work of genius about people I don’t give a flip about.

I knew “Home Run” was about baseball, and addiction. “42” made me care about baseball, and some people I do give a flip about are in recovery.

Well, you’re not far into the movie before you notice it’s proselytizing for Christianity and promoting Celebrate Recovery, a program run by the Saddleback Church, founded in 1980 by Pastor Rick Warren in Lake Forest, California.

Celebrate Recovery was started in 1990 by Warren and Pastor John Baker in response to the various 12-step Anonymous programs. Warren and Baker have a similar approach but differ on two points. One, they bring all addictive behaviors, “hurts, habits, and hang-ups,” under one roof.

And, they felt the AA reference to a “higher power” was too vague and specifically center their reliance on Jesus Christ.

So aside from the message, and Frank Capra’s advice to use Western Union if you’ve got one, how does it stack up as entertainment?

Not bad actually, but it can make you more than a bit uncomfortable in spots.

Cory Brand (Scott Elrod) is a major league baseball player with a drinking problem. After publicly screwing up one too many times, accidentally bloodying the nose of Carlos the batboy (Juan Martinez) while throwing a tantrum, his agent Helene (Vivica A. Fox) sends him home to Okmulgee, Oklahoma to 1) do conspicuous good works, and 2) get into a 12-step program. CR is the only one he can find.

On the way home he crashes a car while tanked to the gills, putting his brother Clay (James Devoti) in the hospital in the process. Which fortuitously gives him the opportunity for well-publicized good works – taking over as coach for the Little League team.

First complication, the batboy plays on the team and is his brother’s adopted son.

Second complication, another coach, Emma Johnson (Dorian Brown), is the high school sweetheart he abandoned 10 years before when she got pregnant.

Third complication, their son Tyler (Charles Henry Wyson) is on the team and doesn’t know his idol is his father.

The movie starts with a flashback. A bucolic farm scene segues into Cory’s abusive alcoholic father making him practice batting. Dad was a player who never made it past the minor leagues.

Flash forward. Cory is a seriously unlikable person. He’s got the manners, morals and habits of a spoiled six-year-old. He screws up everything for himself and the people who care about him, and won’t acknowledge any responsibility for it.

Watching him can make you squirm in your seat. They’ve got addictive behavior down.

He’s got just two things going for him. One is that he can really hit a ball. The other, it turns out, is that he has a real gift for coaching kids.

Of course the film is about his literal come-to-Jesus moment, brought about by a combination of things. One of them is learning that the sister-in-law (Nicole Leigh) he’d contemptuously dismissed as having no problems greater than a clogged sink had a childhood rough enough to make him ashamed of his whining. They show this with a brilliantly simple visual involving no special effects magic.

Drawbacks. This is a film about addiction, which they attribute to childhood trauma passed down through generations. But lots more people have rotten childhoods than ever become addicted to substances or self-destructive behavior.

Conversely, lots of addicts have nothing to complain about in their childhoods.

Trauma, especially early in life, no doubt contributes to addiction, but there are other issues as well. Is there such a thing as an addictive personality? If so, is it biochemical in nature, and more to the point is it hereditary?

Is Cory a lush because his dad was abusive, or were they both lushes and prone to be mean when drunk because of their heredity?

And here’s where it gets very interesting. The open profession of faith makes a lot of people uncomfortable these days, but the fact is faith-based programs have comparatively high success rates. If addiction is a biochemical weakness, the “self-medication theory,” then what if living with it involves strong, single-minded belief in… something.

If religion isn’t something you go to the movies for, this isn’t your cup of tea. If you have dealt with addiction, or know someone who has, it might be interesting.

What’s also interesting is that, for an indie movie with competent acting and good visual composition, it cost only $1.2 million to make. Which it made back, with change, in its opening weekend.


Note: Cross-posted from my blog at The Marshall Independent.

A couple of weeks ago Google hired Ray Kurzweil to be its new director of engineering.

Google is best-known for its Internet search engine. In fact, the chances are good if you were looking into Kurzweil you might have been sent here by Google.

It’s unclear to me as yet what Kurzweil was hired to do at Google, but he is a pioneer in speech recognition technology, so it might have something to do with developing ways to ditch this keyboard and just tell your computer what you want it to do.

Uh, somehow I don’t think that’s going to make work around the office any easier. I already get weird looks because I like to compose aloud.

But what Kurzweil is best known for are the ideas he laid out in his book “The Singularity Is Near.” Kurzweil thinks that in the not-so-distant future we’re going to have Artificial Intelligence, immortality, and powers and abilities that would seem godlike to us now.

The Technological Singularity is an idea that’s been kicking around for a while. The term was popularized by mathematician and SF author Vernor Vinge. Singularitarians can be classified with what’s called the Transhumanist Movement, which holds that we can and should transcend the limits of our biology and become more than human, superhuman.

The idea is that at a certain point in history, possibly within our lifetimes, our technology will have advanced so far that it is literally impossible to predict anything beyond that point. The analogy is with the physical singularity created by a black hole, from which no information can escape.

Of course, humans being what we are, that doesn’t prevent us from trying to picture what might lie on the other side of the Singularity.

There are a number of theoretical ways of reaching the Singularity, either gradually or in one breakthrough leap, but the standard model has it that the Singularity happens when our machines get smarter than we are.
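To make that standard model concrete, here is a toy extrapolation in Python of the Moore’s-law-style reasoning Singularitarians lean on. The starting capacity, doubling period, and “human-equivalent” threshold are made-up illustrative numbers, not figures from Kurzweil or from this post.

```python
# Toy illustration of the exponential-growth reasoning behind the
# "machines get smarter than we are" scenario. All numbers below are
# illustrative assumptions only.
from math import log2

def years_until_threshold(current: float, target: float, doubling_years: float) -> float:
    """Years until a capacity that doubles every `doubling_years` grows from `current` to `target`."""
    return doubling_years * log2(target / current)

# Suppose machine capability is one millionth of "human equivalent" today
# and doubles every two years (both numbers made up for illustration).
print(f"Threshold crossed in ~{years_until_threshold(1e-6, 1.0, 2.0):.0f} years")  # ~40
```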

This is a prospect that is both exciting and kind of scary. If and when our machines become self-aware, who’s to know they might not say, “Hey thanks! Now who needs you?”

But if the Singularity is a scary idea, the alternative is just depressing: the Age of Failed Expectations.

That’s the notion that our knowledge and technology is approaching inherent limits and progress will start to slow down and eventually become almost static. We’ll see modest increases in human lifespan and not much more. Our laptops will have much more number crunching capacity, but we won’t be discussing the meaning of life with them. There’ll be an ever-increasing amount of stuff on the Internet, but most of it will be drivel and sorting through it will be as time-consuming and unproductive as the time you spend on Facebook.

I’m a technological optimist. Like William Faulkner, “I believe that man will not merely endure. He will prevail.”

But I have to admit, that’s my nature as an optimist. I believe it because I prefer to, not because I have any evidence either way.

And that’s the nature of the Singularity, the only way we’re going to find out is by living through it.

But I’ll be waiting to see what comes out of Google, now Kurzweil’s there.


Note: cross-posted from my newspaper blog at The Marshall Independent.

I just came across a fascinating article on a device currently in development that might have kept the New York subway tunnels from flooding. (Well, fascinating for infrastructure geeks like me that is.)

“In all, seven New York subway tunnels and two commuter train tunnels flooded during Monday’s record flooding. Some of the tunnels were flooded from track to ceiling and “it is still too early to say how long it will take to restore the system to full service,” the Metropolitan Transit Authority, which operates the rail systems, said Wednesday.”

The device is basically a big inflatable balloon plug, and the idea was originally to protect tunnels from terrorist gas or firebomb attacks. Tests have been conducted with high-pressure water though, proving it would be effective in flood emergencies.

The fascinating thing to me is, I happen to know that this has been done before. To be precise, during World War II.

My son’s late godmother, and my daughter’s namesake, was an Englishwoman named Judith Hatton. She was, among other things, the widow of a Russian spy from the KGB department known as SMERSH (“Smert shpionam,” or “Death to spies”) that James Bond used to tangle with – and that’s not even the most interesting thing about her.

During WWII she was the youngest censor at the BBC. Her father was an engineer who helped develop a way to protect the London subway tunnels from disastrous flooding.

During the Blitz this was a serious worry. Literally tens of thousands of Londoners slept in the subway stations, which were used as bomb shelters. The danger was that three tunnels go under the Thames River. The Luftwaffe used to drop sticks of bombs on the river, hoping to rupture one of the tunnels, which would have flooded most or all of the system, causing huge loss of life.

The solution was to install gates at either end of the tunnels under the river. I’m not sure, but I believe they were drop gates that could be slammed shut in seconds if needed.

Of course, if there were trains in transit under the river… The term in medicine is “triage.”

Judith was actually in a train in transit under the Thames during an air raid. Evidently during raids the tube trains would stop moving for the duration. According to Judith, people were cheerful and brave, telling jokes, sharing smokes and having a jolly good time in the very English camaraderie of tough times.

She told me once she actually considered telling people about the gates on either side of them ready to drop if the tunnel ruptured, but then just shrugged and thought, “Oh why spoil the fun since there’s nothing we can do about it anyway?”


September 12, 2012

Deadly decisions

Note: Cross-posted from my newspaper blog at The Marshall Independent, which references an article “Deadly force decisions.”

Researching and writing the article “Deadly force decisions” for Monday’s paper was the most intense experience I’ve had at the Independent to date – and that includes donning a harness and climbing a 70-foot ladder to the top of the MERIT Center wind tower simulator.

Though the article ran more than 900 words, I could easily have made it twice as long. Because what I didn’t include was that I got to try a few simulations myself.

Trainer Matt Loeslie let me try the LASERSHOT simulator out with a pistol during a break between officer trainings.

Let me explain: though I haven’t had much to do with firearms for some years, I’m passing familiar with them. I am not a stranger to interpersonal violence in odd parts of the world, and I have seen violent death.

Years ago I was within earshot of a deadly force encounter in Oklahoma and clearly remember the sequence of shots. And I remember how, many years ago, a certain idiot youth and friend darn near got shot doing something stupid that scared an officer near a crime scene under low-light conditions.

The police of course, are pros and have extensive training in these kinds of scenarios. The simulator is designed to bring an element of realism that gun ranges can’t have. I asked if the simulator induces stress. Some officers said it can sometimes. One joked that media presence was a great stress provider.

Then my turn came. I left shaken. I’m still a little shaken.

One scenario: a man in an apartment hallway holding a knife to a woman’s throat screaming he was going to kill her. There were bystanders in the narrow space.

After trying out the commands to drop the knife, just like I’d seen the officers do, I took the shot at the suspect’s head. Replay showed I got him. I very possibly grazed the victim, but certainly saved her life.

Loeslie complimented my shooting, and asked gently if I could have taken the shot earlier.

Could have, and should have. The fact is I was caught up in it and did not want anyone to die.

Lesson learned: under American law an officer’s duty is to protect life, including the life of a suspect. But sometimes the choice is forced upon them of who is going to die.

The last scenario I tried was a prolonged horror: answering a call to a high school where a “hit list” was found in a student’s locker. A pretty young girl is summoned out of class and asked to explain.

Everything went south from there, and I probably did pretty much everything wrong.

She goes to her locker, ignoring all commands to stay where she is. She opens it and, ignoring commands to show her hands, pulls out a cell phone. I shot.

The scenario should have ended there, but I think Loeslie had stepped out of the room and the scenario kept playing. (Fact is, I don’t know. It was that absorbing.)

The girl calls her mother and has a very disturbed conversation.

Then she puts the cell phone back, ignoring continued commands to be still, and this time pulls out a gun.

Ignoring commands to put it down, she then puts it to her head and finally pulls the trigger.

Startled by the sound of the shot, I shot her again as she fell. It couldn’t have made her any deader, but I was horrified.

Later, with the benefit of 20/20 hindsight, it occurred to me that I should have had a Taser rather than a pistol. I didn’t have the Taser simulator, but I could have said “Taser!” to indicate a transition from pistol to Taser. I should NEVER have let the girl open the locker, and perhaps should have restrained her physically or even Tasered her.

Yes, I know how that would have looked in the press if a gun hadn’t been found in the locker.

I draw two conclusions from this experience.

One is that I am VERY glad the technology for this kind of training exists. This is not the kind of decision-making skill one wants officers to learn “on the job.”

The other is, I think every journalist who covers the police beat should try out this training.

And it wouldn’t hurt the general population to see some of these scenarios either. Especially those involving traffic stops, low-light conditions, etc.


March 1, 2007

Am I a Neanderthal?

Wonderful news! My pet crackpot theory may be viable after all.

Some time ago in my post “Can you think?” http://rantsand.blogspot.com/2006/10/can-you-think.html I posed this question:

“How often have you, after examining the evidence reached a conclusion that was uncomfortable, unsettling or profoundly disturbing to you, i.e. reached a conclusion that you did not like?”

Once after posing this question, someone asked me, “So what was a conclusion that really disturbed you?”

“Neanderthals” I said.

“Huh? What’s disturbing about Neanderthals? They’ve all been dead for about 24 thousand years.”

Exactly. They’re dead, all of them.

If you scroll down to “Racism, some questions” you’ll see that I mentioned a position on race held by some anthropologists: that the only subdivision of humanity with characteristics different enough to be called a “race” were the Neanderthals.* These people classify them as Homo sapiens neanderthalensis, and contend that after the end of the last Ice Age they rebred into the main human line and the most extreme characteristics largely disappeared.

The counter theory is that Neanderthals were a separate species that became extinct and contributed little or nothing to the modern human gene pool. This theory classifies them as Homo neanderthalensis.

The weight of evidence has see-sawed back and forth on this one. The early image of Neanderthals as shuffling primitives was revised when it was determined that they really didn’t look all that different from us. (Apparently the first complete skeleton discovered was that of an individual with rickets and arthritis, and was taken as typical.)

The problem is (as my osteology professor put it) that bone is very plastic and reshapes in response to environmental conditions. This makes it a very poor repository of genetic information. (In the American southwest, archeologists were puzzled for years that the Basketmaker culture showed a strong continuity of material culture after skull shapes changed abruptly. Some were playing with theories of an invading people who wiped out the locals and took over their material culture whole, until somebody finally noticed that around this time they had changed cradleboards from a soft to a hard design – thus changing the skull shapes of their children.)

Almost nothing is known about soft tissue characteristics of Neanderthals, only differences in bone structure: shin bones more round in cross section than in modern humans, no real chin to speak of, heavy brow ridges and a pronounced sloping forehead. That and a brain case on average 300-500 CCs larger than the modern average – and everybody is still wondering what the heck that might mean.

Nothing is known about skin, hair or eye color, though popular depictions in the past have tended to show Neanderthals as dark, compared to lighter Cro-Magnons.

It is a popular sport in some circles to find evidence of unconscious racism everywhere, but this looks like the real thing. Neanderthals lived in Europe, Anatolia and the Middle East during the last Ice Age. White skin is a cold climate adaptation. It is far more likely that the Neanderthals were light-pigmented and the early Cro-magnons darker.

That was the genesis of my pet theory. I thought Neanderthals might be us – I mean white Europeans. If they rebred into the main human line, perhaps the surviving characteristics in modern populations might be white skin and light-colored eyes. (That and some of the heavy skull bones you see in Germans and Scandinavians.)**

Then along came fossil DNA analysis, and the conclusion was that they were indeed a separate species who died out.

This is profoundly disturbing. Neanderthals made tools and interred their dead with grave goods, there was even something that looks a lot like a bone flute discovered in one site.

Let that possibility sink in for a moment. An intelligent species who modified their environment and cared for their dead, like us, died out and left nothing behind but bones and tools, and the knowledge that a people who lived, loved, and wondered about the universe are gone forever. What does that mean about us? Could we wind up on the list of extinct species someday? Except that unless there is a successor species, there won’t be any lists kept.

Opponents of this theory countered that there have been skeletons found that appear to be transitional types, or hybrids. Finnish paleontologist Bjorn Kurten advanced a theory that would explain this. According to his theory, Homo sapiens and Neanderthals interbred – but “muled out” and produced sterile offspring. (He popularized this in a novel, Dance of the Tiger http://www.amazon.com/Dance-Tiger-Novel-Ice-Age/dp/0520202775/sr=8-1/qid=1172767738/ref=pd_bbs_sr_1/103-2995162-4924614?ie=UTF8&s=books and a sequel.)

However, now the evidence has swung the other way and the question has been declared resolved – again. Neanderthals were human, and the variation in DNA is attributed to the claim that we are a species with an unusually narrow range of genetic differences.

So we’re off the hook then? Intelligent species don’t commonly become extinct after all?

Maybe not. One reason advanced for why we have such a narrow genetic range as a species is that at one time the whole ancestral human population declined to as few as 10,000 individuals, perhaps due to a super-volcano disaster. (Which by the way, was not due to the industrial West destroying the ecology.)

Any way you look at it, and whatever we may discover about the history of the human species, we are reminded that a planet that can support life in all the rich diversity we find on Earth is a catastrophe planet.

************************************************************************************
Neanderthal early on became a metaphor for “A crude, boorish, or slow-witted person” (American Heritage Dictionary online) or an aggressive, violent person. Once on a libertarian website I read a comment wherein someone stated, “Do not aggress against others, and you will not be aggressed against.” (What an incredibly tortured construction!) I had to comment to the effect, “With respect, what planet are you living on? Unprovoked aggression is one of the constants of human history.” To which the individual replied, “Neanderthal! Think like a Neanderthal and go the way of the Neanderthals.”

This was wonderfully ironic, because there is nothing in the archaeological record that suggests that Neanderthals were more aggressive and warlike than their successors. If anything, it is more likely that they lacked the degree of social organization that permitted war-making capabilities and thus were not able to resist their food sources being encroached upon by the new neighbors. (There is no evidence that they were wiped out by violence, but some speculation that they were simply out-competed by a people more efficient at exploiting the food resources.)

* OK, I don’t think I can get into too much trouble for this. My opinion, for what it’s worth, is that this is semantic hair-splitting. Consider two extremes: species such as cheetahs or Everglades cougars, which have so little genetic variation that their members are virtually clones, and species such as the dog tribe, in which the variation is bewildering. (All breeds of domestic dogs, wolves and coyotes are members of the same species by the primary definition – they can interbreed and produce fertile offspring.) From this perspective, “race” is a phenomenon that occurs very weakly in humanity, very strongly in the canine species.

** One Michael Bradley has a theory he advanced in his book, The Iceman Inheritance: Prehistoric Sources of Western Man’s Racism, Sexism and Aggression.
http://www.amazon.com/Iceman-Inheritance-Prehistoric-Sources-Aggression/dp/1879831007/sr=1-2/qid=1172768420/ref=pd_bbs_2/103-2995162-4924614?ie=UTF8&s=books

He claims that white people are uniquely racist, sexist and aggressive due to what he calls their “Neanderthal-Caucasoid” ancestry. On reading this my first impulse was to list all of the white nations that have historically displayed high levels of aggression, racism and sexism: the white Mongols, the white Zulus, the white Japanese, the white Sioux Indians…

My second was to say, “Inherently more aggressive? Then get the #$%& out of the way, wimp!”

