Formerly, the CORI Bulletin for Members of Central Ohioans for Rational Inquiry.
Edited By Rev. Art - Cheerleader For Science
YOURS FOR BOLOGNA DETECTION SINCE 1996
“What you think you know may not be so.”
QUOTE / UNQUOTE
"From my rotting body,
flowers shall grow and I am in them and that is eternity." [Edvard Munch, Artist]
"When
we are planning for posterity, we ought to remember that virtue is not
hereditary." [Citizen Thomas Paine]
"One problem with gazing too frequently into the past is that we may
turn around to find the future has run out on us." [Michael Cibenko]
"Every sentence that I utter must be understood not as an affirmation,
but as a question." [Niels Bohr]
LOVING THE LIMELIGHT
Quick with a good quote, some professors
have a second career on the small screen
By JENNIFER JACOBSON - in the Chronicle of Higher Education
Section: The Faculty
Volume 52, Issue 33, Page A12
During the Monica Lewinsky scandal in the late 1990s, Cass R. Sunstein, a law professor at the University of Chicago, appeared on television regularly to argue that impeaching President Bill Clinton was wrong.
Then he got sick of it. He was bored with the
cameras, sitting in the studio had lost its novelty, and, to top it off, his
earpiece kept falling out. So after CNN asked him to appear yet again, he said
he would agree only on one condition: that his dog join him on the air.
The network agreed. During the commercial
break, the phones were ringing off the hook, Mr. Sunstein recalls. Viewers
wanted to know where they could buy a dog like Perry, Mr. Sunstein's Rhodesian
Ridgeback. "He was a big TV star," says Mr. Sunstein. The experience,
he says, was "the highlight of my television career."
Few scholars, of course, ever have a
television career. Most professors never get calls from producers asking them
to talk about their areas of expertise for an audience of millions. But
academics like Mr. Sunstein; Diane Ravitch, an education professor at New York
University; Michael Eric Dyson, a professor of religious studies at the University
of Pennsylvania; and Pepper Schwartz, a sociologist at the University of
Washington, have made it onto producers' Palm Pilots because they have written
popular books and well-read opinion columns, can look good on camera, and offer
great quotes.
While university administrators applaud
professors who engage in public debate, they do acknowledge that by going on
television, academics risk appearing less scholarly to their colleagues. Fellow
faculty members may criticize them for oversimplifying complex ideas, yet those
colleagues may also envy their success in the popular culture.
Public intellectuals dispute the idea that
television appearances sully their scholarly reputations. After all, it is
their scholarship, they say, that prompted them to write opinion columns that
caught the eye of television producers in the first place. More importantly,
they say, they have a professional responsibility to share information with the
public and policy makers, not just their peers in academe.
Regardless of professors' personal sense of
public service, television shows keep coming back to them for one reason: their
ability, rare among academics, to speak in sound bites rather than paragraphs.
"You're not giving a lecture on Foucault for a bunch of French theorists,"
Mr. Dyson says. "You've got five minutes to hit it or quit it."
Minute by Minute
Professorial pundits tend to have a clear
idea of where they can "hit it." Many of them have been appearing on
television for more than 20 years and recognize the best formats for sharing
their views.
"I only go on shows that are news shows,
not Maury Povich," says Arthur Caplan, director of the Center for
Bioethics, at Penn.
That way, Mr. Caplan says, he can stick to his core
issues. These include organ donation and in vitro fertilization, which he has
been writing about since the 1980s.
A show like Dr. Phil, which has
invited Mr. Caplan a couple of times, seems to be more focused on advice and
self-help, he says. So he has said no. And "that lady that's always
hectoring people," he says, referring to Laura C. Schlessinger, "I
wouldn't do that either."
Instead, he prefers to appear on 60
Minutes or shows on CNN. Although he has only a few minutes to speak, Mr.
Caplan says he has learned to use television to spread messages. "On any
single appearance, I can't say everything I want," he admits. But if a
show has him on several times, he can repeat his message and build on it with
each appearance.
Mr. Caplan has plenty of opportunities to do
just that. In a busy week, he fields five or six television interviews. Other
weeks he might do only one. He does not like to take the time to travel for
such appearances, so he films them in local studios.
He makes sure to bring his own compact with
face powder and a custom-made earpiece. The earpiece he bought for $30 from a
hearing-aid company, which sent him a plaster-of-paris kit to mold to his ear.
He wanted his own because earpieces often fall out, and they are dirty. He uses
the compact, which his wife picked out for him, "to powder my nose if it's
shiny."
On television-free days, Diane Ravitch
doesn't wear much makeup — no eyeliner, eye shadow, or mascara. Some days,
she says, she does not even apply lipstick.
She is not a fan of getting made up for
television; in fact, she says "that's the worst part." There is,
however, an upside to it, Ms. Ravitch says: The skillfully applied products
make her look 20 years younger — for three minutes.
She realizes that is not a lot of time to
share her views with the public. But it is a chance to reach a national
audience, she says. Besides, most Americans get their news from television.
"So if you can say something that's educational and valuable for them to
hear," she says, "that's more than they'll hear for the rest of the
day."
The most fun she had on television was on The
Daily Show With Jon Stewart, she says, where she appeared after she wrote The
Language Police: How Pressure Groups Restrict What Students Learn. (The
book, published in 2003, is about how both the left and the right restrict phrasing
and content in textbooks and standardized exams.) Ms. Ravitch admits she was
nervous before going on. It was "a little bit intimidating," she
says. "I thought 'Oh, gosh. I'm so much older than the people in the
studio audience.'" Mr. Stewart, though, put her at ease. "He said
'Don't worry about being funny,'" she recalls. "'That's my
job.'"
When Mr. Sunstein appeared with his dog on
Greta Van Susteren's Burden of Proof in December 1998, he was not trying
to be funny. He just wanted a change. He persuaded the network to allow him to
do a show on the "associated legal issues" when an airline loses your
dog at the airport. Mr. Sunstein recalls that Ms. Van Susteren also brought her
German shepherd.
While that experience went off without a
hitch, Mr. Sunstein says his lowlight was when he appeared on The NewsHour
With Jim Lehrer to discuss the controversy over the 2000 presidential
election. The earpiece malfunctioned so that every word he said was repeated in
his ear. It "was a real challenge to see if I could talk," he says.
Plus, the earpiece kept falling out. Upon
learning that Mr. Caplan wears one specially designed to fit his ear, Mr.
Sunstein seems intrigued. "That's very clever," he says. "I
should do that."
Risk and Reward
Presidents and provosts commend professors
for going on camera. "I think it's wonderful that Penn has a way to get
out the message that our work matters in the world," says Rebecca W.
Bushnell, the university's dean of arts and sciences.
Mark A. Emmert, president of the University of Washington, agrees. When someone like Pepper Schwartz talks about academic
research in a popularized format, "it gives people a little window into
what we do and how it relates to their lives," he says. "It
demystifies some of the academy. That's incredibly useful."
But he does acknowledge that faculty members
who engage in these public conversations risk being taken less seriously. They
may, he says, experience "the Carl Sagan effect." Mr. Emmert is
referring to the famous Cornell University professor and Pulitzer Prize-winner who faced
criticism from some of his colleagues for becoming the face and voice of
astronomy and ultimately a celebrity.
"It's important for university leaders
to make it clear to the university community that we value that engagement by
our faculty," he says. "While faculty do have different roles to
play, the role of the popularizer of scholarship is very important."
Not everyone in academe shares that view.
Judith A. Howard, a professor of sociology and a social sciences dean at the University of Washington, contends that Ms. Schwartz's career has suffered in some ways because
of her television appearances, mainly in terms of the esteem in which her work
is held.
"I don't mean to suggest she hasn't had
a very successful career," Ms. Howard says of her colleague, who focuses
on gender, family, and human sexuality issues. But "there's a way in which
some academics don't respect people who have that kind of relationship with the
media."
She contends, however, that that attitude is
beginning to change. For instance, she notes that Ms. Schwartz received a major
award last year from the American Sociological Association for contributing to
the public's understanding of sociology.
Ms. Schwartz says she was touched by the
honor, especially since her areas of study, unlike race and class, are not
traditionally rewarded in the discipline. But she does admit that television
appearances can make a professor seem less scholarly. "Sometimes you're
talking about a topic that's hardly about world peace," she says. "I
do that because it's fun. I don't pretend that it's world peace."
She cites her appearance on the Today
show a couple of months ago as a prime example. She was there to give
"some perspective" on whether it was a good idea for women to get
back together with their ex-boyfriends, she says.
It is unlikely her colleagues caught the
segment. Gary Hamilton, a sociology professor, says he has never watched Ms.
Schwartz on television. He is not even sure when she is appearing on various
shows, he says: "It's not announced in the department." Mr. Hamilton
used to have the office right across the hall from Ms. Schwartz's and a
front-row seat to watch the camera crews come and go. "I used to chuckle
about that," he says, "but I was never jealous."
Douglas Baird, a law professor at the University of Chicago with Mr. Sunstein, does not envy his better-known colleagues, either.
"If I were as smart as Cass," he says, "I'd be jealous."
Besides, in an intense academic environment like
Chicago, he contends, "you're only as good as your
latest academic article." Mr. Sunstein is not shirking his
responsibilities as a professor, since he continues to teach large classes and
publish books, Mr. Baird says. "It's a little hard to say he shouldn't be
on television, too."
Resentment
But some public intellectuals say their
colleagues do resent their status as pundits.
"'I can't believe they go back to you
again and again,'" Mr. Caplan says some will complain. Or a colleague will
tell him that another professor says he is "hogging the media."
Most of the criticism is not about substance,
he says. "It's about appearance: 'I don't like the fact that you're on a
lot.'"
Wonbo Woo, a producer at ABC's World News
Tonight, says that he and his colleagues look for professors who can play
the role of outside observer and succinctly fill in the gaps inherent in TV. A
long television piece on his show runs about two and a half minutes, he says:
"That's not a lot of time to tell a story."
Mr. Sunstein does a great job, Mr. Woo says.
The constitutional law scholar is one of roughly 500 academic contacts Mr. Woo
keeps in his PDA. The producer says he often calls Mr. Sunstein because he is
"ideologically balanced" and "not easy to dismiss." The
professor, he says, has a very thoughtful approach to talking about the law to
an audience that does not necessarily understand it.
Besides his ability to simplify his subject,
Mr. Woo also appreciates that Mr. Sunstein readily suggests other academics the
producer should contact if he thinks they may be more knowledgeable about a
certain topic. Ms. Schwartz says she does the same thing. She often recommends
colleagues "to get all the best information presented" and because
she is sometimes overwhelmed with requests.
For junior professors, Mr. Caplan says, there
can be downsides to accepting such invitations. Inexperienced young professors
can say things that alienate senior professors or the board of trustees. And
they can also be seen as less scholarly just because they have been on
television a few times. "You have to weigh those things," he says.
"My attitude is ... I'll put my CV up against anyone."
Larry J. Sabato agrees. The professor of
politics at the University of Virginia says it is perfectly OK for academics to be public intellectuals as
long as they are also private ones. "You do your teaching, publish your
books, you perform your public service," he says. "That comes
first."
Now that he has taught at the university for
nearly 30 years, Mr. Sabato says, he does not encounter jealous colleagues. But
he did early on. "You've got a lot of assistant professors," he says.
"Everyone's coming up for tenure. They're doing their books. They're
wondering why is this guy going on and not me."
But that is up to the producers, Mr. Sabato
says: "I don't call them. They call me."
The producers call Michael Eric Dyson every
week. The professor of religious studies and humanities, known for his books on
African-American perspectives, like his recently published Come Hell or High
Water: Hurricane Katrina and the Color of Disaster, says his colleagues at
the University of Pennsylvania frequently comment on his appearances. And a lot of
them are proud of him. But some professors, he says, take issue with the fact
that such appearances "bleed" beyond the sanctified borders of
academe. The critics seem to believe, he says, that if more than five people
outside of your discipline read your work then you are not scholarly enough.
Never mind that he has written 13 books in as many years.
Mr. Dyson has been going on television for
nearly two decades and is typically asked to comment on race, class, and
culture. He rattles off some highlights from his television CV: "The Today
show, Nightline, Rap City on BET. I've been on so many. The O'Reilly Factor, Hannity and
Colmes, the Tavis Smiley show." It is quite a list. But he says
he is not going to say yes to every request when he is working on a book 15
hours a day: "I'm not a media whore."
ALPHABETS ARE AS SIMPLE AS…
BY ROGER HIGHFIELD in The Telegraph, U.K.
Writing systems may look very different, but they all use the same basic building blocks of familiar natural shapes.
If
there is one quality that marks out the scientific mind, it is an unquenchable
curiosity. Even when it comes to things that are everyday and so familiar they
seem beyond question, scientists see puzzles and mysteries.
*Familiar form: letters have been shaped by everyday sights such as the crescent moon*
Look at the letters in the words
of this sentence, for example. Why are they shaped the way that they are? Why
did we come up with As, Ms and Zs and the other characters of the alphabet? And
is there any underlying similarity between the many kinds of alphabet used on
the planet?
To find out, scientists have
pooled the common features of 100 different writing systems, including true
alphabets such as Cyrillic, Korean Hangul and our own; so-called abjads that
include Arabic and others that only use characters for consonants; Sanskrit,
Tamil and other "abugidas", which use characters for consonants and
accents for vowels; and Japanese and other syllabaries, which use symbols that approximate the syllables that make up words.
Remarkably, the study has
concluded that the letters we use can be viewed as a mirror of the features of
the natural world, from trees and mountains to meandering streams and urban
cityscapes.
The shapes of letters are not
dictated by the ease of writing them, economy of pen strokes and so on, but
their underlying familiarity and the ease of recognising them. We use certain
letters because our brains are particularly good at seeing them, even if our
hands find it hard to write them down. In turn, we are good at seeing certain
shapes because they reflect common facets of the natural world.
This, the underlying logic of
letters, will be explored next month in The American Naturalist, by Mark
Changizi, Qiang Zhang, Hao Ye, and Shinsuke Shimojo from the California
Institute of Technology in Pasadena.
The analysis is simplistic but, none the less, offers an intriguing glimpse
into why we tend to prefer some shapes over others when we communicate by
writing.
The team set out to explore the
idea that the visual signs we use have been selected, whatever the culture, to
reflect common contours, landscapes and shapes in natural scenes that human
brains have evolved to be good at seeing.
"Writing should look like
nature, in a way," said Dr Changizi, explaining how similar reasoning has
been used to explain the sounds, signs and colours that animals, insects and so
on use to tell each other they are, for example, receptive to sex.
To be able to compare Cyrillic,
Arabic or whatever, they turned to the mathematics of topology, which focuses
on the way elements are connected together in a letter rather than overall
shape, so that fonts do not matter and nor does handwriting, whether neat
calligraphy or crudely written with a crayon grasped in a clenched fist.
For example, each time you see a
T, geometrical features and frills such as serifs may differ according to the
font or handwriting but the topology remains the same. By the same token, L, T,
and X represent the three topologically distinct configurations that can be
built with exactly two segments. And, to a topological mind, an L is the same
as a V. In this way, the team could classify different configurations of
strokes, or segments, to boil an alphabet of alphabets down to their
essentials.
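To make the topological idea concrete, here is a minimal sketch in Python (ours, not the researchers'; the end/interior encoding and the function name are illustrative assumptions) showing why exactly two segments can join in only three distinct ways:

```python
# Classify how two line segments join, ignoring angle, length, and font:
# topologically, only where the segments touch each other matters.

def junction_type(touch_a: str, touch_b: str) -> str:
    """touch_a / touch_b: is each segment met at an 'end' or in its 'interior'?"""
    kinds = {touch_a, touch_b}
    if kinds == {"end"}:
        return "L"  # two endpoints joined (so a V counts as the "same" letter)
    if kinds == {"end", "interior"}:
        return "T"  # one endpoint abuts the other segment's middle
    return "X"      # the two interiors cross

# Every two-segment character reduces to one of three configurations:
for a in ("end", "interior"):
    for b in ("end", "interior"):
        print(f"{a:8} + {b:8} -> {junction_type(a, b)}")
```

Serifs, curves, and handwriting quirks change none of these answers, which is what lets the team compare alphabets across cultures.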
Across 115 writing systems to
emerge over human history, varying in number of characters from about 10 to
200, the average number of strokes per character is approximately three and
does not appear to vary with the size of the writing system. Sticking to letters that can be drawn with three strokes or fewer, the team found that the universe of possible letters in a theoretical alphabet contains about 36 distinct characters.
Remarkably, the study revealed
regularities in the distribution of (topological) shapes across approximately
100 phonemic (non-logographic) writing systems, where characters stand for
sounds, and across symbols. "Whether you use Chinese or physics symbols,
the shapes that are common in one are common in the others," said Dr
Changizi.
For comparison, the team studied
the shapes found in the real world, such as the Y shapes seen at the corner of
a cube, or the simpler L and T shapes found in the branches of trees, yurts,
huts, tepees and simple dwellings and so on.
They analysed the frequency of
the shapes in 27 photographs of savannas and tribal life, 40 miscellaneous
photographs of rural and small-town life and 40 computer-generated images of
buildings. Much to their surprise, whether they analysed the shapes in an urban landscape or those in a leafy wilderness, they found very similar distributions of configurations and shapes.
Most striking of all, the team
found a high correlation between the most common contour combinations found in
nature and the most common contours found in letters and symbols across
cultures. For example, contours resembling an "L" or "X"
are more common in both human visual signs and natural scenes than anything
resembling an asterisk (*).
When the popularity of each shape
was plotted, a wiggly curve emerged that closely matched that of the popularity
of the forms and architectures found in nature: the most common letter shapes
mirrored common real-world shapes.
As a check that they had found
something truly significant, they looked at the distribution of shapes found in
trademark symbols. Once again, the shapes followed the same plot, suggesting that it is looks that matter, as one would expect for a logo, not ease of writing.
The idealised flower used by BP may be hard to write but is easy to recognise
because it mirrors a natural shape.
For comparison, they applied the
same analysis to the shapes found in the scribbles of children and six kinds of
shorthand, where it is ease of writing that is paramount. There, the distribution of shapes did not match the one found in nature: the easiest shapes to scribble were not the most common. Thus, the reason the letters of the alphabet are
shaped as they are is to be in harmony with the mental machinery we have
evolved to analyse the patterns of the natural world, not for ease of writing,
said Dr Changizi.
"Vertebrates have evolved
for tens of millions of years with their visual systems having to be good at
recognising the configurations that are common out there in nature," he
said. "We don't have really good mechanisms for recognising shapes that
don't often occur in nature." As a result, letters and symbols based on
rare natural shapes are themselves rarities.
Given how the distribution of
features in our world is so similar, whether from an urban or a rural
environment, the team would not expect writing systems that evolved among
peoples who lived in desert regions to differ much from those of tribes in
tropical rainforests. Nor does Dr Changizi expect keyboards to have much impact:
"Despite the growth in the number of fonts, almost none of which is
written by hand any more, they appear to possess the same shapes as they always
did."
There is a cosmic dimension to
this study. Dr Changizi speculates that if there is intelligent alien life in
the universe, then so long as these creatures live, like us, among
"macroscopic opaque objects strewn about", they will evolve writing
symbols like our own. Alphabets on a planet orbiting another sun will, if
materials, light and shade are similar to our own world, have features in
common with those used on Earth: if ET writes home, we may think there is
something familiar about his handwriting.
THE GREAT TRANSITION - BY NEIL SHUBIN
in Edge - The Third Culture
Every human handshake echoes the Devonian: the structures we shake with –
shoulder, elbow, wrist – were first seen in fish living in streams 370 million
years ago...
When we look back after 370 million years of evolution, the invasion of land
by fish appears special. However, if we could transport ourselves by time
machine to this early period, it isn't clear whether we would notice anything
extraordinary. We would see a lot of fish, some of them big and some of them
small, all of them struggling to survive and reproduce. Only now, 370 million
years later, do we see that one of those fish sat at the base of a huge branch
of the tree of life—a branch that includes everything from salamanders to
humans. It would have taken an uncanny sixth sense for us to have predicted
this outcome when our time machine deposited us in the middle of the Devonian.
Continue reading: Edge - The Third Culture
HUMANS AS PREY
Rather than Man the Hunter, we may need to visualize ourselves as more
like Giant Hyena Chow, or Protein on the Go...
BY DONNA HART in the Chronicle of Higher Education
Section: The Chronicle Review
Volume 52, Issue 33, Page B10 - April, 2006
Donna Hart is a lecturer in anthropology
at the University of Missouri at St. Louis and, with Robert W. Sussman, author of Man the Hunted: Primates, Predators, and Human
Evolution (Westview Press, 2005).
There's little doubt
that humans, particularly those in Western cultures, think of themselves as the
dominant form of life on earth. And we seldom question whether that view holds
true for our species' distant past — or even for the present, outside of
urban areas. We swagger like the toughest kids on the block as we spread our
technology over the landscape and irrevocably change it for other species.
Current reality does appear to perch humans
atop a planetary food chain. The vision of our utter superiority may even hold
true for the last 500 years, but that's just the proverbial blink of an eye
when compared to the seven million years that our hominid ancestors wandered
the planet.
"Where did we come from?" and
"What were the first humans like?" are questions that have been asked
since Darwin first proposed his theory of evolution. One commonly
accepted answer is that our early ancestors were killers of other species and
of their own kind, prone to violence and even cannibalism. In fact a
club-swinging "Man the Hunter" is the stereotype of early humans that
permeates literature, film, and even much scientific writing.
Man the Hunter purports to be based on
science. Even the great paleontologist Louis S.B. Leakey endorsed it when he
emphatically declared that we were not "cat food." Another legendary
figure in the annals of paleontology, Raymond A. Dart, launched the
killer-ape-man scenario in the mid-20th century with the help of the best
public-relations juggernaut any scientist ever had: the writer Robert Ardrey
and his best-selling book, African Genesis.
Dart had interpreted the finds in South
African caves of fossilized bones from savannah herbivores together with
damaged hominid skulls as evidence that our ancestors had been hunters. The
fact that the skulls were battered in a peculiar fashion led to Dart's firm
conviction that violence and cannibalism on the part of killer ape-men formed
the stem from which our own species eventually flowered. In his 1953 article
"The Predatory Transition From Ape to Man," Dart wrote that early
hominids were "carnivorous creatures, that seized living quarries by
violence, battered them to death, tore apart their broken bodies, [and]
dismembered them limb from limb, ... greedily devouring livid writhing
flesh."
But what is the evidence for Man the Hunter?
Could smallish, upright creatures with relatively tiny canine teeth and flat
nails instead of claws, and with no tools or weapons in the earliest millennia,
really have been deadly predators? Is it possible that our ancestors lacked the
spirit of cooperation and desire for social harmony? We have only two reliable
sources to consult for clues: the fossilized remains of the human family tree,
and the behaviors and ecological relationships of our living primate relatives.
When we investigate those two sources, a
different view of humankind emerges. First, consider the hominid fossils that
have been discovered. Dart's first and most famous find, the cranium of an Australopithecus child
who died over two million years ago (called the "Taung child" after
the quarry in which the fossil was unearthed), has been reassessed by Lee
Berger and Ron Clarke of the University of the Witwatersrand, in light of recent research on eagle predation. The
same marks that occur on the Taung cranium are found on the remains of
similarly sized African monkeys eaten today by crowned hawk eagles, known to
clutch the monkeys' heads with their sharp talons.
C.K. Brain, a South African paleontologist
like Dart, started the process of relabeling Man the Hunter as Man the Hunted
when he slid the lower fangs of a fossil leopard into perfectly matched
punctures in the skull of another australopithecine, who lived between one
million and two million years ago. The paradigm change initiated by Brain
continues to stimulate reassessment of hominid fossils.
The idea that our direct ancestor Homo
erectus practiced cannibalism was based on the gruesome disfigurement of
faces and brain-stem areas in a cache of skulls a half-million years old, found
in the Zhoukoudian cave, in China. How else to explain these strange manipulations
except as relics of Man the Hunter? But studies over the past few years by Noel
T. Boaz and Russell L. Ciochon — of the Ross University School of Medicine
and the University of Iowa,
respectively — show that extinct giant hyenas could have left the marks as
they crunched their way into the brains of their hominid prey.
The list of our ancestors' fossils showing
evidence of predation continues to grow. A 1.75-million-year-old hominid skull
unearthed in the Republic of Georgia shows punctures from the fangs of a saber-toothed cat. Another skull,
about 900,000 years old, found in Kenya, exhibits carnivore bite marks on the brow ridge. A
six-million-year-old hominid, also found in Kenya, may well have been killed by a leopard. A fragment
of a 1.6-million-year-old hominid skull was found in the den of an extinct
hyena, in Spain. A cranium from 250,000 years ago, discovered in South Africa in 1935, has a depression on the forehead caused by a
hyena's tooth. Those and other fossils provide rock-hard proof that a host of
large, fierce animals preyed on human ancestors.
It is equally clear that, outside the West,
no small amount of predation occurs today on modern humans. Although we are not
likely to see these facts in American newspaper headlines, each year 3,000
people in sub-Saharan Africa are eaten by crocodiles, and 1,500 Tibetans are
killed by bears about the size of grizzlies. In one Indian state between 1988
and 1998, over 200 people were attacked by leopards; 612 people were killed by
tigers in the Sundarbans delta of India and Bangladesh between 1975 and 1985. The carnivore zoologist Hans
Kruuk, of the University of Aberdeen, studied death records in Eastern Europe and
concluded that wolf predation on humans is still a fact of life in the region,
as it was until the 19th century in Western European countries like France and Holland.
The fact that humans and their ancestors are
and were tasty meals for a wide range of predators is further supported by
research on nonhuman primate species still in existence. My study of predation
found that 178 species of predatory animals included primates in their diets.
The predators ranged from tiny but fierce birds to 500-pound crocodiles, with a
little of almost everything in between: tigers, lions, leopards, jaguars,
jackals, hyenas, genets, civets, mongooses, Komodo dragons, pythons, eagles,
hawks, owls, and even toucans.
Our closest genetic relatives, chimpanzees
and gorillas, are prey to humans and other species. Who would have thought that
gorillas, weighing as much as 400 pounds, would end up as cat food? Yet Michael
Fay, a researcher with the Wildlife Conservation Society and the National
Geographic Society, has found the remnants of a gorilla in leopard feces in the
Central African Republic. Despite their obvious intelligence and strength,
chimpanzees often fall victim to leopards and lions. In the Tai Forest in the Ivory Coast, Christophe Boesch, of the Max Planck Institute,
found that over 5 percent of the chimp population in his study was consumed by
leopards annually. Takahiro Tsukahara reported, in a 1993 article, that 6
percent of the chimpanzees in the Mahale Mountains National Park of Tanzania
may fall victim to lions.
The theory of Man the Hunter as our
archetypal ancestor isn't supported by archaeological evidence, either. Lewis
R. Binford, one of the most influential figures in archaeology during the last
half of the 20th century, dissented from the hunting theory on the ground that
reconstructions of early humans as hunters were based on a priori positions and
not on the archaeological record. Artifacts that would verify controlled fire
and weapons, in particular, are lacking until relatively recent dates. Because
no hominids possess the dental equipment or digestive tract to eat raw flesh,
we need to be able to cook our meat, but the first evidence of controlled fire
is from only 790,000 years ago.
And, of course, there's also the problem of
how a small hominid could subdue a large herbivore. The first true weapon we
know of is a wooden spear about 400,000 years old, although the archaeologist
John Shea, of the State University of New York at Stony Brook, likened it to a
glorified toothpick. Large-scale, systematic hunting of big herbivores for meat
may not have occurred any earlier than 60,000 years ago — over six million
years after the first hominids evolved.
What I am suggesting, then, is a less
powerful, more ignominious beginning for our species. Consider this alternate
image: smallish beings (adult females maybe weighing 60 pounds, with males a
bit heavier), not overly analytical because their brain-to-body ratio was
rather small, possessing the ability to stand and move upright, who basically
spent millions of years as meat walking around on two legs. Rather than Man the
Hunter, we may need to visualize ourselves as more like Giant Hyena Chow, or
Protein on the Go.
Our species began as just one of many that
had to be careful, to depend on other group members, and to communicate danger.
We were quite simply small beasts within a large and complex ecosystem.
Is Man the Hunter a cultural construction of
the West? Belief in a sinful, violent ancestor does fit nicely with Christian
views of original sin and the necessity to be saved from our own awful, yet
natural, desires. Other religions don't necessarily emphasize the ancient
savage in the human past; indeed, modern-day hunter-gatherers, who have to live
as part of nature, hold animistic beliefs in which humans are a part of the web
of life, not superior creatures who dominate or ravage nature and each other.
Think of Man the Hunted, and you put a
different face on our past. The shift forces us to see that for most of our
evolutionary existence, instead of being the toughest kids on the block, we
were merely the 90-pound (make that 60-pound) weaklings. We needed to live in
groups (like most other primates) and work together to avoid predators. Thus an
urge to cooperate can clearly be seen as a functional tool rather than a
Pollyannaish nicety, and deadly competition among individuals or nations may be
highly aberrant behavior, not a hard-wired survival technique. The same is true
of our destructive domination of the earth by technological toys gone mad.
Raymond Dart declared that "the
loathsome cruelty of mankind to man ... is explicable only in terms of his
carnivorous, and cannibalistic origin." But if our origin was not
carnivorous and cannibalistic, we have no excuse for loathsome behavior. Our
earliest evolutionary history is not pushing us to be awful bullies. Instead,
our millions of years as prey suggest that we should be able to take our
heritage of cooperation and interdependency to make a brighter future for
ourselves and our planet.
(Image: feline predator)
GAMBLING – THE PIPE DREAM PAYOFF
By: Nando Pelusi, Ph.D., in Psychology Today, 3-21-2006
Summary: Why high rollers
think they'll hit it big. The author says we'd bet the farm for a shot at the
jackpot.
"One
more roll, and I'll get it all back."
Everyone gets mildly addicted to something.
For some people it is gambling. Square foot for neon square foot, the casino is
the highest density den of absorption—and vice—known to man. But a casino is
just the racy poster girl for a multibillion-dollar industry that flourishes
from the office betting pool to the PowerBall jackpot—anywhere we defy logic
and statistics in pursuit of an easy payoff. Gambling plays on at least two
human universals: the urge to get something for nothing and the difficulty of
giving up that dream, no matter how high the stakes or the odds against it.
Winning vs. Recouping Losses
For most of human prehistory, living through
the night was not a given. For this reason, goes the evolutionary hypothesis,
our ancestors learned to take what we'd now consider murderous chances in
pursuit of food and mates. Those who continuously gambled and won became our
forebears, passing on a taste for the "off chance." The possibility that a big score is just around the corner, though you never know where or when you'll hit on it, parallels modern gambling: one more rock overturned and you find dinner.
When you start losing, the darker side of
that equation asserts itself: "One more roll and I'll recoup my
money" becomes a formula for huge losses. No one is exempt. Remember the
Barings Bank fiasco, in which the Rolls Royce of British banking was felled by
a lone trader who kept making bad bets on derivatives, desperate to dig himself
out of a hole? His gut overrode his training. We're all vulnerable to this risk
instinct, the feeling that "I can and must recoup my losses."
When Risk Pays
For our ancestors, it was actually risky to avoid
risk altogether. Sometimes the next big score really is just around the
corner. If you find an edible critter behind one in 50 rocks, your foraging
pays off, especially when the terrain is safe.
In this case, one in 50 is excellent odds
because you're in a low-risk, potentially high-yield situation. It's sort of
like online dating: Meeting 10 people for coffee is not a huge imposition,
especially since you could be finding your future partner.
Playing the slots is designed to feel
similarly risk-free, but in reality it's high-risk, low-yield, at least in the
long run. You're practically guaranteed a net loss and have only the slimmest
shot at the jackpot. Another disadvantage: Gambling doesn't teach you anything
new, whereas the risks our ancestors took for survival had a steep learning
curve—after overturning four dozen rocks, you've identified some helpful
patterns.
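A quick expected-value calculation makes the contrast concrete. The sketch below is ours, not the article's: the 1-in-50 forage odds come from the text, while the payoff, cost, and slot-machine numbers are made-up illustrative assumptions.

```python
# Compare the forager's low-risk gamble with a jackpot-style bet
# by computing the average gain per attempt (expected value).

def expected_value(p_win: float, payoff: float, cost: float) -> float:
    """Win probability times payoff, minus what each attempt costs."""
    return p_win * payoff - cost

# Foraging: turning over a rock costs little, and 1 rock in 50 hides dinner.
forage = expected_value(p_win=1 / 50, payoff=100.0, cost=1.0)

# Slots: each $1 spin offers a tiny shot at a $10,000 jackpot.
slots = expected_value(p_win=1 / 50_000, payoff=10_000.0, cost=1.0)

print(f"foraging, per attempt: {forage:+.2f}")   # +1.00: low risk, real yield
print(f"slot machine, per spin: {slots:+.2f}")   # -0.80: a guaranteed net loss
```

However flashy the jackpot, a negative expected value means the long run can only grind you down.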
An F in Finance and Statistics
Money is a relatively new concept for the
human species. We learn about it the way we learn to read or play the
piano—with effort. For most of us, money makes scant intuitive sense. We
understand trade, but fiat value eludes us. A diamond is vastly more valuable
than a cup of water, until you're dehydrated. So which has more
"value?" We learn to contextualize money because it's not natural to
think in mathematical abstractions.
As bad as we are with money, we're worse when
it comes to statistics. High risk is linked to high yield in our minds, because
risks like staging a coup or making a power play are often worthwhile. But what
about wagering double-or-nothing? Gambling upends the natural correlation
between high risk and high yield. Losses quickly add up, but the gains don't
increase accordingly, though we're likely to think they do. That's because
we're notoriously bad oddsmakers.
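To watch double-or-nothing losses snowball, here is a small simulation (ours, not the author's; the 18/38 win probability is a roulette-style illustrative assumption):

```python
import random

# Martingale betting: double the stake after every loss, so that the first
# win recoups everything plus the original stake. The catch: a short losing
# streak forces you to risk a fortune just to get back to even.

def total_risked(p_win: float = 18 / 38, stake: float = 1.0) -> float:
    """Play double-or-nothing until the first win; return the total wagered."""
    wagered = 0.0
    while random.random() >= p_win:  # this round is a loss
        wagered += stake
        stake *= 2                   # chase the loss with a doubled bet
    return wagered + stake           # the final, winning bet

random.seed(0)
worst = max(total_risked() for _ in range(10_000))
print(f"largest amount risked in 10,000 sessions, all to net $1: ${worst:,.0f}")
```

Each session nets only the original $1 stake, while an occasional cold streak demands thousands: the losses add up far faster than the gains.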
Even mild proficiency in statistics requires
study. Flying is safer than automobile travel; more than twice as many people lose their lives in car accidents each year as have died in the entire history of air travel. But the shock from a single plane crash evokes awe,
whereas the tallies of auto fatalities put us to sleep. Same with gambling: Two
percent interest earned on our checking account does not hold a candle to the
lights, bells and riches of a casino jackpot. We're drawn to the more
astonishing event, regardless of its probability.
Stakes at an All-Time High
Today, you can pad from your bed to your
computer in your furry slippers and start clicking away your funds at 4 A.M. Online gambling is the ultimate example of the
high-yield, low-risk paradox. A virtual casino is practically barrier-free, so
it seems risk-free. In fact, it is low-yield and high-risk.
Further, it contains even more unknowns (the state of security, unfair
practices by the house, collusion among players) than casino gambling. As the
advent of the Internet makes clear, gambling itself is evolving too quickly for
us to catch up.
Wagering Wisely
Beware
of the tendency to tell yourself that you must recoup your losses. This drags you further into the hole and perpetuates the obsessive aspect of gambling.
Argue with the instinct that you can beat the system by reminding yourself of
the statistical realities.
Place restrictions on your wagering before you get sucked in. The rush you get from gambling is fun and often legal, but beware of your judgment while in the zone. Leave your credit card in a safe place and do not let it into the casino.
Your risk instinct generally operates below the level of conscious awareness. Make your self-talk about the fun of gambling conscious—so that you
control the gambling, rather than the other way around.
If
these suggestions don't help, place it all on 21 Red (I have a hunch). No,
make that 21 Black.
RECOMMENDED SKEPTICS' BLOGS:
PHARYNGULA: PZ MYERS' BLOG
RATIONALLY SPEAKING: MASSIMO PIGLIUCCI'S BLOG
FALLACIOUS ASSAULTS
The technology gap in the old Soviet Union gave rise to jokes, for example: The latest
achievements of the East German electronics company Robotron were celebrated—they had built the world's largest microchip!
This was spotted @ Evolving Thoughts, a skeptics' blog BY JOHN WILKINS - a postdoctoral research fellow at the Biohumanities Project, University of Queensland:
Here is A MODEL SCIENTIFIC PAPER - the very paradigm of what a scientific paper should be...
ELECTRON BAND STRUCTURE IN GERMANIUM, My Ass
Abstract: The exponential dependence of resistivity on
temperature in germanium is found to be a great big lie. My careful theoretical
modeling and painstaking experimentation reveal 1) that my equipment is crap,
as are all the available texts on the subject and 2) that this whole exercise
was a complete waste of my time. Read the paper Here ...
OR - jump to the Conclusion
Going into physics was
the biggest mistake of my life. I should've declared CS. I still wouldn't have
any women, but at least I'd be rolling in cash.
Q.) Describe
the sound of a moist waffle falling onto a hot griddle.
A.) Spit [John Wilkins]
...Gratuitous Nudity Pic
Good fortune. COMMENT! Please spread the
meme. Don’t smoke in bed…