
Is there biological evidence for self-awareness in animals?


I am conducting an investigation into the intelligence of animals, in particular farm animals. I would be interested to hear a scientific and biological perspective on what is perceived to be self-awareness, whether in a human or not. Specifically, I am interested in the high-level self-awareness that we as humans experience and that is only now beginning to be theorised as possible in other mammals (e.g. dolphins).

The question is framed in a robotic context, where a being (natural or artificial) is defined to be self-aware if it can approximate human behaviour and thought closely enough that it becomes indistinguishable from a human. We could label this sentience, but that is a subjective term, so I will avoid using it for the purpose of this question. I will define a being as self-aware in this robotic context if it is capable of conscious reflection both on itself and on its own consciousness.

One of the problems I have found in my investigation is a serious misunderstanding among the general public about the nature of self-awareness.

It seems that many people incorrectly assume that self-awareness is a product of a being having a soul. This is rather unfortunate for animals as orthodox Christianity teaches that animals do not have souls, and therefore are incapable of attaining self-awareness.

My investigation aims to bring to mainstream public attention a new definition of self-awareness, and to end the centuries-old misconception that we are the only intelligent, self-aware creatures on the planet.


Yes, there is a lot of evidence for self-awareness in animals. The gold-standard test for self-awareness is the mirror self-recognition (MSR) test, which uses the behavior of an animal exposed to a mirror to determine whether it can tell itself apart from other animals of the same species:

Animals that possess MSR typically progress through four stages of behavior when facing a mirror: (i) social responses, (ii) physical inspection (e.g., looking behind the mirror), (iii) repetitive mirror-testing behavior, and (iv) realization of seeing themselves. (Plotnik et al. 2006)

That quote was taken from the abstract of a paper showing self-recognition in a single Asian elephant, but there is also evidence for self-awareness in magpies (Prior et al. 2008), dolphins (Harley 2013), chimpanzees (Povinelli et al. 2006) and orangutans (Suarez and Gallup 1981). Many of these studies use the Gallup mark test (Wikipedia):

Gallup anaesthetised the chimpanzees and then painted a red alcohol-soluble dye on the eyebrow ridge and on the top half of the opposite ear. When the dye dried, it had virtually no olfactory or tactile cues. Gallup then returned the chimpanzees to the cage (with the mirror removed) and allowed them to regain full consciousness. He then recorded the frequency with which the chimpanzees spontaneously touched the marked areas of skin. After 30 minutes, the mirror was re-introduced into the room and the frequency of touching the marked areas was again determined. The frequency of touching increased to 4-10 with the mirror present, compared with only 1 when the mirror had been removed. The chimpanzees sometimes inspected their fingers visually or olfactorily after touching the marks.

Basically, an animal that has seen and recognized itself in a mirror notices that the mark is new and touches the marked area on its own body, which shows that it recognizes the image in the mirror as itself.


Sexuality, Gender and Justice

Few aspects of human biology are as complex, or as politically fraught, as sexual orientation. A clear genetic link would suggest that gay people are “born this way,” as opposed to having made a lifestyle choice. Yet some fear that such a finding could be misused to “cure” homosexuality, and most research teams have shied away from tackling the topic.

Now, a new study claims to dispel the notion that a single gene or handful of genes make a person prone to same-sex behavior. The analysis, which examined the genomes of nearly half a million men and women, found that although genetics are certainly involved in who people choose to have sex with, there are no specific genetic predictors. Yet some researchers question whether the analysis, which looked at genes associated with sexual activity rather than attraction, can draw any real conclusions about sexual orientation.

“The message should remain the same that this is a complex behavior that genetics definitely plays a part in,” said study co-author Fah Sathirapongsasuti, a computational biologist at genetic testing company 23andMe in Mountain View, Calif., during a press conference. The handful of genetic studies conducted in the past few decades have looked at only a few hundred individuals at most, almost exclusively men. Other studies have linked sexual orientation with environmental factors such as hormone exposure before birth and having older brothers.

In the new study, a team led by Brendan Zietsch of the University of Queensland, Australia, mined several massive genome data banks, including that of 23andMe and the UK Biobank (23andMe did not fund the research). They asked more than 477,000 participants whether they had ever had sex with someone of the same sex, and also questions about sexual fantasies and the degree to which they identified as gay or straight.

The researchers found five single points in the genome that seemed to be common among people who had had at least one same-sex experience. Two of these genetic markers sit close to genes linked to sex hormones and to smell, both factors that may play a role in sexual attraction. But taken together, these five markers explained less than 1 percent of the differences in sexual activity among people in the study. When the researchers looked at the overall genetic similarity of individuals who had had a same-sex experience, genetics seemed to account for between 8 and 25 percent of the behavior. The rest was presumably a result of environmental or other biological influences. The findings were published Thursday in Science.

Despite the associations, the authors say that the genetic similarities still cannot show whether a given individual is gay. “It’s the end of the ‘gay gene,’” says Eric Vilain, a geneticist at Children’s National Health System in Washington, D.C., who was not involved in the study.

The research has limitations: almost all of the participants were from the U.S. or Europe, and the individuals also tended to be older, 51 years old on average in the 23andMe sample and at least 40 in the UK Biobank sample.

Still, researchers welcome the data. “A lot of people want to understand the biology of homosexuality, and science has lagged behind that human interest,” says William Rice, an evolutionary geneticist at the University of California, Santa Barbara, who also was not involved in the work. “It’s been a taboo topic, and now that we’re getting information I think it’s going to blossom.”

The study will not be the last word on the vexing question of what causes homosexuality, however. In 1993 geneticist Dean Hamer of the U.S. National Cancer Institute and his colleagues published a paper suggesting that an area on the X chromosome called Xq28 could contain a “gay gene.” But other studies, including the new paper, found no such link, and Sathirapongsasuti says that the new study is the final nail in the coffin for Xq28 as a cause of same-sex attraction.

But Hamer, now retired, disagrees. His study, which analysed the genomes of 40 pairs of gay brothers, looked exclusively at people who identified as homosexual. He sees the new paper as an analysis of risky behavior or openness to experience, noting that participants who engaged in at least one same-sex experience were also more likely to report having smoked marijuana and having more sexual partners overall. Hamer says that the findings do not reveal any biological pathways for sexual orientation. “I’m glad they did it and did a big study, but it doesn’t point us where to look.”

Rice and Vilain agree that the conclusion is unclear. A more detailed questionnaire that looks at more aspects of sexuality and environmental influences would allow the researchers to better pinpoint the roots of attraction.

The authors say that they did see links between sexual orientation and sexual activity, but concede that the genetic links do not predict orientation. “I think it’s true we’re capturing part of that risk-taking behavior,” Sathirapongsasuti says, but the genetic links still suggested that same-sex behavior is related to attraction.

Nevertheless, Hamer and others praise the new contribution to a field that suffers from a dearth of good studies. “I hope it will be the first of many to come.”


Continental Drift: Theory, Evidence and Explanation

In this article we will discuss continental drift under the following heads:

  1. Continental Drift Theory
  2. Peninsular India
  3. Paleontological Evidence of Continental Drift
  4. Physical Evidence of Continental Drift
  5. Palaeoclimate
  6. Explanation of the Continental Drift Theory in the Light of Vertebrate Distribution

1. Continental Drift Theory:

Most palaeontologists agree that in the early Palaeozoic the six major land masses coalesced to form a single super land mass, called Pangaea (Fig. 1.6A), first named by the German meteorologist Alfred Wegener in 1912.

He published a paper, “The Origin of Continents and Oceans” (Die Entstehung der Kontinente), in German in 1912, in which he postulated that the different continents had coalesced in the early Palaeozoic period, subsequently broke up in the early or mid-Mesozoic, and drifted apart to occupy their present positions.

This theory was not accepted at the time, but it has been widely accepted since the 1960s, following studies of the palaeomagnetism of the ocean floor. From the mid-1960s the concept of continental drift has been incorporated into the theory of plate tectonics.

In the light of recent findings, Wegener’s Continental Drift Theory has gained wide acceptance among scientists.

The exact period of coalescence and separation of the different land masses is still controversial; different scientists have presented different opinions on the basis of analyses of palaeomagnetic data. There were two land masses throughout the Devonian period. One was Laurasia (Angara) (Fig. 1.6B), a northern land mass that included the St. Lawrence area of North America, Europe and parts of Asia.

The other, southern land mass is called Gondwanaland (Fig. 1.6B) (named after the Gond, a tribe of southern India); it was formed by the union of South America, Africa, Arabia, India, Madagascar, Antarctica and Australia. In the middle Carboniferous period the two land masses converged to form a single super land mass, called Pangaea.

Pangaea lasted about 100 million years. In the late Triassic or early Jurassic, tectonic plate movement and volcanic activity ruptured Pangaea into the smaller present-day land masses (Fig. 1.6C).

In the Jurassic, Gondwana rotated anticlockwise, and Europe and North America separated from it and moved northwards, so that the Tethys Sea (named after Tethys, the Greek ‘mother of seas’) was created between Africa and Eurasia; in course of time it became the Mediterranean Sea.

The breakup of Pangaea produced a series of orogenies (mountain-building episodes) that continue to the present day. During the Cretaceous period the Rocky Mountains and the Andes formed along western North America and western South America. In North America, the late Palaeozoic and Mesozoic eras are also marked by the Appalachian Revolution, in which mountains were formed by the compression of tectonic forces.

Between the middle and late Cretaceous, South America, Antarctica and Australia separated from Africa, and the Tethys Sea opened to the Atlantic. In the Eocene, Australia separated from Antarctica, and India drifted northwards, joining Asia about 25 million years ago. The continents are still drifting today: North America recedes westward and Australia moves northward at approximately 4 cm per year.

2. Peninsular India:

Peninsular India has been an elevated land mass for the past 150 million years. The parting of Gondwana started in the late Jurassic; in this division of lands, South America, Antarctica and Australia were included in one land mass, and Africa, Madagascar and India in another.

India rotated away from Madagascar and began to move northwards about 100 Ma ago, finally joining the Asian part of Laurasia about 25 Ma ago, when the great Oligocene and Miocene orogenies uplifted the Himalayan-Alpine belt.

3. Paleontological Evidence of Continental Drift:

The Tethys Sea, which extended as a shallow epicontinental sea across southern Asia and North Africa, also washed the ancient northern coastline of India. The Jurassic fauna of the Tethys Sea is known from Europe in the west to the Himalayan region in the east.

Marine sediments still mark the northern coastline of India, which ran from Cutch in the north-west eastward along the present course of the Narmada valley to Baroda.

The southern coastline of the Indian peninsula is not known from Jurassic sediments, but it is known from the Cretaceous onwards in the Tiruchirapalli district of southern India. From the analysis of the faunal composition it is presumed that the sea which this fauna inhabited, to the south of India, was the forerunner of the Indian Ocean.

After the elevation of the Himalayan-Alpine mountains at the end of the Miocene, the realm of the Tethys became restricted to the present northern portion of the Indian Ocean. The palaeontological picture suggests that India has occupied the northern Indian Ocean since late Jurassic times and that its northern peninsular region has lain near the boundaries of both the Tethys and the Indian Ocean.

4. Physical Evidence of Continental Drift:

On the basis of palaeomagnetic observations, Adie (1965) and Creer (1966) report that India formed part of a supercontinent, Gondwanaland, located near the geographic South Pole in the early Mesozoic. The supercontinent began to be disrupted in mid-Jurassic times, the process of dismemberment ended in the Cretaceous, and India drifted at least 60° of latitude northward in post-Jurassic time.

Bullard (1969) states that palaeomagnetic work shows that India has been moving northward for the past 100 Ma. McElhinny (1969) deduces that an Indo-Madagascar-Antarctica block broke away from Africa between 155 and 100 million years ago, opening up the Indian Ocean for the first time.

An India-Madagascar block then separated from Antarctica, and at first drifted southwards before reversing its course to move northward.

5. Palaeoclimate:

In the early Palaeozoic, from the Cambrian to the Devonian, the climate was more or less constant. Owing to extensive glaciation from the late Proterozoic, the entire earth became cool during the Cambrian period.

This, however, is not proved by the geological record. From the Ordovician to the Silurian, a uniform tropical and subtropical climate prevailed in the vicinity of the equator. The late Carboniferous and Permian are marked by an arid and cooler climate.

In the Triassic, the climate was warm and arid. The Jurassic was marked by a moist climate, and the Cretaceous was warm and moist for the most part. At the beginning of the Tertiary, the tropical and sub-tropical regions extended their areas; by the end of the Tertiary the tropical zone had gradually receded. The Pleistocene was a period of glaciations and interglacials.

6. Explanation of the Continental Drift Theory in the Light of Vertebrate Distribution:

Some enigmatic facts about the present distribution of animals can be explained on the basis of the Continental Drift Theory. It is an established fact that the mammal faunas of the Northern hemisphere (the Nearctic and Palaearctic regions) are more similar to one another than are the mammal faunas of the southern regions (Africa, South America and Australia).

It is believed that the animals of the northern regions spread in recent times through land connections at the Bering Strait, and it is established from the theory of continental drift that Laurasia started splitting and drifting apart at a later date than the parts of Gondwana.

The mammals of the Southern hemisphere originated during the breakup of the continents and evolved after the splitting. Once mammals are isolated they may evolve in different ways, and hence, as Gondwana began to split first, it is expected that the mammals of its different continents evolved and radiated separately.

Thus South American mammals such as llamas, alpacas, sloths and armadillos are not found on other continents of the Southern hemisphere. The giraffe, zebra, hippopotamus, chimpanzee and gorilla of the Ethiopian region, and the egg-laying mammals and some marsupials of the Australian region, are not found anywhere outside these regions.

Though the distribution of mammals across the Gondwana continents differs, the distributions of some fishes, amphibians and reptiles are very similar, in keeping with the Continental Drift Theory.

The distribution of the lungfishes supports continental drift. The dipnoans, which appeared in the mid-Devonian, are now represented by three genera, Protopterus, Lepidosiren and Neoceratodus, found in Africa, South America and Australia respectively.

This distribution reminds us that Africa, South America and Australia were in contact as parts of Gondwana in the mid-Devonian, and that with the separation of these continents the dipnoans came to be represented by three separate genera.

Among amphibians, the distribution of most members does not bear on continental drift. Only the distribution of pipids between Africa and South America, and of salamanders in the Nearctic and the eastern part of Eurasia, supports continental drift to some extent.

Among reptiles, the distribution of some members supports continental drift. Some primitive chelonians originated during the Permian period and are still confined to the Southern hemisphere. The crocodilians, such as crocodiles and gharials, originated in the Triassic period and are now restricted to Central America, Africa, northern Australia and parts of the Oriental region.


Animal minds: Animals think, therefore…

IN 1992, at Tangalooma, off the coast of Queensland, people began to throw fish into the water for the local wild dolphins to eat. In 1998, the dolphins began to feed the humans, throwing fish up onto the jetty for them. The humans thought they were having a bit of fun feeding the animals. What, if anything, did the dolphins think?

Charles Darwin thought the mental capacities of animals and people differed only in degree, not kind—a natural conclusion to reach when armed with the radical new belief that the one evolved from the other. His last great book, “The Expression of Emotions in Man and Animals”, examined joy, love and grief in birds, domestic animals and primates as well as in various human races. But Darwin’s attitude to animals—easily shared by people in everyday contact with dogs, horses, even mice—ran contrary to a long tradition in European thought which held that animals had no minds at all. This way of thinking stemmed from the argument of René Descartes, a great 17th-century philosopher, that people were creatures of reason, linked to the mind of God, while animals were merely machines made of flesh—living robots which, in the words of Nicolas Malebranche, one of his followers, “eat without pleasure, cry without pain, grow without knowing it: they desire nothing, fear nothing, know nothing.”

For much of the 20th century biology cleaved closer to Descartes than to Darwin. Students of animal behaviour did not rule out the possibility that animals had minds but thought the question almost irrelevant since it was impossible to answer. One could study an organism’s inputs (such as food or the environment) or outputs (its behaviour). But the organism itself remained a black box: unobservable things such as emotions or thoughts were beyond the scope of objective inquiry. As one such “behaviourist” wrote in 1992, “attributing conscious thought to animals should be strenuously avoided in any serious attempt to understand their behaviour, since it is untestable [and] empty.”

By then, though, there was ever greater resistance to such strictures. In 1976 a professor at Rockefeller University in New York, Donald Griffin, had taken the bull by the horns (leaving aside what the bull might have felt about this) in a book called “The Question of Animal Awareness”. He argued that animals could indeed think and that their ability to do this could be subjected to proper scientific scrutiny.

In the past 40 years a wide range of work both in the field and the lab has pushed the consensus away from strict behaviourism and towards that Darwin-friendly view. Progress has not been easy or quick; as the behaviourists warned, both sorts of evidence can be misleading. Laboratory tests can be rigorous, but are inevitably based on animals which may not behave as they do in the wild. Field observations can be dismissed as anecdotal. Running them for years or decades and on a large scale goes some way to guarding against that problem, but such studies are rare.

Nevertheless, most scientists now feel they can say with confidence that some animals process information and express emotions in ways that are accompanied by conscious mental experience. They agree that animals, from rats and mice to parrots and humpback whales, have complex mental capacities; that a few species have attributes once thought to be unique to people, such as the ability to give objects names and use tools; and that a handful of animals—primates, corvids (the crow family) and cetaceans (whales and dolphins)—have something close to what in humans is seen as culture, in that they develop distinctive ways of doing things which are passed down by imitation and example. No animals have all the attributes of human minds; but almost all the attributes of human minds are found in some animal or other.

Consider Billie, a wild bottlenose dolphin which got injured in a lock at the age of five. She was taken to an aquarium in South Australia for medical treatment, during which she spent three weeks living with captive dolphins which had been taught various tricks. She herself, though, was never trained. After she was returned to the open sea local dolphin-watchers were struck to see her “tailwalking”—a move in which a dolphin stands up above the water by beating its flukes just below the surface, travelling slowly backwards in a vaguely Michael Jackson manner. It was a trick that Billie seemed to have picked up simply by watching her erstwhile pool mates perform. More striking yet, soon afterwards five other dolphins in her pod started to tailwalk, though the behaviour had no practical function and used up a lot of energy.

Such behaviour is hard to understand without imagining a mind that can appreciate what it sees and which intends to mimic the actions of others (see “The imitative dolphin”). That in turn implies things about the brain. If you had to take a bet on things to be found in Billie’s brain, you’d be well advised to put money on “mirror neurons”. Mirror neurons are nerve cells that fire when the sight of someone else’s action triggers a matched response—they seem to be what makes yawning contagious. A lot of learning may require this way of linking perception to action—and it seems that, in people, so may some forms of empathy.

Mirror neurons are important to scientists attempting to find the basis of the way the human mind works, or at least to find correlates of that working, in the anatomy of human brains. The fact that those anatomical correlates keep turning up in non-human brains, too, is one of the current reasons for seeing animals as also being things with minds. There are mirror neurons; there are spindle cells (also called von Economo neurons) which play a role in the expression of empathy and the processing of social information. Chimpanzee brains have parts corresponding to Broca’s area and Wernicke’s area which, in people, are associated with language and communication. Brain mapping reveals that the neurological processes underlying what look like emotions in rats are similar to those behind what clearly are emotions in humans. As a group of neuroscientists seeking to sum the field up put it in 2012, “Humans are not unique in possessing the neurological substrates that generate consciousness. Non-human animals, including all mammals and birds, and many other creatures, also possess these neurological substrates.”

But to say that animals have a biological basis for consciousness is not the same as saying they actually think or feel. Here, ideas from the law may be more helpful than those from neurology. When someone’s state of being is clearly impaired by a calamity of some sort, it can fall to the courts to decide what level of legal protection should apply. In such cases courts apply tests such as: is he or she self-aware? Can he recognise others as individuals? Can he regulate his own behaviour? Does he experience pleasure or suffer pain (that is, show emotion)? Such questions reveal a lot about animals, too.

The most common test of self-awareness is the ability to recognise yourself in a mirror. It implies you are seeing yourself as an individual, separate from other beings. The test was formally developed in 1970 by Gordon Gallup, an American psychologist, though its roots go back further: Darwin wrote about Jenny, an orang-utan, playing with a mirror and being “astonished beyond measure” by her reflection. Dr Gallup daubed an odourless mark on the face of his subjects and waited to see how they would react when they saw their reflection. If they touched the mark, it would seem they realised the image in the mirror was their own, not that of another animal. Most humans show this ability between the ages of one and two. Dr Gallup showed that chimpanzees have it, too. Since then, orang-utans, gorillas, elephants, dolphins and magpies have shown the same ability. Monkeys do not, nor do dogs, perhaps because dogs recognise each other by smell, so the test provides them with no useful information.

Recognising yourself is one thing; what of recognising others—not just as objects, but as things with purposes and desires like one’s own, but aimed at different ends? Some animals clearly pass this test too. Santino is a chimpanzee in Furuvik zoo in Sweden. In the 2000s zookeepers noticed that he was gathering little stockpiles of stones and hiding them around his cage, even constructing covers for them, so that at a later time he would have something to throw at zoo visitors who annoyed him. Mathias Osvath of Lund University argues that this behaviour showed various types of mental sophistication: Santino could remember a specific event in the past (being annoyed by visitors), prepare for an event in the future (throwing stones at them) and mentally construct a new situation (chasing the visitors away).

Philosophers call the ability to recognise that others have different aims and desires a “theory of mind”. Chimpanzees have this. Santino seemed to have understood that zookeepers would stop him throwing stones if they could. He therefore hid the weapons and inhibited his aggression: he was calm when collecting the stones, though agitated when throwing them. An understanding of the capabilities and interests of others also seems in evidence at the Centre for Great Apes, a sanctuary in Florida, where male chimpanzees living with Knuckles, a 16-year-old with cerebral palsy, do not subject him to their usual dominance displays. Chimps also understand that they can manipulate the beliefs of others; they frequently deceive each other in competition for food.

Another test of legal personhood is the ability to experience pleasure or pain—to feel emotions. This has often been taken as evidence of full sentience, which is why Descartes’s followers thought animals were unable to feel, as well as reason. Peter Singer, an Australian philosopher and doyen of “animal rights”, argues that, of all the emotions, suffering is especially significant because, if animals share this human capacity, people should give consideration to animal suffering as they do to that of their own kind.

Animals obviously show emotions such as fear. But this can be taken to be instinctual, similar to what happens when people cry out in pain. Behaviourists had no trouble with fear, seeing it as a conditioned reflex that they knew full well how to create. The real question is whether animals have feelings which involve some sort of mental experience. This is not easy. No one knows precisely what other people mean when they talk about their emotions; knowing what dumb beasts mean is almost impossible. That said, there are some revealing indications—most notably, evidence for what could be seen as compassion.

Some animals seem to display pity, or at least concern, for diseased and injured members of their group. Stronger chimps help weaker ones to cross roads in the wild. Elephants mourn their dead (see “The grieving elephant”). In a famous experiment, Hal Markowitz, later director of the San Francisco zoo, trained Diana monkeys to get food by putting a token in a slot. When the oldest female could not get the hang of it, a younger unrelated male put her tokens in the slot for her and stood back to let her eat.

There have also been observations of animals going out of their way to help creatures of a different species. In March 2008, Moko, a bottlenose dolphin, guided two pygmy sperm whales out of a maze of sandbars off the coast of New Zealand. The whales had seemed hopelessly disoriented and had stranded themselves four times. There are also well-attested cases of humpback whales rescuing seals from attack by killer whales and dolphins rescuing people from similar attacks. On the face of it, this sort of concern for others looks moral—or at least sentimental.

In a few examples the protecting animals have been seen to pay a price for their compassion. Iain Douglas-Hamilton, who studies elephants, describes a young female which had been so severely injured that she could only walk at a snail’s pace. The rest of her group kept pace with her to protect her from predators for 15 years, though this meant they could not forage so widely. As long ago as 1959, Russell Church of Brown University set up a test which allowed laboratory rats in half of a cage to get food by pressing a lever. The lever also delivered an electric shock to rats in the other half of the cage. When the first group realised that, they stopped pressing the lever, depriving themselves of food. In a similar test on rhesus monkeys reported in the American Journal of Psychiatry in 1964, one monkey stopped giving the signal for food for 12 days after witnessing another receive a shock. There are other examples of animals preferring some sort of feeling over food. In famous studies by an American psychologist, Harry Harlow, rhesus monkeys deprived of their mothers were given a choice between substitutes. One was made of wire and had a feeding bottle, the other was cloth, but without food. The infants spent almost all their time hugging the cloth mother.

If animals are self-aware, aware of others and have some measure of self-control, then they share some of the attributes used to define personhood in law. If they display emotions and feelings in ways that are not purely instinctive, there may also be a case for saying their feelings should be respected in the way that human feelings are. But the attribute most commonly thought of as distinctively human is language. Can animals be said to use language in a meaningful way?

Animals communicate all the time and don’t need big brains to do so. In the 1940s Karl von Frisch, an Austrian ethologist, showed that the “waggle dances” of honeybees pass on information about how far away food is and in what direction. Birds sing long, complex songs either to mark territory or as mating rituals. So do pods of whales (see “The singing whales”). It is hard, though, to say what information, or intention, goes into all this. The bees are more likely to be automatically downloading a report of their recent travels than saying, “There’s pollen thataway, slackers.”

The vocalisations of, say, vervet monkeys have more to them. Vervets make different alarm calls for different predators, demanding different responses. There is one for leopards (skitter up into the highest branches), for eagles (hide in the undergrowth) and for snakes (stand upright and look around). The monkeys need to recognise the different calls and know when to make which one. Animals brought up with humans can do much more. Chaser, a border collie, knows over 1,000 words. She can pull a named toy from a pile of other toys. This shows that she understands that an acoustical pattern stands for a physical object. Noam Chomsky, a linguist, once said only people could do that. Remarkably, if told to fetch a toy with a name she has not heard before placed in a pile of known, named objects, she works out what is being asked for. Betsy, another border collie, will bring back a photograph of something, suggesting she understands that a two-dimensional image can represent a three-dimensional object.

More impressive still are animals such as Washoe, a female chimpanzee which was taught sign language by two researchers at the University of Nevada. Washoe would initiate conversations and ask for things she wanted, like food. But evidence that many animals can, when brought up with humans, tell their thoughts to others using a human language is not quite the same as saying they use language as people do. Few have a smidgen of grammar, for example—that is, the ability to manipulate and combine words to create new meanings. It is true that dolphins in captivity can distinguish between “put the ball in the hoop” and “bring the hoop to the ball”. Alex, an African grey parrot, combined words to make up new ones: he called an apple a “bannery”, for example, a mixture of banana and cherry (see “The talkative parrot”). But these are exceptional cases and the result of intense collaboration with humans. The use of grammar—certainly a complex grammar—has not been discerned in the wild. Moreover, animals have no equivalent to the narratives that people tell one another.

If language can still be claimed as uniquely human, can anything else? Until recently, culture would have been held up as a second defining feature of humanity. Complex ways of doing things which are passed down not by genetic inheritance or environmental pressure but by teaching, imitation and conformism have been widely assumed to be unique to people. But it is increasingly clear that other species have their own cultures, too.

In “The Cultural Lives of Whales and Dolphins”, Hal Whitehead of Dalhousie University, Nova Scotia, and Luke Rendell of the University of St Andrews, in Scotland, argue that all cultures have five distinctive features: a characteristic technology; teaching and learning; a moral component, with rules that buttress “the way we do things” and punishments for infraction; an acquired, not innate, distinction between insiders and outsiders; and a cumulative character that builds up over time. These attributes together allow individuals in a group to do things that they would not be able to achieve by themselves.

For the first feature, look no further than the crow. New Caledonian crows are the champion toolmakers of the animal kingdom. They make hooks by snipping off V-shaped twigs and nibbling them into shape. They fashion Pandanus leaves into toothed saws. And in different parts of the island they make their tools in different ways. Studies by Gavin Hunt of the University of Auckland showed that the hooks and saws in two sites on New Caledonia differed systematically in size, in the number of cuts needed to make them and even according to whether they were predominantly left-handed or right-handed. To the extent that culture means “the way we do things around here”, the two groups of crows were culturally distinct.

Chimpanzees are now known to manipulate over two dozen implements: clubs to beat with, pestles to grind with, fly whisks, grass stalks with which to fish for termites, spongy leaves to soak up water, rocks as nutcrackers. Like New Caledonian crows, different groups use them slightly differently. William McGrew of Cambridge University argues that the tool sets of chimpanzees in western Tanzania are just as complex as the simplest human tools, such as early human artefacts found in east Africa or indeed those used in historic times by native peoples in Tasmania.

The skill needed to make and use tools is taught. It is not the only example of teaching that animals have to offer. Meerkats feed on scorpions—an exceptionally dangerous prey which you cannot learn to hunt by trial and error. So older meerkats teach younger ones gradually. First they incapacitate a scorpion and let the young meerkat finish it off. Then they let their students tackle a slightly less damaged specimen, and so on in stages until the young apprentice is ready to hunt a healthy scorpion on its own.

Pretty much all meerkats do this. Elsewhere what is taught can change, with just some animals picking up new tricks. As the story of Billie the tailwalker implies, whales and dolphins can learn fundamentally new behaviours from each other. In 1980, a humpback whale started to catch fish off Cape Cod in a new way. It would slam its flukes down on the surface of the water—lobtailing, as it is known—then dive and swim round emitting a cloud of bubbles. The prey, confused by the noise and scared of the rising circle of bubbles, bunched themselves together for protection. The whale would then surge up through the middle of the bubble cloud with a mouth full of fish.

Bubble feeding is a well-known way for whales to freak out their food; so is lobtailing. Making the first a systematic set-up to the second, though, was apparently an innovation—and became very popular. By 1989, just nine years after the first Cape Cod whale started lobtail feeding, almost half the humpbacks in the area were at it. Most were younger whales which, since their mothers did not use the new trick, could not have inherited it. Researchers think young whales copied the first practitioner, spreading the technique through imitation. How the first one got the idea is a mystery—as is the question of whether it is actually a superior way of feeding, or merely an increasingly fashionable one.

Cultures rely not only on technologies, techniques and teaching but on rules of accepted behaviour. That things should be fair seems a widespread requirement among social animals. At a canine research centre at Eotvos Lorand University in Budapest, for example, dogs frequently chosen to take part in tests are shunned by other dogs. It turns out that all the dogs want to take part in these tests because they receive human attention; those which are chosen too often are seen as having got an unfair advantage. Capuchin monkeys taking part in experiments keep track of the rewards they are getting. If one is offered a poor reward (such as a slice of cucumber), while another gets a tasty grape, the first will refuse to continue the test. Chimpanzees do this, too.

Most cultures distinguish between outsiders and insiders, and animals are no exception. Orcas, also known as killer whales, are particularly striking in this regard, having a repertoire of calls which are distinctive to the pod in which they live, a sort of dialect. Dr Whitehead and Dr Rendell compare them to tribal markings. Orcas are unusual in that different pods tend to feed on different prey and rarely interbreed. Most of the time, pods studiously ignore one another. But occasionally one will ferociously attack another. This cannot have anything to do with competition for food or females. Lance Barrett-Lennard of the Vancouver Aquarium attributes it to xenophobia—a particularly extreme and aggressive way of distinguishing between insiders and outsiders.

But if animals display four of the five attributes that go to make up a culture, there is one they do not share. Perhaps the most distinctive thing about human cultures is that they change over time, building upon earlier achievements to produce everything from iPhones and modern medicine to democracy. Nothing like this has been observed in animals. Particular aspects of animal behaviour change in ways that might seem cultural, and disruptive change is certainly possible. In the 1990s, for example, South African culling policies that saw the oldest elephants shot and their children redistributed led to large changes in their normally orderly matriarchal societies. Young elephants became abnormally aggressive, since there were no longer any elders to rein them back. In other cases such disruption can seem, anthropomorphically, not so bad (see “The peaceful baboons”). But whether the shocks are good or bad, animal societies have yet to show steady, adaptive change—any cultural progress. Knowledge accumulates with the oldest individuals—when drought struck Tarangire national park in Tanzania in 1993 the elephant families that survived best were those led by matriarchs which remembered the severe drought of 1958—but it goes to the graveyard with them.

There is a great deal more to learn about animal minds. Grammatical language can pretty thoroughly be ruled out; learned toolmaking for some species is now indubitable; but many conclusions are in the middle, neither definitively in nor out. Whether you accept them depends partly on the standard of evidence required. If the question of animal empathy were being tested in a criminal court, demanding proof beyond reasonable doubt, you might hesitate to find that it exists. If the trial were a civil one, requiring a preponderance of evidence, you would probably conclude that animals had empathy.

Using that standard, one can hazard three conclusions. First, various animals do have minds. The physiological evidence of brain functions, their communications and the versatility of their responses to their environments all strongly support the idea. Primates, corvids and cetaceans also have attributes of culture, if not language or organised religion (though Jane Goodall, a noted zoologist, sees chimps as expressing a pantheistic pleasure in nature).

Next, animals’ abilities are patchy compared with those of humans. Dogs can learn words but do not recognise their reflections. Clark’s nutcracker, a member of the crow family, buries up to 100,000 seeds in a season and remembers where it put them months later—but does not make tools, as other corvids do. These specific, focused abilities fit with some modern thinking about human minds, which sees them less as engines of pure reason that can be applied in much the same way to all aspects of life, and more as bundles of subroutines for specific tasks. On this analysis a human mind might be a Swiss army knife, an animal mind a corkscrew or pair of tweezers.

This suggests a corollary—that there will be some dimensions in which animal minds exceed humans. Take the example of Ayumu, a young chimpanzee who lives at the Primate Research Institute of the University of Kyoto. Researchers have been teaching Ayumu a memory task in which a random pattern of numbers appears fleetingly on a touchscreen before being covered by electronic squares. Ayumu has to touch the on-screen squares in the same order as the numbers hidden beneath them. Humans get this test right most of the time if there are five numbers and 500 milliseconds or so in which to study them. With nine numbers, or less time, the human success rate declines sharply. Show Ayumu nine numbers flashed up for just 60 milliseconds and he will nonchalantly tap out the numbers in the right order with his knuckles.

There are humans with so-called eidetic, or flash, memories who can do something similar—for chimps, though, this seems to be the norm. Is it an attribute that chimps have evolved since their last common ancestor with humans for some reason—or one that humans have lost over the same period of time? More deeply, how might it change what it is for a chimp to have a mind? How different is having minds in a society where everyone remembers such things? Animals might well think in ways that humans cannot yet decipher because they are too different from the ways humans think—adapted to sensory and mental realms utterly unlike that of the human, perhaps realms that have not spurred a need for language. There is, for example, no doubt that octopuses are intelligent; they are ferociously good problem solvers. But can scientists begin to imagine how an octopus might think and feel?

All that said, the third general truth seems to be that there is a link between mind and society which animals display. The wild animals with the highest levels of cognition (primates, cetaceans, elephants, parrots) are, like people, long-lived species that live in complex societies, in which knowledge, social interaction and communication are at a premium. It seems reasonable to speculate that their minds—like human ones—may well have evolved in response to their social environment (see “The lonely orca”). And this may be what allows minds on the two sides of the inter-species gulf to bridge it.

Off Laguna, in southern Brazil, people and bottlenose dolphins have fished together for generations. The dolphins swim towards the beach, driving mullet towards the fishermen. The men wait for a signal from the dolphins—a distinctive dive—before throwing their nets. The dolphins are in charge, initiating the herding and giving the vital signal, though only some do this. The people must learn which dolphins will herd the fish and pay close attention to the signal, or the fishing will fail. Both groups of mammals must learn the necessary skills. Among the humans, these are passed down from father to son; among the dolphins, from mother to calf. In this example, how much do the species differ?

This article appeared in the Essay section of the print edition under the headline "Animals think, therefore…"


News, Announcements & Press Releases

There is growing evidence that animals may share humans’ ability to reflect upon, monitor and regulate their states of mind, according to a study published in Trends in Cognitive Sciences this month. Dr David Smith, a comparative psychologist at the University at Buffalo, draws this conclusion in a review of the new and rapidly developing field of animal metacognition. He was supported by the European Science Foundation EUROCORES programme ‘Consciousness in a natural and cultural context’ (CNCC).

Humans can feel uncertainty. You know when you do not know or cannot remember – a perfect example of this is the feeling of something being on the tip of your tongue. This capacity to be aware of our own thinking is known as metacognition. Establishing whether non-human animals share this sophisticated human capacity is important for understanding their consciousness and self-awareness. The study of metacognition is based on the idea that human minds in particular have a function that monitors or controls perception and memory.

“It is a crucial goal of comparative psychology to establish firmly whether animals share humans’ capacity to think about thinking,” says Dr David Smith. “Metacognition rivals language and tool use in its potential to establish important similarities or differences between human and animal minds.”

To find out whether non-human animals do have knowledge of their own cognitive states researchers have studied a dolphin, pigeons, rats, monkeys and apes using tests involving perception, memory and food-concealment. The results offer growing evidence that some animals do indeed have functional equivalents to humans’ consciousness and to humans’ cognitive self-awareness.

Among these species are dolphins and macaque monkeys (an Old World monkey species). Smith recounts the original animal-metacognition experiment with Natua the dolphin: “When uncertain, the dolphin clearly hesitated and wavered between his two possible responses. But when certain, he swam toward his chosen response so fast that his bow wave would soak the researchers’ electronic switches. Practicing safe science, the researchers were reduced to buying condoms to protect the apparatus from the exuberantly confident dolphin.”

In sharp contrast, other animals do not have the same capacity. Pigeons in several studies have so far not expressed any capacity for metacognition and several converging studies now show that capuchin monkeys (a New World monkey species) only express a limited capacity for metacognition. This raises important questions about the evolution of the reflective mind in primates and opens a new window on reflective mind in animals overall, illuminating its evolutionary emergence and allowing researchers to trace the precursors to human consciousness.

Comparative psychologists are cautious about labeling animals’ functional parallels with humans as a definite indicator of consciousness. Yet the fact that some animals flexibly use metacognition without training means it is likely to reflect their conscious awareness.

Smith is recognized for his research in the field of animal cognition. He and his colleagues pioneered the study of metacognition in nonhuman animals, and they have contributed some of the principal results in this area, including many results that involve the participation of Old World and New World monkeys who have been trained to use joysticks to participate in computer tasks. Smith is one of a growing number of American participants in EUROCORES, through support from the USA’s National Science Foundation. The metacognition project is led by professor Joëlle Proust from the Institut Jean-Nicod in Paris, France. Dr Smith collaborates with partners from Austria, France, Germany and the UK to develop comparative knowledge of metacognitive processes, by exploring how similar these capacities are in non-human animals, human children and human adults.

Dr Eva Hoogland, EUROCORES coordinator for the cognitive sciences at the European Science Foundation, comments: “The metacognition project is an exciting example of an international, interdisciplinary environment that is carefully prepared and managed, where partners from disciplines as diverse as developmental psychology, comparative biology and philosophy respect each other’s work. This study shows how this has resulted in opening up promising research avenues to answer some of the most important research questions that currently face us.”


Scientific evidence of reproduction urge? July 11, 2007 3:37 PM

Disclaimer: I'm not well educated in Biology. Or at least as much as I should be.

I've always believed there is, because it was always taught as a given, but I've never actually read any scientific studies to that effect. Is our opinion that all animals have an innate/instinctual urge to reproduce purely based on non-scientific evidence?

This question is sparked because my brother feels that animals actually only have an innate/instinctual urge to have sex and that reproduction is only a by-product, in that it's not intentional, either intellectually or innately/instinctually (and that with humans it's a bit different due to the intellectual aspect of it). I disagreed in that I always thought the urge was to reproduce and that sex was the by-product. I do agree though that I feel with humans it's a bit different due to the intellectual aspect.

I tried to google for some scientific evidence, but any mention of such urge is only in passing or editorial.

I'm fairly certain that I hold the majority viewpoint, so I don't just need people saying yes or no, this is right or that is right (although, viewpoints based on actual knowledge of the field are welcomed). What I'm wondering is if we can scientifically study that these urges exist, and if so, what have been the outcomes of such studies?

Sorry about how long-winded this is, but I just wanted to get this question out there.

My sociology professor spoke at length about how humans don't have any instincts, so here's that theory:

Instincts are things that animals do without being taught, and all of the animals of a species do it. The common example is a bird that builds a spherical nest: even if it's raised in captivity and never meets another bird, it still builds the same shape of nest. There's no behavior that all humans do without fail and without being taught - from reproduction, to sheltering themselves, even survival and walking upright (as evidenced by feral children) - all are learned behaviors.

I'll see if I can dig out the studies referenced; this is a pretty common sociological theory.
posted by lhall at 3:46 PM on July 11, 2007

my brother feels that animals actually only have an innate/instinctual urge to have sex and that reproduction is only a by-product

Your brother's theory doesn't take into account organisms that reproduce asexually. Before animals gained the ability to reproduce sexually, they still had the instinct to reproduce.
posted by rancidchickn at 3:50 PM on July 11, 2007

It's not an either/or question. There's an urge to mate, and later there's an urge to care for the young (unless we're talking about the many animals that don't invest any care at all towards their offspring).

Some animals have sexual behavior that has nothing to do with reproduction--homosexual pairings, or the sexual free-for-all of bonobo social relations, for example.
posted by hydrophonic at 3:57 PM on July 11, 2007

Response by poster: Thanks for the responses, keep them coming. These explanations definitely do touch upon some of what I already knew, but for some reason I was having trouble backing up my understanding.

I should read more about biological and evolutionary science. Side question: anyone have any books relating to this discussion that a layman could pick up and enjoy/learn from? Bonus points if it's not ultra-dense.
posted by defenestration at 3:58 PM on July 11, 2007

It is more of a semantic question than a factual one that can be proved/disproved by experimentation. The urge is to engage in behavior that maximizes the chance of reproduction of the animal's genes. Everything else is a side effect. I highly recommend The Selfish Gene for an engaging read that lays a good foundation for understanding this.

"no behavior that all humans do without fail and without being taught"

Pulling one's hand away from contact with flame would seem to be instinctual behaviour.
posted by Manjusri at 3:58 PM on July 11, 2007

Two things: 1) Not all reproduction is sexual. 2) Why stop at animals? Think about each of these, and whether it's having "sex" for pleasure. Chimpanzees. Rhinos. Mice. Chickens. Jellyfish. Elm trees. Yeast. Viruses. A few bits of RNA.

They all reproduce. Everything that fails to have a strong drive to reproduce doesn't make copies of itself, and when it dies, the ambivalence dies with it.

We all have a common ancestor: a teeny few bits of atoms that tended to make copies of themselves from the environment. The copies were not always perfect, and some self-copying accidents were better at making further copies than others. Most everything didn't work and died. But the stuff that lived and made copies lasts. Two thousand thousand thousand years later, it led to the hummingbirds outside, still making copies of themselves.

Everything that gets in the way of making copies is a hindrance to the survival of that hunk of information. The hunk of information in you, elm trees, rhinos, and viruses has a strong, strong history of giving you very good direction to reproduce.

What we call the act of sex is pleasurable, and we do it because not doing it, or not finding it enjoyable, makes us far less likely to pass on our genes. It's continually reinforced, and those who have a stronger drive will create offspring with that stronger drive.
posted by cmiller at 4:02 PM on July 11, 2007

Response by poster: I was definitely already aware of non-sexual reproduction. I just forgot to include it in the question.

I'm definitely buying "The Selfish Gene" the next time I'm at a book store. Any more insights and recommendations are welcome!
posted by defenestration at 4:09 PM on July 11, 2007

All great responses, and much to my satisfaction, it's not a cut-and-dried issue--especially good point about the distinction between reproductive behavior and sexual behavior (specifically among Bonobos). I tend to agree, as did my brother, that it's a semantic question, but my main position was a naturalist one which precluded the notion of reproductive "urges", as it seems anthropomorphic. I defended this position by stating that sex was the behavior which led to reproduction, and that this behavior would be selected for due to its positive gain. However, the intent to reproduce seemed, IMHO, too clairvoyant to be realistic. How does the Darwinian explanation tackle this?

It made sense to me that reproductive behaviors, like sex, were instinctual, but my position was firm that the behaviors were instinctual, not the "intended" consequences.

Caring for young, etc., although not universal, is a good argument towards knowledge of what was to come (e.g. building a nest, etc.), and it certainly makes the subject murky. If these animals are choosing to reproduce, then it's not strictly an instinct as we have been discussing.

More naturally, the asexual reproduction is instinctual, and a great point to make. My only concern is that there is no intent behind these behaviors, and it's just a process these organisms and even insects undergo.
posted by quanta and qualia at 4:12 PM on July 11, 2007

Biological Exuberance: Animal Homosexuality and Natural Diversity by Bruce Bagemihl is fascinating and quite readable by the non-expert.

"Courtship, sex, affection, gathering food, finding a home--they have all been observed among a range of partners, from heterosexual to homosexual to somewhere in between," Bagemihl says. "And there are some animals who don't have sex at all." Although he doesn't claim to know the motivations of animals, Bagemihl says he does know procreation is not always the driving force: "Same-sex couplings occur in the presence of the opposite sex, in and out of captivity, and in and out of mating season."
posted by Carol Anne at 4:21 PM on July 11, 2007

Pulling one's hand away from contact with flame would seem to be instinctual behaviour.

The sudden jerk your hand does when you get it into a flame without realizing it is a spinal reflex, not a "behavior" as that word is usually defined. That reflex can be overcome with attentive effort; some people can hold their hands in flame until all the skin crisps away. (In fact, this ability was used in Frank Herbert's Dune as a test to segregate humans from animals, as it is a fundamentally human ability.)

Understanding the inner life of animals - how they make their decisions, what they are thinking or feeling from time to time - is not a scientific pursuit; it is a philosophical one. The same is true of humans, in fact, and it will continue to be true unless reliable mind-reading technology is developed.
posted by ikkyu2 at 4:26 PM on July 11, 2007

+1 manjusri. If you have urges to engage in behaviors that maximize your reproductive chances, is that different from having an urge to reproduce? Especially if you're an animal with no theory of mind? Seems like a semantic distinction.

Taking care of the young is just one of the ways that sexually reproducing animals maximize their chances. There are all kinds of elaborate, subtle, or surprising ways that animals do this.

One recent news item discussed females in pair-bonding species that will cuckold their males, to get it on with a more genetically desirable male.

In baboon societies, you get dominant males with harems of females as one type of social group, and "bachelor tribes" as the other. When a bachelor tribe happens upon a harem, first the bachelors kill the dominant male. Then the bachelors all duke it out with each other, until only one is left. Then they kill any young offspring, which sends the females into estrus.

Dragonfly males have "spades" on their penises that they use to dig out any sperm left behind by other males when they impregnate females.

And so on.
posted by adamrice at 4:27 PM on July 11, 2007

Do some reading on the phenomena of estrus. Estrus is when a female of a species comes into season and becomes interested in mating. Sexual activity outside of estrus is not going to lead to reproduction.

In many species, sexual activity rarely happens outside of estrus. Males tend to live with males (or alone) and females tend to live with females and offspring. There may be one or two males as part of a herd, but they are generally there for the purposes of organization, defense, and protecting the right to mate when females are in season.

The females really don't want to have anything to do with the males in many species when they are not in estrus. That's actually one of the reasons that Pandas are so endangered. They are too cross to fuck. When it's not breeding season, they can't stand to be around each other and wander far afield. The period of estrus is pretty narrow in that species, so they have a hard time finding mates when the season is actually right.

Humans (and a handful of other creatures) are novel in respect to estrus. We have what is known as hidden ovulation. It can be difficult to tell when a woman is ovulating and therefore ready to breed. This has a number of ramifications for our social structure and male/female interactions and those consequences are a matter of some debate.

So I would say that all animals have a reproduction instinct. Some animals also seem to have a sex drive that is not intrinsically related to breeding. I think that's a really neat thing about humans, but should not be abstracted to other animals.
posted by afflatus at 4:46 PM on July 11, 2007

The Selfish Gene, by Richard Dawkins, would give you a fairly good scientific background on why your argument is incorrect.

Functionally: The instinct/urge is for sex and other behaviours like care of offspring etc., with reproduction as a by-product. Animals (including humans, regardless of what many sociologists think) have instincts which cause behaviours which cause reproduction, so the instincts cause reproduction indirectly. It doesn't make sense that a mouse has sex because it desires the presence of baby mice a few weeks down the line; for that to work the mouse would have to actually understand the link between sex and birth, and I don't think mice know that, or reason that far ahead. Animals are born with instincts which cause wants and behaviors, not knowledge.

Teleologically: Those instincts are only there because they cause successful reproduction. The desire for sex (along with caring for offspring etc.) is an outcome of natural selection (for successful reproduction), so reproduction is the ultimate "reason" for the urge for sex, and indeed every other instinct.

In the life of one animal, sex causes reproduction. Over evolutionary timescales, reproduction causes sex.
posted by Canard de Vasco at 4:49 PM on July 11, 2007

There's no behavior that all humans do without fail and without being taught - from reproduction, to sheltering themselves, even survival and walking upright (as evidenced by feral children) - all are learned behaviors.

This theory is massively amplifying the significance of human exceptions, I think.

For example, lots of people argue that animals are less sensitive to low level environmental radiation than humans. However, the effect of such radiation on humans is actually very subtle, a slight increase in the leukemia rate here, a slight increase in breast cancer there.. It is easy to detect statistically unlikely things in human populations, but very hard to detect the same in animals.
posted by Chuckles at 5:02 PM on July 11, 2007

The central tenet of evolution:

That which reproduces most effectively in the previous generation is more prevalent in future generations.

Do you think that organisms that lack a desire to reproduce will last for very many generations in the presence of competing organisms having such an instinct?

Non-human animals (and seemingly, many humans as well) don't understand that sex -----> reproduction, hence the desire to have sex is what drives reproduction in animal populations.

Exercise: Consider the question for plants as well.
posted by mharper3 at 5:30 PM on July 11, 2007
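
The differential-reproduction argument running through the last few comments can be made concrete with a small simulation. The Python sketch below is purely illustrative: the population size, the heritable "reproductive drive" value, the mutation size, and the resource cap are made-up numbers, and the model assumes simple asexual replicators rather than any real species.

    import random

    # Toy model: each individual carries a heritable "reproductive drive"
    # between 0.0 and 1.0. Everyone leaves one offspring, plus a second
    # with probability equal to its drive, so higher drive means more
    # copies. A fixed resource cap keeps the population bounded.

    def next_generation(population, capacity=1000):
        offspring = []
        for drive in population:
            n = 1 + (random.random() < drive)            # 1 or 2 offspring
            for _ in range(n):
                child = drive + random.gauss(0, 0.02)    # small mutation
                offspring.append(min(1.0, max(0.0, child)))
        random.shuffle(offspring)
        return offspring[:capacity]                      # crude resource limit

    population = [random.random() for _ in range(1000)]  # drives vary at start
    for _ in range(100):
        population = next_generation(population)

    print("mean drive after 100 generations:",
          round(sum(population) / len(population), 2))

Run repeatedly, the average drive climbs toward its maximum within a few dozen generations, which is the sense in which "that which reproduces most effectively in the previous generation is more prevalent in future generations."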

I'd like to offer the ostrich as a case study.

In the wild, a male ostrich mates with as many females as he can. He also builds a nest. When his mates are ready to lay eggs, they show up and use his nest, leaving fertilized eggs behind which are (usually) his offspring. Once they've laid their eggs, he chases them away. (Until they show up ready to lay again.)

Eventually he ends up with quite a nest full of eggs, (nearly) all of which are his children. He then sits on them until they hatch, and cares for the chicks until they're old enough to have a decent chance of surviving on their own. And if anything or anyone tries to get near his chicks while he's caring for them -- including the females who contributed eggs to his clutch -- he will chase them away violently.

Which can be pretty violent. A full-grown ostrich can easily kill a human with a disemboweling kick.

It's obvious that the male ostrich is doing a lot more than just looking for opportunities to have sex with female ostriches.

Another example: the Nile crocodile. A female Nile crocodile climbs up on a river bank, digs a hole, lays her eggs, and buries them. Then she guards the nest until the eggs hatch. Afterwards, the young crocodiles stay near her for protection until they're pretty good sized.

When danger approaches, they hide in her mouth. (Early observation of this completely misunderstood it and assumed that the female crocodile was engaging in cannibalism because she was too stupid to know better. Later research has corrected that misapprehension.)

She doesn't eat while any of this is going on, a period of several months.
posted by Steven C. Den Beste at 5:42 PM on July 11, 2007

The central tenet of evolution: That which reproduces most effectively in the previous generation is more prevalent in future generations.

If there is a central tenet of evolution, that's not it. That might be a good tenet of natural selection though.

How can you study your question scientifically? I think the question is ill-formed. You would need to define how you would know that something is reacting to an urge to reproduce rather than an urge to have sex. And if you want to have a meaningful discussion with your brother on the subject, you'll have to agree on the definitions.

Until you do that, you can't really get a good answer here.
posted by grouse at 6:00 PM on July 11, 2007

"The sudden jerk your hand does when you get it into a flame without realizing it is a spinal reflex, not a "behavior" as that word is usually defined."

Webster's defines behavior as "b: anything that an organism does involving action and response to stimulation." Are you using more of a technical/jargon definition that excludes reflexes?

I'm also a bit puzzled by the original assertion that instincts cannot be overridden. Webster's defines instinct as: "a largely inheritable and unalterable tendency", which is a bit vague but implies that there are exceptions.
posted by Manjusri at 6:30 PM on July 11, 2007

My understanding is that humans are born with far less pre-programmed (instinctual) behavior than most other species, but that they certainly do have instincts. Anyone who has ever been around an infant knows that they do not need to be taught to cry when they're hungry or to suckle. There is at least enough instinctual behavior in humans to help them survive infancy.

The instinct to care for infants is probably unrelated to sex, as I see it. Of course, it's likely that it has also been selected for, because the descendants of animals that care for their young are more likely to survive to adulthood than the descendants of animals that don't. But it does seem logical that there was no real way for early humans, or any animal for that matter, to truly understand the connection between sex (especially if they have a lot of it with a lot of partners) and birth (some lengthy period of time afterward).

This book review relates some interesting points on caring for infants, including the idea that at some times, infanticide is the wiser evolutionary choice.
posted by Miko at 7:17 PM on July 11, 2007

My sociology professor spoke at length about how humans don't have any instincts. There's no behavior that all humans do without fail and without being taught

Your professor is utterly, totally full of shit.

The list of biologically driven behaviors on display by humans is nearly endless. Starting with infants, you have APGAR grimace tests, suckling reflexes, Moro reflexes, the Palmar grasp, Plantar grasp, rooting reflexes, stepping reflexes, tonic neck reflexes, swimming reflexes, etc., etc. These are all things humans do without being taught.
posted by Cool Papa Bell at 8:42 PM on July 11, 2007

Anyone who has ever been around an infant knows that they do not need to be taught to cry when they're hungry or to suckle.

As a baby born three months premature, I did not know how to suckle. While 'taught' is probably not entirely accurate, given the likely impossibility of teaching anything at that age, it seems at least somewhat true to say that I needed to be taught to suckle.

That said, and so I actually contribute to the thread in this post, it seems to me that 'reproduction' is a bit high level, and what we can observe is an instinct towards activities leading to and associated with reproduction.

As for the issue of asexual reproduction, is there actually anything that reproduces asexually which is complex enough to possess the abstraction of 'instinct'? I can't think of anything, so it seems like a bit of a false issue on this topic. Likewise, plants don't have an instinct towards reproduction because I don't see where they can be meaningfully said to have an instinct towards anything.

Maybe we need to come to an agreed upon definition of what an instinct is here, because it doesn't seem immediately obvious that everyone here is in agreement on this.
posted by Arturus at 8:57 PM on July 11, 2007

"My sociology professor spoke at length about how humans don't have any instincts"

This is apparently a common assertion in introductory sociology texts, but it sounds highly dubious, and perhaps based on differences in use (or misuse) of terminology as Arturus noted.
posted by Manjusri at 10:39 PM on July 11, 2007

The "nature versus nurture" argument is an old one in social science circles, but for many involved it's become politically incorrect to contend that anything is "nature". To even suggest such a thing will land you in hot water. You wouldn't believe the names you'll get called. Just ask Lawrence Summers.

That sociology professor who claimed that humans don't have any instincts was toeing the party line. It would be interesting to know what he'd say about that subject in private, to people he trusts.

It's sad. That's not how science is supposed to be done. This kind of abuse of the process is one of the reasons why the social sciences have fallen into such disrepute in the last twenty or thirty years.
posted by Steven C. Den Beste at 11:34 PM on July 11, 2007

While 'taught' is probably not entirely accurate given the likely impossibility of teaching anything at that age, it seems at least somewhat true to say that I needed to be taught to suckle.

No, it's not really true at all. It's just not a valid description of what occurred in your example. As a preemie, you lacked the neurological development at that stage for the reflex to be present in a significant fashion. Given a few more weeks in utero, you'd have been fully cooked, so to speak. Instead, the rehabilitative process included treatments where you were prompted to suckle, and given the opportunity to suckle something that provided additional neurological stimulus (i.e. feeling something in your mouth, feeling/tasting nutritional stimulus, etc.), in order to hopefully speed along the final neurological development that results in the healthy reflex.

Calling this rehab "teaching and learning" is a misuse of terms that leads to greater misunderstanding. If you lift weights, you are not "teaching" your muscles to grow bigger. If you are frightened by a scary movie, your heart does not "learn" to beat faster.

When sociologists get off the ranch and start ignoring the biology involved, and spread their wrongheaded ideas around, that's when you get an AskMe question with a lot of wrong answers. ;-)
posted by Cool Papa Bell at 11:41 PM on July 11, 2007

Understanding the inner life of animals - how they make their decisions, what they are thinking or feeling from time to time - is not a scientific pursuit it is a philosophical one. The same is true of humans, in fact, and it will continue to be true unless reliable mind-reading technology is developed.

Such a technology may already exist, at least to a limited extent.
posted by flabdablet at 6:07 AM on July 12, 2007

You wouldn't believe the names you'll get called. Just ask Lawrence Summers.

Sorry for the derail, but Lawrence Summers wasn't in the position of making a scientific point when he proposed that males and females might have different abilities in math and science. He was the president of Harvard, which is an administrative position, and I assume he had a large influence on who was hired or fired at the University. For a person in such a position to make the proposition he made is completely irresponsible, and female scientists had every right to doubt that he would treat them fairly.
posted by afu at 6:15 AM on July 12, 2007 [1 favorite]

I agree that this discussion and what happened to Lawrence Summers are two vastly different things, not least because there is a difference between saying that some few behaviors are instinctual at birth and saying that abilities such as skill at math are instinctual at birth. Most scientists are agreed, as noted above, that some behaviors appear spontaneously in humans, but very few in comparison with other species. Humans have much more learned behavior than instinctual behavior. Suggesting that skill at math is biologically pre-determined by gender shows an understanding of nature and nurture that is embarrassingly reductive and unscientific; Summers' misunderstandings set him up to be a very poor administrator. But it isn't an entirely black-and-white phenomenon (it's all instinctual! It's all learned!). Most human behavior is learned, but nature and nurture interact in complex ways which are not yet well understood.

Cool Papa Bell is right about the preemie argument, as well. In a state of nature, a baby born without the ability to suckle would not be likely to survive. Premature infants are able to survive to the degree that they do in today's world only because centuries of observation, reliable data, experiment, and standard medical practices have created protocols for treating babies like this that enable them to overcome the lack of development at premature birth, which once was almost always a death sentence. Babies who had completely developed in the womb and had the suckling instinct firmly in place would be more likely to survive and thus reproduce. Most babies don't have to be taught - the ones who are able to learn despite developmental delays or early birth are benefiting from the help of modern medical science, which can in some cases counteract a life-threatening biological disadvantage.
posted by Miko at 6:37 AM on July 12, 2007

"My sociology professor spoke at length about how humans don't have any instincts"

This is apparently a common assertion in introductory sociology texts, but it sounds highly dubious, and perhaps based on differences in use (or misuse) of terminology as Arturus noted.

As ikkyu2 pointed out, instinct is generally distinguished from a reflex. An example of an instinct would be the elaborate mating rituals of many birds. If you raise a male bird from the time it is hatched, never allow it to see another bird of its species, and then expose it to a female at the appropriate age, it will perform a mating dance that is indistinguishable from that of birds that have grown up around their own kind.

The complexity and variety of instincts lessens with the animal's ability to learn, so mammals in general have fewer instincts than birds or reptiles. A monkey raised in captivity will not know the mating signals of wild monkeys, and indeed may never develop any effective mating behaviour at all--my primates prof showed us a picture once of a juvenile male monkey raised in captivity masturbating next to an adult female who was presenting (the monkey signal for 'take me now'). This lack of knowledge about how to mate has been widely observed in monkeys raised in isolation or near isolation. Similarly, apes have little success raising their own first babies in zoos, such that the babies are often either removed for the first few weeks and then returned, or removed and given to an ape who has had experience raising babies (who may then be moved to the same enclosure as the bio-mother so that she can learn parenting skills).

Apes also have enough intelligence to be aware of those instincts that they do have. A chimp in captivity was once observed to come across some food that was not enough to share. It began to call out--an instinct for chimps--but clamped its hand over its mouth so as to muffle the sound and eat the food itself. It seems reasonable to suggest that as increasing intelligence allows for overriding and changing instinctual behaviour, instincts will become less selected for.

By the time you get to the intelligence of humans, there do not seem to be any encoded behaviours. Saying that humans do not have instincts IS NOT THE SAME as saying that their biology does not affect their behaviour. If humans had instincts in the way the word is normally applied to animal behaviour, we'd find universals that are far more specific than the list Pinker uses. We'd find really specific things that didn't vary a lot between societies or languages. So, if absolutely everyone in the world, regardless of the language they learned, called out the equivalent of "Hey, look at this!" every time they found a bush with berries on it, that would be an instinct. If all dates followed the same format, that would be an instinct. On the other hand, finding that every society develops music certainly implies something about a biological influence, but a proclivity for music at the level of society is not the same as an instinct. It would be instinct if every person was born knowing the same song.

So, SCDB, it is not actually toeing any "party line" to say that humans don't have instincts. It is using the term in the biological rather than the colloquial sense. Larry Summers didn't say that humans have instincts. He said there were innate differences between the genders. Whether that is right or wrong is irrelevant to the question of whether humans have instincts.

Whether desires are instinctual is much harder to determine. You have to use indirect measures, which is tricky. I would separate out a few factors to look at a desire to reproduce: there is the desire to have sex, the desire to have one's own biological children, and the desire to raise children.

In humans, the desire to have sex acts independently of the desire to have or raise children, as evidenced by the number of people who do not want to have or raise children at the time that they are having sex. I don't think we need to look for an instinct to have sex, since there is abundant evidence that animals will engage in pleasurable pursuits when the opportunity arises, and sex for humans is (once you learn how to do it right) generally a pleasurable experience.

The desire to have one's own biological children is a bit more difficult to sort out. On the side of innate desire, there are the people who go to extraordinary lengths to have a child that is biologically linked to them, rather than adopting, or adopting as a final measure only. On the side of not innate, there are/have been many societies where certain social conventions and/or rituals are considered the primary source of relationship, and biology is secondary or unimportant. Further, in the societies where people go to great lengths to have biological children, biological relationships are often considered more 'real' than other relationships, thus complicating whether the desire to have one's own children derives from something innate or a cultural value.

People from every living society seem to have the desire to raise children. Again, it is difficult to determine the role of cultural values in this. In industrial societies we have seen the number of by-choice childless couples increase, but that again could come from cultural values rather than a (lack of) innate desire. Anthropologists have done studies that show people reshaping their desires according to certain principles, which suggests that many desires which might be termed 'innate' in humans are also fairly plastic.

My own feelings, which are based on education in human and primate evolution and human social diversity, but which are not based on some specific study, are that the desire to *raise* children has biological influences, but not in as strict a way as an instinct. The desire to have sex is certainly linked to biology, through the physicality of pleasure. However, I suspect that the 'desire to reproduce' in the specific sense that 'reproduce' implies is a cultural desire formed at the intersection of valuing biological relationships and learning about evolution.
posted by carmen at 7:25 AM on July 12, 2007 [1 favorite]

As ikkyu2 pointed out, instinct is generally distinguished from a reflex.

An example of instinct would be the elaborate mating rituals of many birds.

So, instinct implies a fairly complex action, over time.

The complexity and variety of instincts lessens with the animal's ability to learn,

That is a really interesting observation, but.. Well, we need a concise definition of instinct then - draw a line.

it began to call out--an instinct for chimps--but clamped its hand over its mouth so as to muffle the sound and eat the food itself.

Well, comparing the bird to ikkyu2's reflex example.. I have to think the chimp's action sounds a lot more like reflex than complex behaviour to me..

I certainly can appreciate that the bird's nest, or spider's web, are entirely more complex than any "instinctual" behaviours humans exhibit. But, reflex doesn't seem to cover enough territory to capture unconscious human flirting displays. Women wear less clothing closer to ovulation, men tend to be more competitive as soon as a woman is around to impress, and so on..

Could it be that the definition of instinct is teleological? Humans do not have instincts, all other animals have instincts, and to maintain this truth as we learn more, the definition is changed. Is there another word for actions which are between instinct and reflex (maybe I just missed it up thread)?
posted by Chuckles at 9:53 AM on July 12, 2007

Women wear less clothing closer to ovulation, men tend to be more competitive as soon as a woman is around to impress, and so on..

These examples do not hold up cross-culturally. While there may be greater or lesser degrees of interaction between biology, desire, and behaviour going on in these types of examples, it is not particularly meaningful to describe them as either "reflex" or "instinct".

Is there another word for actions which are between instinct and reflex (maybe I just missed it up thread)?

"As ikkuy2 pointed out, instinct is generally distinguished from a reflex."

Ikkyu2 was asserting a division between behavior and reflex in general usage. However, it appears that reflex is a subset of behavior, and of instinct in colloquial usage.

The technical/jargon usage of instinct seems somewhat muddled, with attempts to shape the definition to reach the conclusion that humans are free from instincts. This article relates some of the history of its usage, and contains the spot-on quote: "In both popular and scientific literature the term instinct has been given such a variety of meanings that it is not possible to frame for it an adequate definition which would meet with general acceptance."

Perhaps the term "innate behavior" is at least partially intended to clarify this. If so, then it is a questionable choice of words, as the issue still appears muddled. A quick google didn't turn up much information, but of the three links I found: one asserts that reflexes are a subset of innate behavior, Wikipedia is ambiguous and a thin entry besides, and this definition appears to exclude reflexes.
posted by Manjusri at 12:01 PM on July 12, 2007 [1 favorite]

I may have been wrong in my assertion, actually. I am so steeped in neurology that it's not always clear to me what "general usage" is.

When I think of "behavior" I think of the things that simply don't happen anymore after bilateral frontal lobectomy, as distinct from the lower reflexes (withdraw from pain, patellar tap) which are preserved.
posted by ikkyu2 at 2:47 PM on July 12, 2007

My sociology professor spoke at length about how humans don't have any instincts, so here's that theory:

That's absurd. For one thing, even a single example disproves it. Sneezing, hiccups, sleep, and blinking as things come close are all instinctual, for example. Man, why would a sociology professor know about that anyway?
posted by delmoi at 7:18 PM on July 21, 2007

Could it be that the definition of instinct is teleological? Humans do not have instincts, all other animals have instincts, and to maintain this truth as we learn more, the definition is changed. Is there another word for actions which are between instinct and reflex (maybe I just missed it up thread)?

That seems to be the problem. After all, "instinct", "reflex", and "behavior" are all general terms thought up by people, not hard and fast rules like you might find in physics. No one would ever confuse gravity with electromotive force, for example.

So the question of where to draw the line can be more complicated. If it's an "instinct" for a chimp to call out when it sees food, how is that different than a person calling out during orgasm, or in pain?
posted by delmoi at 7:33 PM on July 21, 2007


2. Philosophical Assumptions in the Study of Animal Cognition

One of the ways in which philosophy connects to the study of animal cognition has to do with the fact that, as with any science, there are numerous philosophical assumptions embedded in the practice of comparative cognition research. Scientists, like all humans, make choices about how to use terms; they are swayed to formulate questions and seek data given the values they have; and they have assumptions about the topics under study, as well as views about what makes for a scientifically respectable interpretation of data and the virtues of a good scientific explanation. Scientists have the goal of seeking objectivity, but seeking objectivity can itself introduce new biases, as we know from work in the philosophy of science on objectivity and feminist philosophy of science. This is as true in animal cognition research as in any other science (Andrews 2020c).

Philosophical assumptions in animal cognition research are present, most obviously, in how the capacities under study are defined. For instance, when scientists embark on an experimental project to determine whether a certain species has capacities such as altruism, cooperation, or empathy, they assume a certain definition of these terms. These definitions are often operational, meaning that they focus on the behavioral or physiological reactions that one would have to see in an animal to confirm that they possess this capacity. But even operational definitions come with philosophical baggage. For instance, some scientists consider that finding emotional contagion in an animal is enough to claim that that species is capable of empathy. This presupposes that a particular philosophical account of empathy is true, and not others. Some phenomenological accounts of empathy, for instance, do not require emotional state-matching (Zahavi & Overgaard 2012), and so, under these accounts, finding emotional contagion in an animal would not suffice to claim that they are capable of empathy.

Simply stating how one uses a particular term might seem to be good science, for this is thought to offer clarity regarding what exactly is being studied. Terms are often defined without any argumentation, or simply by reproducing the definition given by another scientist or one that is generally accepted in a certain scientific tradition. However, different definitions come with different assumptions and implications. They can establish which animals we study in search for a certain capacity, the methods we use to study it, and what counts as evidence for that capacity. Moreover, determining that an animal has this or that capacity can have profound ethical implications, so how we choose to use our terms is far from normatively neutral.

This discussion of how terms are used in science is a starting point for understanding the theory-ladenness and value-ladenness of science. Considerations from the philosophy of science can help to bring to light how the choices scientists make impact the research. We will next consider how value-ladenness, theory-ladenness, concerns about objectivity, and appeals to the virtue of simplicity impact comparative cognition research.

2.1 Theory-ladenness

Observations and definitions of terms can be theory-laden, in that they depend on a set of theoretical assumptions for their interpretation. Theory plays a role in background assumptions in animal cognition about the continuity or discontinuity of the mental across species, or regarding the relationship between certain behaviors and certain capacities, or how best to get good results from research subjects.

For example, Darwin&rsquos perception of psychological continuity between species is an example of a theory-laden commitment that shapes how the data is perceived. Among today&rsquos scientists, there is a diversity of views about continuity and discontinuity. Some scientists may be described as &ldquoromantic&rdquo because they take animals and humans to share many psychological properties the way Darwin described it. Other scientists may be described as &ldquokilljoy&rdquo because they deny many human capacities to animals. This terminology, first used by Daniel Dennett (1983), and criticized by Sara Shettleworth (2010b), highlights an example of a theory-laden approach to science. Attempts to avoid working from one of these two perspectives are attempts to find objectivity, which as we will see admits of its own challenges.

Animal cognition research can also be theory-laden within particular research paradigms. We can look at two current examples. The decades-long use of the false belief task as a measure for theory of mind in children and animals can be traced back to three commentaries on a scientific study of theory of mind (Premack & Woodruff 1978) by philosophers Dennett (1978), Jonathan Bennett (1978), and Gilbert Harman (1978), who first suggested this methodology. The suggestion reflected current philosophical theorizing associated with functionalism in the philosophy of mind, and a theory of action according to which behaviors are caused by propositional attitudes--beliefs and desires.

Another example is seen in research on social learning and imitation in nonhuman great apes that is grounded in the theory that the significant difference between humans and other animals rests on the cumulative nature of human culture (e.g., Dennett 2017; Henrich 2016; Heyes 2018; Sterelny 2012; Tomasello 2016). According to this theory, humans learn cultural behavior by a process of high-fidelity imitation in which even causally irrelevant aspects of the behavior are copied--overimitation--and subsequent generations innovate improvements that lead to an ever-growing technological society. Overimitation is understood here to be a capacity towards which humans are strongly drawn and that is thought to be driven by social factors, such as a desire to "fit in". Great apes have been tested for overimitation several times and scientists have failed to find robust evidence that they overimitate (though, complicating this story, there is evidence that chimpanzees sometimes overimitate those with whom they have a long-term relationship (Myowa-Yamakoshi & Matsuzawa 2000) and that domestic dogs sometimes overimitate their human companion (Huber et al. 2018; Huber et al. 2020)). In contrast, young children will readily overimitate the actions of the experimenters (Clay & Tennie 2018). Scientists have described this pattern of results as a "failure" on behalf of the apes and a "success" on behalf of the children, and these results are taken to bolster the claim that apes lack cumulative culture, given theories of cultural learning.

2.2 Value-ladenness

Philosophers of science have highlighted that scientific studies are also value-laden, insofar as scientists' values shape how they do their science and the data that it ultimately delivers. This is also evident in the study of animal cognition. Values shape the choice of capacities to study. For example, during the twentieth century, a great deal of effort went into attempting to teach various forms of linguistic communication to great apes, an endeavor that stems from the value that humans place on language. Values also shape the methods that scientists use. Experimental evidence coming from labs is often considered more valuable than observational evidence coming from field studies, because the increase in ability to control the different variables that can affect an animal's performance is thought to compensate for the loss of ecological validity that comes with the highly artificial experimental settings. Values also shape how scientists interpret the results of their studies. The data that an empirical study delivers is useless without an interpretation, and this interpretation cannot be fully disentangled from the values upheld by scientists. This is because scientists have to choose which statistical methods to apply, which theories to accept, and what narrative they follow when writing up the results in a scientific paper.

The research on animal culture is another example of value-ladenness, given that the driving force behind it is a fascination with human culture, and an attempt to map out the cognitive differences between our own and the other ape species that can explain the origins of our uniqueness. This is why the negative results in experiments that have tested for great ape overimitation are described as a "failure" on their behalf. However, by its very definition overimitation is the imitation of actions that are irrelevant to producing the desired goal, so one could easily turn these results on their head and describe this as evidence that apes will go for the more efficient solution to the problem, which children fail to do. The fact that these results are not described this way evidences a value-laden perspective, from which overimitation is seen as a "desirable" capacity, insofar as it is linked to other characteristically human traits, such as rituals, normativity, and cumulative technologies.

Another good example of how the design of studies in comparative cognition is value-laden is the mirror self-recognition (MSR) test, which was originally envisioned by Gordon Gallup (1970) to probe animals' self-awareness. In this test, an animal is first allowed to become familiarized with a mirror. In a second step, the animal is anesthetized and an odorless mark is painted on their forehead. The behavior of the animal in front of the mirror is then observed, to see whether they interact with the mark, which would be a sign that they understand that the individual in the mirror is themself. Passing the MSR test is viewed as a sign of self-awareness, though whether it shows awareness of one's own mind or just one's body is disputed (Heyes 1994, 2008; Povinelli 1998). Leaving aside whether the MSR test actually addresses this capacity, which is an issue of theory-ladenness, we would like to draw attention to the methodology itself. As has long been argued, animals can fail this test for reasons that have nothing to do with their lack of self-awareness. For instance, gorillas' failure to pass the test has been linked to the fact that this species tends to avoid eye contact with conspecifics, because it is considered a threat (Shillito, Gallup, & Beck 1999). Other species might fail the test because vision is not their primary sensory modality. Dogs, for instance, fail the MSR test but pass an analogous olfactory test (Horowitz 2017). In this alternative, dogs are presented with urine samples from themselves and other dogs. Results show that they spend less time sniffing their own urine marks, which evidences some form of olfactory self-recognition. For humans, however, vision is the primary and most cherished sensory modality and this often leads to study designs that, like the MSR test, are vision-centered. Joint attention, for instance, is measured by means of gaze-following (Carpenter & Call 2013), which Maria Botero (2016) has criticized, arguing that for nonhuman primates touch might also be a medium for joint attention to emerge.

Despite these problems, the MSR test is still treated as the gold standard in the study of self-awareness, which illustrates the value that humans place on visual self-recognition, and our failure to appreciate that other animals may recognize themselves through other sensory modalities. Additionally, the MSR test presupposes that the animals tested care in some sense about their appearance. This is a necessary requirement for them to be motivated to interact with the mark. The fact that this is taken for granted in discussions on the MSR test illustrates the importance that we grant to mirrors delivering our expected self-reflection. However, it is not immediately obvious that all animals should care about this. Instead, whether or not they do will likely be shaped by their particular ecology and evolutionary history. For example, a tiny fish, the cleaner wrasse, has been recently shown to pass the MSR test, attempting to scrape off the mark in the presence of a mirror and ignoring it if it was colorless or there was no mirror available (Kohda et al. 2019). This initially surprising result becomes less so when one considers that this species of fish feeds off the parasites that they visually detect on the skin of other fish, and the mark that was planted on their head was intentionally made to mimic the color, size, and shape of one of these parasites. In the cleaner wrasse's Umwelt, this mark is clearly something worth attending to, but it might not be the case for all species. Other animals, for instance, might care much more if their coat smells in an unexpected way than if there is some debris on their head.

Further exemplifying value-ladenness, it is not uncommon for experimental results in comparative psychology to be communicated in a way that depicts the human species as "superior," which can be illustrated by some experiments that have compared chimpanzee performance to our own. Daniel Povinelli and colleagues (1999), for example, found that chimpanzees outperform children in a gaze-following task, but interpreted the children's poorer performance as evidence of their possession of superior cognitive capacities (see also Leavens, Bard, & Hopkins 2017). In this experiment, the subjects had to follow the experimenter's gaze to find a reward, which both children and chimpanzees could do. In one condition, however, the experimenter was oriented towards the reward but gazing above it. While the chimpanzees could still use this gaze to find the reward, the children chose randomly in these trials. Povinelli et al. interpreted this as evidence that children, but not chimpanzees, have a theory of mind, which allowed the children to interpret the above-target gaze as a distracted one that was not worth following. From the chimpanzees' perspective, however, one could also argue for their superiority in this task, since they were much more successful than the children in actually finding the reward. In another experiment (Jensen, Call, & Tomasello 2007), chimpanzees were tested in an ultimatum game, where one chimpanzee (the proposer) could choose how a reward was distributed among themself and another chimpanzee (the responder). The responder could either choose to accept the offer (in which case, they both kept it) or reject it (which would have meant they both lost it). The experimenters found that the responder would always accept the offer, regardless of how unfair the distribution was, so long as their part was higher than zero. Chimpanzees behaved in the way that economic theory predicts of the fully rational agent, in contrast to humans, who would typically rather be left with zero than accept an unfair distribution. Although the researchers acknowledge this, they also make a point of highlighting how these results illustrate that chimpanzees lack the sensitivity to fairness that characterizes (superior) human societies, though they fail to consider that chimpanzees may have fairness norms in domains outside of food sharing. A further example of results being interpreted to fit the narrative of human superiority comes from a study by Sana Inoue and Tetsuro Matsuzawa (2007), where a chimpanzee named Ayumu was found to strongly outperform humans in terms of speed and accuracy in a working memory task. Although in this paper the superiority of the chimp was acknowledged without caveats, one of the authors (Matsuzawa 2010) later used these results to highlight the superior cognitive capacities of humans, arguing for the existence of a trade-off between memory and abstraction in the phylogeny and ontogeny of our species, such that by letting go of the photographic memory that our ancestors likely shared with chimpanzees, we could develop our characteristic skills for complex representation and language.
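
The ultimatum-game result reported by Jensen, Call, and Tomasello can be restated as a small piece of decision logic, which may make the contrast clearer. The Python sketch below is only a schematic of the payoff structure described above; the ten-unit reward, the particular offers, and the one-third "fairness" threshold standing in for typical human responders are illustrative assumptions, not figures from the study.

    # Payoff logic of the ultimatum game described above: the proposer
    # keeps whatever the responder does not get, but a rejection leaves
    # both with nothing.

    def play_round(offer, total, responder_accepts):
        """Return (proposer_payoff, responder_payoff) for one round."""
        if responder_accepts(offer, total):
            return total - offer, offer
        return 0, 0

    # Chimpanzee responders in the study accepted any nonzero share,
    # which is what a purely income-maximizing agent would do.
    def income_maximizer(offer, total):
        return offer > 0

    # Human responders typically reject shares they regard as unfair;
    # the one-third cutoff here is just a stand-in for that tendency.
    def fairness_sensitive(offer, total):
        return offer >= total / 3

    for offer in (1, 3, 5):
        print(offer,
              play_round(offer, 10, income_maximizer),
              play_round(offer, 10, fairness_sensitive))
    # A 1-out-of-10 offer is still accepted by the income maximizer (9, 1),
    # while the fairness-sensitive responder rejects it and both get (0, 0).

Which of these two response patterns counts as the "superior" one depends on whether income maximization or fairness is the prized trait, which is precisely the interpretive choice the paragraph above points to.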

2.3 Objectivity

While one might hope that an appeal to objectivity would help to offset the theory-ladenness and value-ladenness of science, philosophers of science tend to describe objectivity as an ideal, rather than an achievable result, of good science. Given that objectivity can be understood as a commitment to value- and theory-free faithfulness to facts that avoids any personal bias or perspective--a view from nowhere--it is at best a guide to balancing biases rather than avoiding them.

Animal cognition research has a particular interest in being objective, for some of the reasons reviewed in the previous two sections. Furthermore, as we saw in section 1, Darwin may have emphasized similarities over differences between species, suggesting a bias toward seeing other animals as like us. If humans are biased towards seeing animals as too much like humans, then some principles to protect against that bias may be needed. We see this in two classic principles used in comparative psychology: Morgan's Canon and a prohibition against anthropomorphism.

The British biologist and psychologist C. Lloyd Morgan was worried that the methods of Darwin emphasized our tendencies to see others as like us, and he felt the need to offer a corrective to our egocentric biases in the form of what is now called Morgan's Canon:

In no case may we interpret an action as the outcome of the exercise of a higher psychical faculty, if it can be interpreted as the outcome of the exercise of one which stands lower in the psychological scale. (Morgan 1894: 53)

Morgan describes his Canon by telling us the story of Tony, a fox-terrier pup who was able to open the gate from his garden and escape into the road by putting his head under the latch of the gate, lifting the latch and waiting for the gate to swing open. While it might look like Tony was smart--that he had a goal and knew how to achieve it by lifting the latch--Morgan invites us to consider other explanations. Perhaps Tony saw that the latch was liftable, and lifted it, without knowing the gate would open or wanting to get out. Perhaps Tony had associated the latch with getting out to the street, and wanted to get to the street, so he pushed against the latch without knowing how it worked. The worry is that, by thinking that Tony was a clever little dog, an observer is being anthropomorphic.

Though in its strictly literal sense, "anthropomorphism" simply makes reference to the attribution of human-like characteristics to animals, in its common usage in comparative cognition it has a very negative connotation, such that it is used to make reference to the mistaken attribution of human-like characteristics to animals. Thus understood, anthropomorphism is connected to humans' well-documented tendency to over-attribute mental states. This was already demonstrated in the 1940s, in a famous study by Fritz Heider and Marianne Simmel (1944). In this study, human participants were shown a video that depicted animated geometric shapes moving around the screen (see Heider and Simmel (1944) animation in Other Internet Resources for the video). Although the shapes make no sounds and have no facial expressions, the participants couldn't help but interpret their movement in intentional terms and construct a narrative regarding their "interactions". This is a particularly egregious example of our anthropomorphic biases, which lead us to be inclined to interpret the behavior of entities (human, non-human, and beyond) in human-like terms. Scientists studying animal cognition are well aware of this problematic tendency, and they attempt to counter it by stressing the importance of avoiding anthropomorphism.

However, the dictum regarding the need to avoid anthropomorphism is also a philosophical assumption embedded in comparative cognition and, as such, it can be questioned. It has been argued, for instance, that a blanket ban against the attribution of human-like qualities to animals would beg the question by assuming that said qualities are indeed uniquely human (Fitzpatrick 2017a). Frans de Waal (1999) uses the term "anthropodenial" to refer to the a priori rejection of the possibility that humans and animals may share characteristics, and argues that it is just as worrisome as anthropomorphism. He points out that there is an imbalance in comparative cognition, such that the over-attribution of mental capacities to animals is seen as much more problematic than the under-attribution. The reason behind this is that comparative psychologists are wary of violating Morgan's Canon. However, by looking to preserve cognitive parsimony at all costs, de Waal argues, comparative psychologists may be disregarding evolutionary parsimony, which dictates that we should offer explanations that posit the fewest possible changes in the phylogenetic tree. We should thus steer clear of both anthropodenial and anthropomorphism. A similar point was made by Kristin Andrews and Brian Huss (2014), who coined the term "anthropectomy" to refer to the mistake of denying that an animal has a certain characteristically human capacity when in fact they do have that capacity. They argue that anthropomorphism and anthropectomy are both errors; both amount to a false depiction of how the world actually is, and so there is no reason to fear one over the other.

While anti-anthropomorphism attempts to avoid the bias of seeing mind when it isn't there, it risks placing emphasis on another bias that humans are subject to, namely the bias of human exceptionalism. Anthropectomy arises in cases in which humans see their own cognitive capacities as sophisticated, and other species as only having diluted versions of them. For example, it is anthropectic to observe human cultural practices and infer that because other species don't have opera houses they don't have culture. When we make self-serving bias errors and assume too quickly that we are better than other people, we get ourselves and other people wrong. Anthropectic thinking extends this bias toward other animals.

Morgan also warned scientists against anthropectomy. In what we can call Morgan's Challenge, he warned us not to too quickly ascribe sophisticated capacities as explanations of human behavior:

To interpret animal behavior one must learn also to see one's own mentality at levels of development much lower than one's top-level of reflective self-consciousness. It is not easy, and savors somewhat of paradox. (Morgan 1932: 250)

Morgan&rsquos idea that we have to avoid exaggerating human capacities and wrongly denying that we share capacities with other animals has been taken up in current discussions of philosophers and psychologists. For example, as Shettleworth (2010b) points out, much human behavior is controlled by simple and often unconscious cognitive mechanisms of the sort we usually only associate with animals. We tend to disregard this and, as a result, we sometimes engage in anthropomorphism towards humans! Not only is there a tendency to forget that human behavior is often caused by simple mechanisms, one can also see a propensity to exaggerate the prowess of humans at various tasks. If we are wrong about human capacities, and then deny a continuity claim because we fail to find those confabulated capacities in other animals, we commit a double error&mdashconfabulating human cognitive mechanisms plus anthropocentrism, an error Cameron Buckner calls &ldquoanthropofabulation&rdquo (Buckner 2013). For example, despite the fact that we know that human memory is constructive and confabulatory to a large degree, many comparative psychologists engage in anthropofabulation by expecting animals to be capable of mentally replaying past events in order to be credited with episodic memory (Suddendorf & Corballis 2007 Tulving 1985 cited in Buckner 2013: 830).

One way to think of Morgan's general idea is that to explain animal behavior we need to have a variety of competing hypotheses, and from that set we can then pick the one that fits best. Morgan's Canon suggests that psychologically lower explanations are best, appealing to a kind of simplicity principle associated with Ockham's Razor. But what counts as best is a classic problem in the philosophy of science. We turn to this issue next.

2.4 Simplicity

Another aspect of the science of animal cognition where philosophical assumptions become evident is in the preference given to simpler explanations of animal behavior. Simplicity has long been seen as a virtue of scientific explanations, but it is notoriously difficult to decide what makes one explanation simpler than another.

In comparative psychology, and especially when following Morgan's Canon, simpler explanations are taken by default as the null hypothesis. This means that they are assumed to be true unless proven otherwise, and that the burden of proof falls on the side of more complex explanations. For example, research into causal reasoning in animals takes as its null hypothesis animals' ability to learn associations between different events. The experiments then probe into whether animals are not just capable of this simple form of learning, but whether they have the more complex capacity to comprehend that some events cause others. When the experimental data support both a simple hypothesis and a more complex one, it is standard among scientists to prefer the simpler explanation. For instance, some have argued that the experiments done to test whether animals are capable of metacognition (or thinking about their thoughts) can in fact be passed by reasoning solely in first-order terms (or thinking about the world). Since first-order thinking is assumed to be a simpler explanation of the data, it has been argued that this explanation ought to be favored (e.g., Carruthers 2003; Hampton 2009).

The idea that simpler explanations ought to be preferred is viewed by many as self-evident, a result of applying Ockham's Razor to the study of animal cognition, and the claim that an explanation is "simpler" or "more parsimonious" is often offered as the equivalent to "End of discussion!" However, in the past couple of decades philosophers have argued that this preference for simplicity in comparative cognition is much more problematic than it may initially seem. For one, many have argued that there is no reason why a simpler explanation will be more likely to be true in this context. Simpler explanations may have other virtues, such as being easier to understand or to describe, but since animal behavior is the result of natural selection, and not the outcome of a process of intelligent design that infallibly delivers optimal solutions, there is no reason to think that it is more likely to be caused by simpler processes (Mikhalevich, Powell, & Logan 2017). In fact, some have argued that when scientists claim a preference for simpler explanations, it is actually some other consideration that is doing the epistemic work, like the idea that we should avoid anthropomorphic descriptions of animal behavior (Fitzpatrick 2017a; Sober 1998). We see this, for instance, in Shettleworth's (2010a) equation of simpler explanations with associative ones. She argues that associative explanations are to be preferred, but this is not a result of a preference for simplicity per se, but stems from the idea that association is phylogenetically widespread in the animal kingdom, and so associative explanations are, in her view, more likely to be true.

In fact, there is no such thing as simplicity per se. Instead, what we have are different types of simplicity depending on how and what we focus on "simplifying". Irina Mikhalevich (2017) has argued that simplification can be done through homogenization, by aiming at fewer entity types; through reduction, by decreasing the number of entities; or through idealization, by removing entities that are not seen as crucial. Simon Fitzpatrick (2009) distinguishes five different notions of the ideal of simplicity that are used in the literature on comparative cognition:

  1. simplicity as psychological unity (preferring explanations that unify different behaviors by positing a single cognitive mechanism)
  2. simplicity as parsimony of mental representations (preferring X over Y as an explanation whenever Y entails X but not vice versa)
  3. simplicity as less cognitive sophistication (preferring explanations that attribute the least cognitively sophisticated mechanisms)
  4. simplicity as analogy (preferring explanations that posit similar cognitive mechanisms to explain similar behavior in different species)
  5. simplicity as evolutionary parsimony (preferring explanations that posit mechanisms inherited from a common ancestor to explain similar behavior in different species).

Mike Dacey (2016) distinguishes further notions of simplicity, such as a preference for those mechanisms that demand less computational memory, time, or energy, those that require less input data, or those that posit fewer changes in the phylogenetic tree. We previewed an argument along these lines in our discussion of anthropomorphism and anthropodenial: which is simpler, positing a simpler cognitive capacity and more changes over evolutionary time, or positing a more sophisticated cognitive capacity and fewer changes over evolutionary time?

These different notions of simplicity are not only conceptually distinct, they are often mutually incompatible or conflicting. Nowhere has this become more evident than in the chimpanzee theory of mind debate. The last few years have witnessed a huge controversy over the experimental results in this research area. A number of studies have shown that chimpanzees can use information regarding what a competitor has visual access to in order to decide whether or not to go for a reward (Hare et al. 2000; Hare, Call, & Tomasello 2001; Karg et al. 2015; Melis, Call, & Tomasello 2006). The controversy arose because these experimental results can in principle be explained by positing that chimpanzees were reasoning about the mental states of others (in this case, about what they could or could not see), or that they were solely reasoning about their behavior (e.g., about their bodily orientation or the existence of an uninterrupted line of gaze between their eyes and the reward). What is interesting about this controversy is that the defenders of each of these options were both claiming to be offering the simplest explanation of the data. For Michael Tomasello and Josep Call (2006), it's more parsimonious to assume that chimpanzees can understand what others can and cannot see rather than posit that they have learned a different behavioral rule for every relevant situation that involves a competitor's line of gaze. For Povinelli and Jennifer Vonk (2004), mental state attributions can only be inferred from another's behavior, which means that every mindreader must also be a behavior reader. Since the opposite is not the case, it's more parsimonious to assume that chimpanzees are behavior readers rather than mindreaders. The chimpanzee theory of mind controversy perfectly illustrates that the issue of simplicity is a philosophical one, meaning that the question of whether, to what extent, and in what sense simpler explanations ought to be preferred is one that cannot be settled solely through empirical measurements or mathematical calculations, but requires philosophical analysis and argumentation (for discussions, see Clatterbuck 2017, 2018; Sober 2015).

2.5 Summary

Objectivity and simplicity are taken as goals of science, which is understood to aim at tracking the facts of the world. The value-ladenness and theory-ladenness of science, however, shows how difficult it is to achieve those goals. While this is true of science in general, with animal cognition the stakes are different. Human use of animals in food, in medicine, in work, as entertainment, and as companions raises a host of sometimes conflicting goals that scaffold motivated reasoning and risk implicit bias. The quest for truth about animal capacities relates to an over-arching philosophical theme in comparative cognition: the human-animal divide and the question of the extent and areas of continuity between the minds of humans and other animals. This is not a normatively-neutral issue, but rather, the picture that science delivers of the human-animal divide can serve to justify or undermine the practices that constitute the human-animal relationship. As we will see in section 3, addressing the biases and philosophical assumptions that shape the science of animal cognition is of great importance for getting things right philosophically as well as ethically.


Introduction

There are questions regarding what part of the brain allows us to be self-aware and how we are biologically programmed to be self-aware. V.S. Ramachandran has speculated that mirror neurons may provide the neurological basis of human self-awareness. [5] In an essay written for the Edge Foundation in 2009, Ramachandran gave the following explanation of his theory: "... I also speculated that these neurons can not only help simulate other people's behavior but can be turned 'inward'—as it were—to create second-order representations or meta-representations of your own earlier brain processes. This could be the neural basis of introspection, and of the reciprocity of self awareness and other awareness. There is obviously a chicken-or-egg question here as to which evolved first, but ... The main point is that the two co-evolved, mutually enriching each other to create the mature representation of self that characterizes modern humans." [6]

Body

Health

In health and medicine, body awareness is a construct that refers to a person's overall ability to direct their focus on various internal sensations accurately. Both proprioception and interoception allow individuals to be consciously aware of multiple sensations. [7] Proprioception allows individuals and patients to focus on sensations in their muscles and joints, posture, and balance, while interoception is used to determine sensations of the internal organs, such as fluctuating heartbeat, respiration, lung pain, or satiety. Over-acute body-awareness, under-acute body-awareness, and distorted body-awareness are symptoms present in a variety of health disorders and conditions, such as obesity, anorexia nervosa, and chronic joint pain. [8] For example, a patient suffering from anorexia nervosa may have a distorted perception of satiety.

Human development

Bodily self-awareness in human development refers to one's awareness of their body as a physical object, with physical properties, that can interact with other objects. Tests have shown that infants only a few months old are already aware of the relationship between the proprioceptive and visual information they receive. [9] This is called first-person self-awareness.

At around 18 months old and later, children begin to develop reflective self-awareness, which is the next stage of bodily awareness and involves children recognizing themselves in reflections, mirrors, and pictures. [10] Children who have not yet reached this stage of bodily self-awareness will tend to view reflections of themselves as other children and respond accordingly, as if they were looking at someone else face to face. In contrast, those who have reached this level of awareness will recognize that they see themselves, for instance seeing dirt on their face in the reflection and then touching their own face to wipe it off.

Slightly after toddlers become reflectively self-aware, they begin to develop the ability to recognize their bodies as physical objects in time and space that interact with and have an impact on other objects. For instance, a toddler placed on a blanket, when asked to hand someone the blanket, will recognize that they need to get off it to be able to lift it. [9] This is the final stage of body self-awareness and is called objective self-awareness.

Non-human animals

The most relevant "mirror tests" have been conducted on chimpanzees, elephants, dolphins, and magpies.

Apes

Chimpanzees and other apes, species which have been studied extensively, offer the closest comparison to humans and have so far provided the most convincing and straightforward evidence of self-awareness in animals. [11]

Dolphins

Dolphins were put to a similar test and achieved the same results. Diana Reiss, a psychobiologist at the New York Aquarium, discovered that bottlenose dolphins can recognize themselves in mirrors. [12]

Magpies

Researchers also used the mark test or mirror test [13] to study the magpie's self-awareness. Because most birds cannot see the area below the beak, Prior et al. [11] marked the birds' necks with three different colors: red, yellow, and black (the black mark served as a sham, since magpies' plumage is black). When placed in front of a mirror, the birds with the red and yellow spots began scratching at their necks, signaling that they understood something different was on their bodies. During one trial with a mirror and a mark, three out of the five magpies showed at least one example of self-directed behavior. The magpies explored the mirror by moving toward it and looking behind it. One of the magpies, Harvey, during several trials would pick up objects, pose, and do some wing-flapping, all in front of the mirror with the objects in his beak. This suggests a sense of self-awareness: knowing what is going on with his own body in the present. The authors suggest that self-recognition in birds and mammals may be a case of convergent evolution, where similar evolutionary pressures result in similar behaviors or traits, although they arrive at them via different routes. [14]

A few slight instances of behavior directed toward the magpie's own body occurred in the trial with the black mark and the mirror. It is assumed in this study [11] that the black mark may have been slightly visible on the black feathers. Prior et al. [11] stated, "This is an indirect support for the interpretation that the behavior towards the mark region was elicited by seeing the own body in the mirror in conjunction with an unusual spot on the body."

The magpies' behavior contrasted clearly with the no-mirror condition. In the no-mirror trials, a non-reflective gray plate of the same size and in the same position as the mirror was swapped in. There were no mark-directed self-behaviors in these trials, whether the mark was colored or black. [11] Prior et al.'s [11] data quantitatively match the findings in chimpanzees. In summary, the mark test [11] results show that magpies understand that a mirror image represents their own body; in other words, magpies show self-awareness.

The four stages in the mirror test

During the test, the experimenter looks for the animals to undergo four stages:

  1. social response,
  2. physical mirror inspection,
  3. repetitive mirror testing behavior, and
  4. the mark test, which involves the animals spontaneously touching a mark on their body which would have been difficult to see without the mirror. [15]

Three "types" of self-awareness Edit

David DeGrazia states that there are three types of self-awareness in animals.

  1. Bodily self-awareness
    This sense of awareness allows animals to understand that they are different from the rest of the environment; it is also the reason why animals do not eat themselves. Bodily awareness also includes proprioception and sensation.
  2. Social self-awareness
    This type of awareness is seen in highly social animals and is the awareness that they have a role within the group in order to survive. This type of awareness allows animals to interact with each other.
  3. Introspective awareness
    This awareness allows animals to understand their own feelings, desires, and beliefs. [16]

    The "red-spot" technique Edit

    The red-spot technique, created and tested by Gordon G. Gallup, [17] studies self-awareness in animals (primates). In this technique, a red odorless spot is placed on an anesthetized primate's forehead. The spot is placed on the forehead so that it can only be seen through a mirror. Once the individual awakens, independent movements toward the spot after seeing their reflection in a mirror are observed. During the red-spot technique, after looking in the mirror, chimpanzees used their fingers to touch the red dot that was on their forehead and, after touching the red dot, they would even smell their fingertips. [18] "Animals that can recognize themselves in mirrors can conceive of themselves," says Gallup. Elephants are another prime example. Three elephants were exposed to large mirrors while experimenters studied their reaction upon seeing their reflection. These elephants were given the "litmus mark test" in order to see whether they were aware of what they were looking at. This visible mark was applied to the elephants, and the researchers reported substantial evidence of self-awareness. The elephants shared this success with other animals such as monkeys and dolphins. [19]
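    As a rough illustration of how such mark tests are scored, the sketch below tallies mark-directed touches in a no-mirror baseline versus a mirror condition. The counts, subject names, and pass criterion are invented for demonstration and are not taken from the studies described above.

```python
# Hypothetical illustration of scoring a mark ("red-spot") test.
# All counts and the pass threshold below are made up for demonstration.

from dataclasses import dataclass

@dataclass
class MarkTestSession:
    subject: str
    touches_without_mirror: int  # mark-directed touches during the baseline
    touches_with_mirror: int     # mark-directed touches once the mirror is present

def passes_mark_test(session: MarkTestSession, min_increase: int = 3) -> bool:
    """A subject 'passes' here if mirror exposure raises mark-directed touching
    by at least `min_increase` over the no-mirror baseline (an arbitrary
    criterion chosen purely for illustration)."""
    return (session.touches_with_mirror - session.touches_without_mirror) >= min_increase

sessions = [
    MarkTestSession("subject_A", touches_without_mirror=1, touches_with_mirror=7),
    MarkTestSession("subject_B", touches_without_mirror=0, touches_with_mirror=2),
]

for s in sessions:
    print(s.subject, "passes" if passes_mark_test(s) else "does not pass")
```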

    Cooperation and evolutionary problems

    An organism can be effectively altruistic without being self-aware, aware of any distinction between egoism and altruism, or aware of qualia in others. It can do so through simple reactions to specific situations which happen to benefit other individuals in the organism's natural environment. If self-awareness entailed that an emotional empathy mechanism were necessary for altruism, with egoism being the default in its absence, that would have precluded evolution from a state without self-awareness to a self-aware state in all social animals. The ability of the theory of evolution to explain self-awareness can be rescued by abandoning the hypothesis of self-awareness being a basis for cruelty. [20] [21]

    Self-awareness has been called "arguably the most fundamental issue in psychology, from both a developmental and an evolutionary perspective." [22]

    Self-awareness theory, developed by Duval and Wicklund in their 1972 landmark book A theory of objective self awareness, states that when we focus our attention on ourselves, we evaluate and compare our current behavior to our internal standards and values. This elicits a state of objective self-awareness: we become self-conscious as objective evaluators of ourselves. [23] However, self-awareness is not to be confused with self-consciousness. [24] Various emotional states are intensified by self-awareness, and some people may seek to increase their self-awareness through these outlets. People are more likely to align their behavior with their standards when made self-aware, and they are negatively affected if they do not live up to their personal standards. Various environmental cues and situations induce awareness of the self, such as mirrors, an audience, or being videotaped or recorded. These cues also increase the accuracy of personal memory. [25]

    In one of Andreas Demetriou's neo-Piagetian theories of cognitive development, self-awareness develops systematically from birth through the life span and is a major factor in the development of general inferential processes. [26] Moreover, a series of recent studies showed that self-awareness about cognitive processes participates in general intelligence on a par with processing efficiency functions, such as working memory, processing speed, and reasoning. [27]

    Albert Bandura's theory of self-efficacy builds on our varying degrees of self-awareness. Self-efficacy is "the belief in one's capabilities to organize and execute the courses of action required to manage prospective situations." A person's belief in their ability to succeed sets the stage for how they think, behave and feel. Someone with strong self-efficacy, for example, views challenges as mere tasks that must be overcome and is not easily discouraged by setbacks; they are aware of their flaws and abilities and choose to utilize these qualities to the best of their ability. Someone with a weak sense of self-efficacy evades challenges and quickly feels discouraged by setbacks; they may not be aware of these negative reactions, and therefore do not always change their attitude. This concept is central to Bandura's social cognitive theory, "which emphasizes the role of observational learning, social experience, and reciprocal determinism in the development of personality." [28]

    Developmental stages

    Individuals become conscious of themselves through the development of self-awareness. [22] This particular type of self-development pertains to becoming conscious of one's own body and mental state of mind including thoughts, actions, ideas, feelings and interactions with others. [29] "Self-awareness does not occur suddenly through one particular behavior: it develops gradually through a succession of different behaviors all of which relate to the self." [30] The monitoring of one's mental states is called metacognition and it is considered to be an indicator that there is some concept of the self. [31] It is developed through an early sense of non-self components using sensory and memory sources. In developing self-awareness through self-exploration and social experiences one can broaden one's social world and become more familiar with the self.

    According to Emory University's Philippe Rochat, there are five levels of self-awareness which unfold in early development, spanning six potential stages that range from "Level 0" (having no self-awareness) to "Level 5" (explicit self-awareness) in order of advancing complexity. [22]

    • Level 0: Confusion. At this level the individual has a degree of zero self-awareness. This person is unaware of any mirror reflection or the mirror itself. They perceive the mirror as an extension of their environment. Level 0 can also be displayed when an adult frightens himself in a mirror mistaking his own reflection as another person just for a second.
    • Level 1: Differentiation. The individual realizes the mirror is able to reflect things. They see that what is in the mirror is different from what is surrounding them. At this level they can differentiate between their own movement in the mirror and the movement of the surrounding environment.
    • Level 2: Situation. At this point an individual can link the movements seen in the mirror to what is perceived within their own body. This is the first hint of self-exploration on a projected surface, where what is visualized in the mirror is special to the self.
    • Level 3: Identification. This stage is characterized by the new ability to identify self: an individual can now see that what's in the mirror is not another person but actually them. It is seen when a child, instead of referring to the mirror while referring to themselves, refers to themselves while looking in the mirror.
    • Level 4: Permanence. Once an individual reaches this level they can identify the self beyond the present mirror imagery. They are able to identify the self in previous pictures looking different or younger. A "permanent self" is now experienced.
    • Level 5: Self-consciousness or "meta" self-awareness. At this level not only is the self seen from a first person view but it is realized that it is also seen from a third person's view. They begin to understand they can be in the mind of others. For instance, how they are seen from a public standpoint. [22]

    It is to be kept in mind that as an infant comes into this world, they have no concept of what is around them, nor of the significance of others around them. It is throughout the first year that they gradually begin to acknowledge that their body is actually separate from that of their mother, and that they are an "active, causal agent in space". By the end of the first year, they additionally realize that their movement, as well, is separate from movement of the mother. That is a huge advance, yet they are still quite limited and cannot yet know what they look like, "in the sense that the infant cannot recognize its own face". [32] By the time an average toddler reaches 18–24 months, they will discover themselves and recognize their own reflection in the mirror, [33] although research has found that this age varies widely with differing socioeconomic levels and differences relating to culture and parenting. [34] They begin to acknowledge the fact that the image in front of them, which happens to be them, moves, indicating that they appreciate and can consider the relationship between cause and effect that is happening. [32] By the age of 24 months the toddler will observe and relate their own actions to the actions of other people and the surrounding environment. [33]

    Only once an infant has had a great deal of experience, and time, in front of a mirror are they able to recognize themselves in the reflection and understand that it is them. For example, in one study an experimenter took a red marker and put a fairly large red dot (so that it was visible to the infant) on the infant's nose, and placed them in front of a mirror. Prior to 15 months of age, the infant will not react to this, but after 15 months of age, they will either touch their nose, wondering what it is they have on their face, or point to it. This indicates that they recognize that the image they see in the reflection of the mirror is themselves. [9] A closely related procedure, the mirror self-recognition task, has been used as a research tool for many years and has led to key findings about the infant's sense and awareness of self. [9] For example, "for Piaget, the objectification of the bodily self occurs as the infant becomes able to represent the body's spatial and causal relationship with the external world (Piaget, 1954)." [9] Facial recognition is a pivotal point in the development of self-awareness. [32]

    By 18 months, the infant can communicate their name to others, and upon being shown a picture they are in, they can identify themselves. By two years old, they also usually acquire gender and age categories, saying things such as "I am a girl, not a boy" and "I am a baby or child, not a grownup". Evidently, this is not at the level of an adult or an adolescent, but as an infant moves to middle childhood and onwards to adolescence, they develop a higher level of self-awareness and self-description. [32]

    As infants develop their senses, using multiple senses to recognize what is around them, they can be affected by a phenomenon known as "facial multi-stimulation". In one experiment by Filippetti, Farroni, and Johnson, infants of around five months of age were given what is known as an “enfacement illusion”. [35] “Infants watched a side-by-side video display of a peer’s face being systematically stroked on the cheek with a paintbrush. During the video presentation, the infant’s own cheek was stroked in synchrony with one video and in asynchrony with the other”. [35] The infants were shown to recognize and map the image of the peer onto their own face, displaying early signs of projecting facial recognition cues onto the self with the assistance of an illusion.

    Piaget

    Around school age a child's awareness of personal memory transitions into a sense of one's own self. At this stage, a child begins to develop interests along with likes and dislikes. This transition enables the awareness of an individual's past, present, and future to grow as conscious experiences are remembered more often. [33] As a preschooler, they begin to give much more specific details about things, instead of generalizing. For example, the preschooler will talk about the Los Angeles Lakers basketball team and the New York Rangers hockey team, instead of simply stating that they like sports, as an infant would. Furthermore, they will start to express certain preferences (e.g., Tod likes mac and cheese) and will start to identify certain possessions of theirs (e.g., Lara has a bird as a pet at home). At this age, the child is in the stage Piaget names the preoperational stage of development. The child is very inaccurate at judging themselves because they do not have much to go on. For example, a child at this stage will not connect the fact that they are strong with their ability to cross the jungle gym at their school, nor will they connect the fact that they can solve a math problem with their ability to count. [32]

    One becomes conscious of their emotions during adolescence. Most children are aware of emotions such as shame, guilt, pride and embarrassment by the age of two, but do not fully understand how those emotions affect their life. [36] By age 13, children become more in touch with these emotions and begin to apply them to their own lives. A study entitled "The Construction of the Self" found that many adolescents display happiness and self-confidence around friends, but hopelessness and anger around parents due to the fear of being a disappointment. Teenagers were also shown to feel intelligent and creative around teachers, and shy, uncomfortable and nervous around people they were not familiar with. [37]

    In adolescent development, the definition of self-awareness also has a more complex emotional context due to the maturity of adolescents compared to those in the early childhood phase, and these elements can include, but are not limited to, self-image, self-concept, and self-consciousness, along with many other traits that relate to Rochat's final level of self-awareness; it is, however, still a distinct concept within its own previous definition. [38] Social interactions mainly distinguish self-awareness in adolescence from that in childhood, together with the further developed emotional recognition skills of adolescents. Sandu, Pânișoară, and Pânișoară demonstrate this in their work with teenagers, showing that there is a mature sense of self-awareness in students aged 17, which in turn provides a clear structure for how elements like self-concept, self-image, and self-consciousness relate to self-awareness. [38]

    Mental health

    As children reach their adolescent stages of life, the acute sense of emotion has widened into a metacognitive state in which mental health issues can become more prevalent due to their heightened emotional and social development. [39] There are elements of contextual behavioral science, such as Self-as-Content, Self-as-Process and Self-as-Context, involved with adolescent self-awareness that can be associated with mental health. [39] Moran, Almada, and McHugh presented the idea that these domains of self are associated with adolescent mental health in various capacities. [39] Anger management is also a domain of mental health that is associated with the concept of self-awareness in teens. [40] Self-awareness training has been linked to lowering anger management issues and reducing aggressive tendencies in adolescents: “Persons having sufficient self-awareness promote relaxation and awareness about themselves and when going angry, at the first step they become aware of anger in their inside and accept it, then try to handle it”. [40]

    Locke

    An early philosophical discussion of self-awareness is that of John Locke. Locke was apparently influenced by René Descartes' statement normally translated 'I think, therefore I am' (Cogito ergo sum). In chapter XXVII, "On Identity and Diversity", of Locke's An Essay Concerning Human Understanding (1689) he conceptualized consciousness as the repeated self-identification of oneself, through which moral responsibility could be attributed to the subject, and therefore punishment and guiltiness justified, as critics such as Nietzsche would point out, affirming "... the psychology of conscience is not 'the voice of God in man'; it is the instinct of cruelty ... expressed, for the first time, as one of the oldest and most indispensable elements in the foundation of culture." [41] [42] [43] John Locke does not use the terms self-awareness or self-consciousness, though. [44]

    According to Locke, personal identity (the self) "depends on consciousness, not on substance". [45] We are the same person to the extent that we are conscious of our past and future thoughts and actions in the same way as we are conscious of our present thoughts and actions. If consciousness is this "thought" which doubles all thoughts, then personal identity is only founded on the repeated act of consciousness: "This may show us wherein personal identity consists: not in the identity of substance, but ... in the identity of consciousness." [45] For example, one may claim to be a reincarnation of Plato, therefore having the same soul. However, one would be the same person as Plato only if one had the same consciousness of Plato's thoughts and actions that he himself did. Therefore, self-identity is not based on the soul. One soul may have various personalities.

    Locke argues that self-identity is founded neither on the body nor on the substance, as the substance may change while the person remains the same. "Animal identity is preserved in identity of life, and not of substance", as the body of the animal grows and changes during its life. [45] Locke describes a case of a prince and a cobbler in which the soul of the prince is transferred to the body of the cobbler and vice versa. The prince still views himself as a prince, though he no longer looks like one. This border case leads to the problematic thought that since personal identity is based on consciousness, and only oneself can be aware of one's consciousness, exterior human judges may never know if they really are judging, and punishing, the same person, or simply the same body. Locke argues that one may be judged for the actions of one's body rather than one's soul, and only God knows how to correctly judge a man's actions. Men also are only responsible for the acts of which they are conscious. This forms the basis of the insanity defense, which argues that one cannot be held accountable for acts committed while unconsciously irrational or mentally ill. [46] In reference to man's personality, Locke claims that "whatever past actions it cannot reconcile or appropriate to that present self by consciousness, it can be no more concerned in it than if they had never been done: and to receive pleasure or pain, i.e. reward or punishment, on the account of any such action, is all one as to be made happy or miserable in its first being, without any demerit at all." [47]

    The medical term for not being aware of one's deficits is anosognosia, more commonly known as a lack of insight. Such a lack of awareness raises the risk of nonadherence to treatment and services. [48] Individuals who deny having an illness may be against seeking professional help because they are convinced that nothing is wrong with them. Disorders of self-awareness frequently follow frontal lobe damage. [49] There are two common methods used to measure how severe an individual's lack of self-awareness is. The Patient Competency Rating Scale (PCRS) evaluates self-awareness in patients who have endured a traumatic brain injury. [50] The PCRS is a 30-item self-report instrument which asks the subject to use a 5-point Likert scale to rate his or her degree of difficulty in a variety of tasks and functions. Independently, relatives or significant others who know the patient well are also asked to rate the patient on each of the same behavioral items. The difference between the relatives' and the patient's perceptions is considered an indirect measure of impaired self-awareness. The limitation of this measure is that it rests on the relatives' answers, which can themselves be biased. This limitation prompted a second method of testing a patient's self-awareness: simply asking a patient why they are in the hospital or what is wrong with their body can give compelling answers as to what they see and are analyzing. [51]
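    A minimal sketch of how the indirect discrepancy measure described above might be computed. The item ratings are invented, only a handful of the 30 items are shown, and the sign convention (higher rating = better self-rated functioning) is an assumption made for illustration rather than the PCRS's published scoring rules.

```python
# Hypothetical illustration of a patient-informant discrepancy score of the
# kind used with the Patient Competency Rating Scale (PCRS). All ratings are
# invented; a higher rating is assumed here to mean better self-rated
# functioning on that item.

def discrepancy_score(patient_ratings, relative_ratings):
    """Sum of (patient - relative) differences across items.
    A larger positive total means the patient rates their own functioning
    higher than a close informant does, which is read as an indirect sign
    of impaired self-awareness."""
    if len(patient_ratings) != len(relative_ratings):
        raise ValueError("both raters must score the same items")
    return sum(p - r for p, r in zip(patient_ratings, relative_ratings))

# Example with 5 of the 30 items, each rated on a 1-5 Likert scale.
patient  = [5, 4, 5, 5, 4]
relative = [3, 2, 4, 3, 2]
print(discrepancy_score(patient, relative))  # 9: patient rates themselves higher
```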

    Anosognosia

    Anosognosia is a term coined by Joseph Babinski to describe the clinical condition in which an individual who suffered left hemiplegia following a right cerebral hemisphere stroke nevertheless denied that there were any problems with their left arm or leg. This condition is known as anosognosia for hemiplegia (AHP). The term has evolved over the years and is now used to describe people who lack subjective awareness of their deficits in both neurological and neuropsychological cases. [52] A wide variety of disorders are associated with anosognosia. For example, patients who are blind from cortical lesions might in fact be unaware that they are blind and may state that they do not suffer from any visual disturbances. Individuals with aphasia and other cognitive disorders may also suffer from anosognosia: they are unaware of their deficiencies and, when they make certain speech errors, they may not correct themselves due to their unawareness. [53] Individuals who suffer from Alzheimer's disease lack this awareness as well, and the deficiency becomes more pronounced as the disease progresses. [54] A key issue with this disorder is that people who have anosognosia and suffer from certain illnesses may not be aware of them, which ultimately leads them to put themselves in dangerous positions and/or environments. [53] To this day there are still no available treatments for AHP, but temporary remission has been documented following vestibular stimulation. [55]

    Dissociative identity disorder

    Dissociative identity disorder (DID), formerly called multiple personality disorder (MPD), is a disorder involving a disturbance of identity in which two or more separate and distinct personality states (or identities) control an individual's behavior at different times. [56] One identity may be different from another, and when an individual with DID is under the influence of one of their identities, they may forget their experiences when they switch to the other identity. "When under the control of one identity, a person is usually unable to remember some of the events that occurred while other personalities were in control." [57] They may experience time loss, amnesia, and adopt different mannerisms, attitudes, speech and ideas under different personalities. They are often unaware of the different lives they lead or of their condition in general, feeling as though they are looking at their life through the lens of someone else, and even being unable to recognize themselves in a mirror. [58]

    Two cases of DID have brought awareness to the disorder, the first being that of Eve. This patient harbored three different personalities: Eve White the good wife and mother, Eve Black the party girl, and Jane the intellectual. Under stress, her episodes would worsen. She even tried to strangle her own daughter and had no recollection of the act afterward. Eve went through years of therapy before she was able to learn how to control her alters and be mindful of her disorder and episodes. Her condition, being so rare at the time, inspired the book and film adaptation The Three Faces of Eve, as well as a memoir by Eve herself entitled I'm Eve. Doctors speculated that growing up during the Depression and witnessing horrific things being done to other people could have triggered emotional distress, periodic amnesia, and eventually DID. [59]

    In the second case, Shirley Mason, or Sybil, was described as having over 16 separate personalities with different characteristics and talents. Her accounts of horrific and sadistic abuse by her mother during childhood prompted doctors to believe that this trauma caused her personalities to split, furthering the unproven idea that the disorder was rooted in child abuse, while also making the disorder famous. In 1998, however, Sybil's case was exposed as a sham: her therapist had encouraged Sybil to act as her other alter egos even though she felt perfectly like herself, and her condition was exaggerated in order to seal book deals and television adaptations. [59] Awareness of the disorder began to crumble shortly after this finding. To this day, no proven cause of DID has been found, but treatments such as psychotherapy, medications, hypnotherapy, and adjunctive therapies have proven to be very effective. [60]

    Autism spectrum disorder

    Autism spectrum disorder (ASD) is a range of neurodevelopmental disabilities that can adversely impact social communication and create behavioral challenges (Understanding Autism, 2003). [61] "Autism spectrum disorder (ASD) and autism are both general terms for a group of complex disorders of brain development. These disorders are characterized, in varying degrees, by difficulties in social interaction, verbal and nonverbal communication and repetitive behaviors." [62] ASDs can also cause imaginative abnormalities and can range from mild to severe, especially in sensory-motor, perceptual and affective dimensions. [63] Children with ASD may struggle with self-awareness and self-acceptance. Their different thinking patterns and brain processing functions in the area of social thinking and actions may compromise their ability to understand themselves and their social connections to others. [64] About 75% of those diagnosed with autism are intellectually impaired in some general way, while the other 25%, diagnosed with Asperger's syndrome, show average to good cognitive functioning. [65] It is well known that children with varying degrees of autism struggle in social situations. Scientists at the University of Cambridge have produced evidence that self-awareness is a main problem for people with ASD. Researchers used functional magnetic resonance imaging (fMRI) scans to measure brain activity in volunteers who were asked to make judgments about their own thoughts, opinions, and preferences, as well as about someone else's. One area of the brain closely examined was the ventromedial prefrontal cortex (vMPFC), which is known to be active when people think about themselves. [66]

    A study out of Stanford University has tried to map the brain circuits involved in self-awareness in autism spectrum disorders. [67] This study suggests that self-awareness is primarily lacking in social situations, but that in private, individuals with ASD are more self-aware and present. It is in the company of others, while engaging in interpersonal interaction, that the self-awareness mechanism seems to fail. Higher-functioning individuals on the ASD scale have reported that they are more self-aware when alone, unless they are in sensory overload or immediately following social exposure. [68] Self-awareness dissipates when a person with autism is faced with a demanding social situation. This theory suggests that this happens because of the behavioral inhibitory system, which is responsible for self-preservation. This is the system that prevents humans from self-harm, like jumping out of a speeding bus or putting a hand on a hot stove. Once a dangerous situation is perceived, the behavioral inhibitory system kicks in and restrains our activities. "For individuals with ASD, this inhibitory mechanism is so powerful, it operates on the least possible trigger and shows an over sensitivity to impending danger and possible threats." [68] Some of these perceived dangers may be the presence of strangers, or a loud noise from a radio. In these situations self-awareness can be compromised due to the desire for self-preservation, which trumps social composure and proper interaction.

    The Hobson hypothesis holds that autism begins in infancy due to a lack of cognitive and linguistic engagement, which results in impaired reflective self-awareness. In one study, ten children with Asperger syndrome were examined using the Self-understanding Interview. This interview was created by Damon and Hart and focuses on seven core areas or schemas that measure the capacity to think at increasingly difficult levels, providing an estimate of the level of self-understanding present. "The study showed that the Asperger group demonstrated impairment in the 'self-as-object' and 'self-as-subject' domains of the Self-understanding Interview, which supported Hobson's concept of an impaired capacity for self-awareness and self-reflection in people with ASD." [69] Self-understanding is a self-description spanning an individual's past, present and future. Without self-understanding, it is reported, self-awareness is lacking in people with ASD.

    Joint attention (JA) was developed as a teaching strategy to help increase positive self-awareness in those with autism spectrum disorder. [70] JA strategies were first used to directly teach about mirror reflections and how they relate to one's own reflected image. Mirror Self Awareness Development (MSAD) activities were used as a four-step framework to measure increases in self-awareness in those with ASD. Self-awareness and self-knowledge are not something that can simply be taught through direct instruction; instead, students acquire this knowledge by interacting with their environment. [70] Mirror understanding, and its relation to the development of self, leads to measurable increases in self-awareness in those with ASD. It also proves to be a highly engaging and highly preferred tool for understanding the developmental stages of self-awareness.

    There have been many different theories and studies on the degree of self-awareness displayed among people with autism spectrum disorder. Scientists have researched the various parts of the brain associated with understanding the self and self-awareness, and studies have shown evidence of brain areas that are impacted by ASD. Other work suggests that helping individuals learn more about themselves through joint attention activities, such as Mirror Self Awareness Development, may help teach positive self-awareness and growth. In helping to build self-awareness, it is also possible to build self-esteem and self-acceptance. This in turn can help the individual with ASD relate better to their environment and have better social interactions with others.

    Schizophrenia

    Schizophrenia is a chronic psychiatric illness characterized by excessive dopamine activity in the mesolimbic tract and insufficient dopamine activity in the mesocortical tract leading to symptoms of psychosis along with poor cognition in socialization. Under the Diagnostic and Statistical Manual of Mental Disorders, people with schizophrenia have a combination of positive, negative and psychomotor symptoms. These cognitive disturbances involve rare beliefs and/or thoughts of a distorted reality that creates an abnormal pattern of functioning for the patient. The cause of schizophrenia has a substantial genetic component involving many genes. While the heritability of schizophrenia has been found to be around 80%, only about 60% of sufferers report a positive family history of the disorder, and ultimately the cause is thought to be a combination of genetic and environmental factors. [71] It is believed that the experience of stressful life events is an environmental factor that can trigger the onset of schizophrenia in individuals who already are at risk from genetics and age. [72] The level of self-awareness among patients with schizophrenia is a heavily studied topic.

    Schizophrenia as a disease state is characterized by severe cognitive dysfunction, and it is uncertain to what extent patients are aware of this deficit. Medalia and Lim (2004) investigated patients' awareness of their cognitive deficits in the areas of attention, nonverbal memory, and verbal memory. [73] Results from this study (N=185) revealed a large discrepancy between patients' assessment of their cognitive functioning and the assessment of their clinicians. Though it is impossible to access another person's consciousness and truly understand what they believe, in this study the patients were nonetheless unaware of their dysfunctional cognitive reasoning. In the DSM-5, a diagnosis of schizophrenia requires two or more of the following symptoms over the course of one month: delusions*, hallucinations*, disorganized speech*, grossly disorganized or catatonic behavior, and negative symptoms (*at least one of these first three symptoms must be present for a correct diagnosis). Sometimes these symptoms are very prominent and are treated with a combination of antipsychotics (e.g., haloperidol, loxapine), atypical antipsychotics (such as clozapine and risperidone) and psychosocial therapies that include family interventions and social skills training. When a patient is undergoing treatment and recovering from the disorder, their memory of their own behavior is present only to a diminished degree; thus, self-awareness of a schizophrenia diagnosis is rare after treatment, just as it is following onset and during the course of the illness.

    The above findings are further supported by a study conducted by Amador and colleagues. [74] The study suggests that a correlation exists between patient insight, compliance, and disease progression. Insight into the illness was assessed via the Scale to Assess Unawareness of Mental Disorder and was used along with ratings of psychopathology, course of illness, and compliance with treatment in a sample of 43 patients. Patients with poor insight are less likely to be compliant with treatment and are more likely to have a poorer prognosis. Patients with hallucinations sometimes experience positive symptoms, which can include delusions of reference, thought insertion/withdrawal, thought broadcast, delusions of persecution, grandiosity, and many more. These psychoses skew the patient's perspective on reality in ways which they truly believe are really happening. For instance, a patient experiencing delusions of reference may believe, while watching the weather forecast, that when the weatherman says it will rain, he is really sending a message to the patient in which rain symbolizes a specific warning completely irrelevant to the actual weather. Another example is thought broadcast, in which a patient believes that everyone can hear their thoughts. These positive symptoms are sometimes so severe that the patient believes something is crawling on them, or smells something that is not there in reality. These hallucinations are intense, and it is difficult to convince the patient that they do not exist outside of their own mind, making it extremely difficult for a patient to understand, and become self-aware, that what they are experiencing is in fact not there.

    Furthermore, a study by Bedford and Davis [75] (2013) was conducted to look at the association between denial versus acceptance of multiple facets of schizophrenia (self-reflection, self-perception, and insight) and its effect on self-reflection (N=26). The results suggest that patients with greater disease denial have poorer recollection of their self-evaluated mental illness. To a great extent, disease denial makes it harder for patients to recover, because their feelings and sensations are so intense. But as this and the above studies imply, a large proportion of people with schizophrenia lack self-awareness of their illness, for many reasons and to varying degrees depending on the severity of their condition.

    Bipolar disorder

    Bipolar disorder is an illness that causes shifts in mood, energy, and ability to function. Self-awareness is crucial for those suffering from this disease, as they must be able to distinguish between feeling a certain way because of the disorder and feeling that way because of separate issues. "Personality, behavior, and dysfunction affect your bipolar disorder, so you must 'know' yourself in order to make the distinction." [76] The disorder is a difficult one to diagnose, as self-awareness changes with mood. "For instance, what might appear to you as confidence and clever ideas for a new business venture might be a pattern of grandiose thinking and manic behavior". [77] Difficulties arise in distinguishing between recognizing the irrationality of a mood swing and being completely wrapped up in a manic episode, rationalizing the exhibited behaviors as normal.

    It is important to be able to distinguish what is a symptom of bipolar disorder and what is not. A study by Mathew et al. was conducted with the aim of "examining the perceptions of illness in self and among other patients with bipolar disorder in remission". [78]

    The study took place at the Department of Psychiatry, Christian Medical College, Vellore, India, a centre that specializes in the "management of patients with mental and behavioural disorders". [78] Eighty-two patients (thirty-two female and fifty male) agreed to take part in the study. These patients met the "International Classification of Diseases – 10 diagnostic criteria for a diagnosis of bipolar disorder I or II and were in remission" [78] and were put through a variety of baseline assessments before beginning the study. These baseline assessments included the use of a vignette, which was then used as an assessment tool during their follow-up. Patients were then randomly divided into two groups: one following a "structured educational intervention programme" [78] (the experimental group), and the other following "usual care" (the control group).

    The study was based on an interview in which patients were asked an array of open-ended questions regarding topics such as "perceived causes, consequences, severity and its effects on body, emotion, social network and home life, and on work, severity, possible course of action, help-seeking behaviour and the role of the doctor/healer". [78] The McNemar test was then used to compare the patients' perception of the illness with their explanation of the illness. The results of the study show that the beliefs that patients associated with their illness correspond with the possible causes of the disorder, [78] whereas "studies done among patients during periods of active psychosis have recorded disagreement between their assessments of their own illness". [79] This ties in with how difficult self-awareness is for people who suffer from bipolar disorder.
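    For readers unfamiliar with the McNemar test mentioned above, the sketch below shows the paired comparison it performs on a 2x2 table of yes/no responses. The counts and the question framing are invented for illustration and do not reproduce the study's data or its exact analysis.

```python
# Hypothetical illustration of a McNemar test for paired yes/no responses,
# e.g., whether each patient endorses an item when asked about the perceived
# cause of the illness versus when asked to explain the illness. Counts are invented.
from scipy.stats import chi2

# 2x2 table of paired outcomes:
#                      explanation: yes   explanation: no
# perception: yes             a                  b
# perception: no              c                  d
a, b, c, d = 30, 12, 4, 36

# McNemar's statistic uses only the discordant pairs (b and c),
# here with the common continuity correction.
stat = (abs(b - c) - 1) ** 2 / (b + c)
p_value = chi2.sf(stat, df=1)

print(f"McNemar chi-square = {stat:.3f}, p = {p_value:.4f}")
```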

    Although this study was done on a population that was in remission from the disease, the distinction between patients during "active psychosis" and those in remission shows the evolution of their self-awareness throughout their journey to recovery.

    Self-discrimination in plants is found within their roots, tendrils and flowers that avoid themselves but not others in their environment. [80]

    Self-incompatibility mechanism providing evidence for self-awareness in plants

    Self-awareness in plants is a fringe topic in the field of self-awareness, and is researched predominantly by botanists. The claim that plants are capable of perceiving self rests on evidence that plants will not reproduce with themselves, due to a gene-selecting mechanism. In addition, vining plants have been shown to avoid coiling around themselves, due to chemical receptors in the plants' tendrils. What is unique to plants is that this awareness of self means that the plant can recognise self, whereas all other known conceptions of self-awareness involve the ability to recognise what is not self.

    Recognition and rejection of self in plant reproduction

    Research by June B. Nasrallah found that the plant's pollination mechanism also serves as a mechanism against self-reproduction, which lays the foundation of scientific evidence that plants could be considered self-aware organisms. The SI (self-incompatibility) mechanism in plants is unique in the sense that awareness of self derives from the capacity to recognise self, rather than non-self. The SI mechanism depends primarily on the interaction between the S-locus receptor protein kinase (SRK) gene and the S-locus cysteine-rich protein (SCR) gene. In cases of self-pollination, SRK and SCR bind, activating SRK and inhibiting the pollen from fertilising. In cases of cross-pollination, SRK and SCR do not bind, so SRK is not activated and the pollen can fertilise. In simple terms, the receptors either accept or reject the genes present in the pollen, and when the genes are from the same plant, the SI mechanism described above creates a reaction to prevent the pollen from fertilising.
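    A toy sketch of the accept/reject logic just described. The S-haplotype labels and the simple equality check are invented stand-ins; real self-incompatibility involves many alleles and protein-level interactions, so this mirrors only the decision structure, not the biochemistry.

```python
# Toy model of the self-incompatibility (SI) decision described above.
# S-haplotype labels are invented; this only mirrors the accept/reject logic.

def srk_activated(pistil_s_haplotype: str, pollen_s_haplotype: str) -> bool:
    """SRK (the pistil receptor) is activated when it binds SCR (the pollen
    ligand) from a matching S-haplotype, i.e., when the pollen is 'self'."""
    return pistil_s_haplotype == pollen_s_haplotype

def fertilisation_allowed(pistil_s: str, pollen_s: str) -> bool:
    """Self-pollination (SRK activated) is rejected; cross-pollination proceeds."""
    return not srk_activated(pistil_s, pollen_s)

print(fertilisation_allowed("S1", "S1"))  # False: self pollen is rejected
print(fertilisation_allowed("S1", "S7"))  # True: non-self pollen can fertilise
```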

    Self-discrimination in the tendrils of the vine Cayratia japonica mediated by physiological connection

    The research by Yuya Fukano and Akira Yamawo provides a link between self-discrimination in vining plants and the other groups in which such a mechanism has already been discovered, and it contributes to the general body of evidence for self-discrimination mechanisms in plants. The article makes the claim that the biological self-discrimination mechanism present in both flowering plants and ascidians is also present in vining plants. They tested this hypothesis by doing touch tests with self-neighbouring and non-self-neighbouring pairs of plants. The test was performed by placing the sets of plants close enough for their tendrils to interact with one another. Evidence of self-discrimination in above-ground plant parts is demonstrated in the results of the touch testing, which showed that, across connected self plants, severed self plants and non-self plants, the rate of tendril activity and the likelihood of coiling were higher among separated plants than among those attached via rhizomes.

    Theater also concerns itself with awareness of others besides self-awareness. There is a possible correlation between the experience of the theater audience and individual self-awareness. As actors and audiences must not "break" the fourth wall in order to maintain context, so individuals must not be aware of the artificial, or constructed, perception of their reality. This suggests that both self-awareness and the social constructs applied to others are artificial continuums, just as theater is. Theatrical efforts such as Six Characters in Search of an Author, or The Wonderful Wizard of Oz, construct yet another layer of the fourth wall, but they do not destroy the primary illusion. Refer to Erving Goffman's Frame Analysis: An Essay on the Organization of Experience.

    In science fiction, self-awareness describes an essential human property that often (depending on the circumstances of the story) bestows personhood onto a non-human. If a computer, alien or other object is described as "self-aware", the reader may assume that it will be treated as a completely human character, with similar rights, capabilities and desires to a normal human being. [81] The words "sentience", "sapience" and "consciousness" are used in similar ways in science fiction.

    In order to be "self-aware," robots can use internal models to simulate their own actions. [82]
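
    As a rough illustration of what "using an internal model" can mean, here is a minimal Python sketch in which a robot simulates each candidate action with a forward model of its own motion and picks the one it predicts will bring it closest to a goal. The one-dimensional dynamics, the action set and every name here are invented for illustration; they do not come from any particular robotics framework or from the cited work.

        # Minimal sketch of a robot using an internal (forward) model of itself.
        # The 1-D dynamics and the small action set are illustrative assumptions.

        def forward_model(position, velocity, action, dt=0.1):
            """The robot's internal prediction of its own next state."""
            next_velocity = velocity + action * dt
            next_position = position + next_velocity * dt
            return next_position, next_velocity

        def choose_action(position, velocity, goal, actions=(-1.0, 0.0, 1.0)):
            """Mentally simulate each action with the internal model, pick the best."""
            def predicted_error(a):
                predicted_pos, _ = forward_model(position, velocity, a)
                return abs(goal - predicted_pos)
            return min(actions, key=predicted_error)

        # The robot "imagines" outcomes before acting rather than acting blindly.
        print(choose_action(position=0.0, velocity=0.0, goal=1.0))   # 1.0 (accelerate toward goal)
        print(choose_action(position=1.2, velocity=0.5, goal=1.0))   # -1.0 (brake)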


    STSR tests confirm that dogs have self-awareness

    A new study carried out by the Department of Psychology at Barnard College in the U.S. used a sniff test to evaluate the ability of dogs to recognize themselves. The results have been published in the journal Behavioural Processes.

    The experiment confirms the hypothesis of dog self-cognition proposed last year by Prof. Roberto Cazzolla Gatti of the Biological Institute of the Tomsk State University, Russia. Dr. Alexandra Horowitz, the lead researcher, wrote, "While domestic dogs, Canis familiaris, have been found to be skillful at social cognitive tasks and even some meta-cognitive tasks, they have not passed the test of mirror self-recognition (MSR)."

    Prof. Horowitz borrowed the "Sniff test of self-recognition (STSR)" proposed by Prof. Cazzolla Gatti in 2016 to shed light on methods of testing for self-recognition, and applied it to 36 domestic dogs accompanied by their owners.

    This study confirmed the previous evidence proposed with the STSR by Dr. Cazzolla Gatti showing that "dogs distinguish between the olfactory 'image' of themselves when modified: Investigating their own odour for longer when it had an additional odour accompanying it than when it did not. Such behaviour implies a recognition of the odour as being of or from 'themselves.'"
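
    For readers wondering what the comparison behind that claim looks like, it boils down to a paired contrast: for each dog, time spent investigating its own odour when modified versus when unmodified. The Python sketch below shows one way such a contrast might be summarised; the numbers are placeholders for illustration only, not data from either study, and the summary statistics are my own assumption rather than a description of the published analyses.

        # Sketch of the paired contrast behind the STSR claim above.
        # The numbers are illustrative placeholders, not study data.

        # seconds spent sniffing, one entry per dog (placeholders)
        own_odour_modified   = [12.1, 9.8, 15.0, 11.2, 8.7, 13.4]
        own_odour_unmodified = [ 7.3, 6.9, 10.1,  8.0, 9.1,  7.7]

        # per-dog difference: positive values mean the dog spent longer on the
        # modified version of its own odour
        differences = [m - u for m, u in zip(own_odour_modified, own_odour_unmodified)]
        mean_diff = sum(differences) / len(differences)
        n_positive = sum(d > 0 for d in differences)

        print(f"mean extra time on modified self-odour: {mean_diff:.1f} s")
        print(f"dogs showing the effect: {n_positive} of {len(differences)}")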

    Prof. Cazzolla Gatti first suggested the hypothesis of self-cognition in dogs in a pioneering 2016 paper whose title echoes the Lewis Carroll novel: "Self-consciousness: beyond the looking-glass and what dogs found there."

    As the associate professor at Tomsk State University anticipated, "this sniff-test could change the way some experiments on animal behaviour are validated." Dr. Horowitz's study soon followed.

    "I believe that dogs and other animals, being much less sensitive to visual stimuli than humans and many apes, cannot pass the mirror test because of the sensory modality chosen by the investigator to test self-awareness. This in not necessarily due to the absence of this cognitive ability in some animal species," says Cazzolla Gatti.

    Prof. Cazzolla Gatti's idea, as recently confirmed by Dr. Horowitz on a larger sample of dogs, shows that "the sniff test of self-recognition (STSR), even when applied to multiple individuals living in groups and with different ages and sexes, provides significant evidence of self-awareness in dogs, and can play a crucial role in showing that this capacity is not a specific feature of only great apes, humans and a few other animals, but it depends on the way in which researchers try to verify it."

    This innovative approach to testing self-awareness highlights the need to shift from an anthropocentric idea of consciousness to a species-specific perspective. As Prof. Cazzolla Gatti anticipated last year in his paper: "We would never expect that a mole or a bat can recognize themselves in a mirror, but now we have strong empirical evidence to suggest that if species other than primates are tested on chemical or auditory perception base we could get really unexpected results."

    This new study, published in the journal Behavioural Processes, validates the sniff test of self-recognition (STSR) and Prof. Roberto Cazzolla Gatti's hypothesis of self-awareness in dogs and other animals, and pushes ethologists to move "beyond the looking-glass to see what other animals can find there."


    Self-awareness is what makes us human

    I now run a neuroscience lab dedicated to the study of self-awareness at University College London. My team is one of several working within the Wellcome Centre for Human Neuroimaging, located in an elegant town house in Queen Square in London. The basement of our building houses large machines for brain imaging, and each group in the Centre uses this technology to study how different aspects of the mind and brain work: how we see, hear, remember, speak, make decisions, and so on. The students and postdocs in my lab focus on the brain's capacity for self-awareness. I find it a remarkable fact that something unique about our biology has allowed the human brain to turn its thoughts on itself.

    Until quite recently, however, this all seemed like nonsense. As the nineteenth-century French philosopher Auguste Comte put it: "The thinking individual cannot cut himself in two — one of the parts reasoning, while the other is looking on. Since in this case the organ observed and the observing organ are identical, how could any observation be made?" In other words, how can the same brain turn its thoughts upon itself?

    Comte's argument chimed with scientific thinking at the time. After the Enlightenment dawned on Europe, an increasingly popular view was that self-awareness was special and not something that could be studied using the tools of science. Western philosophers were instead using self-reflection as a philosophical tool, much as mathematicians use algebra in the pursuit of new mathematical truths. René Descartes relied on self-reflection in this way to reach his famous conclusion, "I think, therefore I am," noting along the way that "I know clearly that there is nothing that can be perceived by me more easily or more clearly than my own mind." Descartes proposed that a central soul was the seat of thought and reason, commanding our bodies to act on our behalf. The soul could not be split in two — it just was. Self-awareness was therefore mysterious and indefinable, and off-limits to science.


    We now know that the premise of Comte's worry is false. The human brain is not a single, indivisible organ. Instead, the brain is made up of billions of small components — neurons — that each crackle with electrical activity and participate in a wiring diagram of mind-boggling complexity. Out of the interactions among these cells, our entire mental life — our thoughts and feelings, hopes and dreams — flickers in and out of existence. But rather than being a meaningless tangle of connections with no discernible structure, this wiring diagram also has a broader architecture that divides the brain into distinct regions, each engaged in specialized computations. Just as a map of a city need not include individual houses to be useful, we can obtain a rough overview of how different areas of the human brain are working together at the scale of regions rather than individual brain cells. Some areas of the cortex are closer to the inputs (such as the eyes) and others are further up the processing chain. For instance, some regions are primarily involved in seeing (the visual cortex, at the back of the brain), others in processing sounds (the auditory cortex), while others are involved in storing and retrieving memories (such as the hippocampus).

    In a reply to Comte in 1865, the British philosopher John Stuart Mill anticipated the idea that self-awareness might also depend on the interaction of processes operating within a single brain and was thus a legitimate target of scientific study. Now, thanks to the advent of powerful brain imaging technologies such as functional magnetic resonance imaging (fMRI), we know that when we self-reflect, particular brain networks indeed crackle into life and that damage or disease to these same networks can lead to devastating impairments of self-awareness.

    I often think that if we were not so thoroughly familiar with our own capacity for self-awareness, we would be gobsmacked that the brain is able to pull off this marvelous conjuring trick. Imagine for a moment that you are a scientist on a mission to study new life-forms found on a distant planet. Biologists back on Earth are clamoring to know what they're made of and what makes them tick. But no one suggests just asking them! And yet a Martian landing on Earth, after learning a bit of English or Spanish or French, could do just that. The Martians might be stunned to find that we can already tell them something about what it is like to remember, dream, laugh, cry, or feel elated or regretful — all by virtue of being self-aware.

    But self-awareness did not just evolve to allow us to tell each other (and potential Martian visitors) about our thoughts and feelings. Instead, being self-aware is central to how we experience the world. We not only perceive our surroundings; we can also reflect on the beauty of a sunset, wonder whether our vision is blurred, and ask whether our senses are being fooled by illusions or magic tricks. We not only make decisions about whether to take a new job or whom to marry; we can also reflect on whether we made a good or bad choice. We not only recall childhood memories; we can also question whether these memories might be mistaken.

    Self-awareness also enables us to understand that other people have minds like ours. Being self-aware allows me to ask, "How does this seem to me?" and, equally importantly, "How will this seem to someone else?" Literary novels would become meaningless if we lost the ability to think about the minds of others and compare their experiences to our own. Without self-awareness, there would be no organized education. We would not know who needs to learn or whether we have the capacity to teach them. The writer Vladimir Nabokov elegantly captured this idea that self-awareness is a catalyst for human flourishing:

    "Being aware of being aware of being. In other words, if I not only know that I am but also know that I know it, then I belong to the human species. All the rest follow s— the glory of thought, poetry, a vision of the universe. In that respect, the gap between ape and man is immeasurably greater than the one between amoeba and ape."

    In light of these myriad benefits, it's not surprising that cultivating accurate self-awareness has long been considered a wise and noble goal. In Plato's dialogue Charmides, Socrates has just returned from fighting in the Peloponnesian War. On his way home, he asks a local boy, Charmides, if he has worked out the meaning of sophrosyne — the Greek word for temperance or moderation, and the essence of a life well lived. After a long debate, the boy's cousin Critias suggests that the key to sophrosyne is simple: self-awareness. Socrates sums up his argument: "Then the wise or temperate man, and he only, will know himself, and be able to examine what he knows or does not know… No other person will be able to do this."

    Likewise, the ancient Greeks were urged to "know thyself" by a prominent inscription carved into the stone of the Temple of Delphi. For them, self-awareness was a work in progress and something to be striven toward. This view persisted into medieval religious traditions: for instance, the Italian priest and philosopher Saint Thomas Aquinas suggested that while God knows Himself by default, we need to put in time and effort to know our own minds. Aquinas and his monks spent long hours engaged in silent contemplation. They believed that only by participating in concerted self-reflection could they ascend toward the image of God.


    A similar notion of striving toward self-awareness is seen in Eastern traditions such as Buddhism. The spiritual goal of enlightenment is to dissolve the ego, allowing more transparent and direct knowledge of our minds acting in the here and now. The founder of Chinese Taoism, Lao Tzu, captured this idea that gaining self-awareness is one of the highest pursuits when he wrote, "To know that one does not know is best. Not to know but to believe that one knows is a disease."

    Today, there is a plethora of websites, blogs, and self-help books that encourage us to "find ourselves" and become more self-aware. The sentiment is well meant. But while we are often urged to have better self-awareness, little attention is paid to how self-awareness actually works. I find this odd. It would be strange to encourage people to fix their cars without knowing how the engine worked, or to go to the gym without knowing which muscles to exercise. This book aims to fill this gap. I don't pretend to give pithy advice or quotes to put on a poster. Instead, I aim to provide a guide to the building blocks of self-awareness, drawing on the latest research from psychology, computer science, and neuroscience. By understanding how self-awareness works, I aim to put us in a position to answer the Athenian call to use it better.