Would you trust these eyes?

January 30, 2013

A recent blog post at Science 2.0 reported on a fascinating study by a team of researchers from Charles University in Prague. These researchers published in PLOS One their findings on the relationship between eye colour and perceived trustworthiness of faces.

Which of these two people looks more trustworthy?


Photos by Randen Pederson (L) and “Garrett” (R) – click for details

The team (Karel Kleisner, Lenka Priplatova, Peter Frost, and Jaroslav Flegr) showed photographs of 40 male and 40 female faces to their subjects, and asked them to rate how trustworthy the faces looked. They found a significant relationship – brown-eyed faces were perceived as more trustworthy.


Kleisner, Priplatova, Frost, and Flegr found (after controlling for “dominance” and “attractiveness”) a highly significant (p < 0.001) link between eye colour and perceived trustworthiness, in both male and female faces (image from their paper)
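As an aside for statistically minded readers: an analysis of this general shape (a trustworthiness rating regressed on eye colour while controlling for dominance and attractiveness) can be sketched in R with lm(). Everything below is simulated purely for illustration; it is not the authors’ data or model.

```r
# Hypothetical illustration of "controlling for" covariates with lm();
# simulated data, not from Kleisner et al.
set.seed(42)
n <- 80
eye_brown <- rep(c(0, 1), each = n / 2)       # 0 = blue, 1 = brown
dominance <- rnorm(n)
attractiveness <- rnorm(n)
trust <- 0.5 * eye_brown + 0.3 * attractiveness + rnorm(n)

model <- lm(trust ~ eye_brown + dominance + attractiveness)
summary(model)   # the eye_brown coefficient is the adjusted effect
```

In a model like this, the p-value on the eye_brown coefficient plays the role of the p < 0.001 reported in the caption.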

The genius of Kleisner et al., however, was not to leave it at that, but to repeat the experiment after altering the eye colours on the photographs. This revealed that eye colour per se had no effect. Rather, the perceived trustworthiness was linked to aspects of facial shape – aspects that normally correlate with eye colour.

They write: “brown-eyed faces tended to have a rounder and broader chin, a broader mouth with upward-pointing corners, relatively bigger eyes, and eyebrows closer to each other. This was also the pattern of a trustworthy face.”

These results are consistent with earlier work by Alexander Todorov, Sean Baron, and Nikolaas Oosterhof, who found not only that certain facial shapes are perceived as trustworthy, but that these faces generate a measurable response in the amygdala, detectable with functional magnetic resonance imaging (fMRI). I’m not sure why this brain response exists, but presumably it is one of the things that confidence tricksters exploit.


Would you trust this brown-eyed man? Frank Abagnale, whose life was portrayed in the film Catch Me If You Can (photo by “Marcus JB”)

– Tony


Emotion and Intelligence

January 14, 2013

A recent blog post in Science 2.0 refers to the 2004 book The First Idea: How Symbols, Language, and Intelligence Evolved from Our Primate Ancestors to Modern Humans by the late Stanley I. Greenspan and Stuart Shanker. Drawing particularly on their own studies of child development, Greenspan and Shanker claim that “our highest level mental capacities, such as reflective thinking, only develop fully when infants and children are engaged in certain types of nurturing learning interactions.”

They go on to argue that the various stages of child development involve an intertwined growth of emotional and cognitive skills, and that the two cannot be separated.

This raises the question as to whether (strong) Artificial Intelligence is possible. Can an unemotional thinking entity, like Data in Star Trek, actually exist? Such issues are explored further in a 2002 book edited by Robert Trappl, Paolo Petta, and Sabine Payr.


Unemotional thinkers in fiction, like Star Trek’s Data, actually do display various emotions – if not, the reader/viewer would lose interest

Greenspan and Shanker’s theories also have implications for child-rearing. If they are correct, emotionally rich interactions with caregivers are essential for the development of intelligence. For example, they argue (in contrast to Noam Chomsky and Steven Pinker) that language does not develop “spontaneously,” but is critically dependent on those interactions. Greenspan and Shanker write: “A child’s first words, her early word combinations, and her first steps towards mastering grammar are not just guided by emotional content, but, indeed, are imbued with it.”


Emotionally rich interaction (photo: Robert Whitehead, 2006)

– Tony


School Shootings – Can Potential Shooter Profiles be Identified?

January 8, 2013

In light of the recent shootings in Newtown, Connecticut, new debates have been sparked on gun control and on identifying potentially unstable individuals who could commit such crimes.

An article on the Science Daily website, School Shootings: What We Know and What We Can Do, highlighted some recent research that studies past tragedies in order to build a profile of potential shooters, and asks how such individuals might be identified ahead of time. The article draws on research by Dr. Daniel J. Flannery explaining how shooters demonstrate similar features, such as depression, low self-esteem, narcissism, and a fascination with death. However, these similarities across shootings are not strong enough to produce conclusive profiles that would allow future prevention of such tragedies.

Other research has produced similar findings. Leary, Kowalski, Smith, and Phillips (2003) analysed multiple shootings between 1995 and 2001. They found that depression, low self-esteem, and narcissism were all present in the individuals involved. However, the shooters also shared one further attribute: social rejection. Social rejection alone cannot fully explain these acts of violence, since most people are exposed to social rejection at some stage in life. However, social rejection coupled with psychological problems or a fascination with death may lead to acts of violence.

Unfortunately, research in this area remains inconclusive, and specific attributes and characteristics have not been identified that would allow preventative measures to reduce the chances of such tragedies occurring again.

– Stefano


Human Sciences, Statistics, and R

January 6, 2013

The use of statistics has long been important in the human sciences. An early example is an analysis by William Sealy Gosset (alias “Student”) of biometric data obtained by Scotland Yard around 1900. The heights of 3,000 male criminals fit a bell curve almost perfectly:


Histogram © A. H. Dekker, produced using R software
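For readers who would like to produce a figure of this kind themselves, here is a minimal R sketch. The heights are simulated from a normal distribution; the mean and standard deviation below are illustrative guesses, not Gosset’s actual data.

```r
# Sketch of a histogram with a fitted bell curve, in the spirit of the
# figure above. Simulated data; the parameters are illustrative only.
set.seed(1)
heights <- rnorm(3000, mean = 166, sd = 6.5)

hist(heights, breaks = 30, freq = FALSE,
     main = "Heights of 3,000 individuals", xlab = "Height (cm)")
curve(dnorm(x, mean = mean(heights), sd = sd(heights)),
      add = TRUE, lwd = 2)
```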

Standard statistical methods allow the identification of correlations, which mark possible causal links:


XKCD teaches us that “Correlation doesn’t imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing ‘look over there.’”

Newer, more sophisticated statistical methods allow the exploration of time series and spatial data. For example, this project looks at the spatial distribution of West Nile virus (WNV) – which disease clusters are significant, and which are merely tragic coincidence:


Distribution of significant clusters of human WNV in the Chicago region, from Ruiz et al.

SPSS has been the mainstay of statistical analysis in the human sciences, but many newer techniques are better supported in the free R toolkit. For example, this paper discusses detecting significant clusters of diseases using R. The New York Times has commented on R’s growing popularity, and James Holland Jones points out that R is used by the majority of academic statisticians (and hence includes the newest developments in statistics), that it has good help resources, and that it makes really cool graphics.


A really cool graph in R, using the ggplot2 R package (from Jeromy Anglim’s Psychology and Statistics Blog)

An increasing quantity of human-science-related instructional material is also available for R.

Through the igraph, sna, and other packages (and the statnet suite), R also provides easy-to-use facilities for social network analysis, a topic dear to my heart. For example, the following code defines the valued centrality measure proposed in this paper:

library("igraph")

# Valued centrality: for each vertex, the mean reciprocal of its
# distance to every other vertex. The reciprocal of an infinite
# distance works out to zero, so disconnected vertex pairs simply
# contribute nothing; the explicit zero case handles the diagonal.
valued.centrality <- function (g) {
  recip <- function (x) if (x == 0) 0 else 1/x
  f <- function (r) sum(sapply(r, recip)) / (length(r) - 1)
  apply(shortest.paths(g), MARGIN = 1, f)
}

This definition has the advantage of allowing disconnected network components, so that we can use these centrality scores to add colour to a standard plot (using the igraph package within R):


Social network diagram, produced using R software, coloured using centrality scores
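As a usage sketch, the following code builds a random example network, computes the scores, and colours the vertices accordingly. The random graph and the five-bin colour scheme are illustrative choices, not those used for the diagram above; the valued.centrality definition from the post is repeated so the snippet is self-contained.

```r
# Usage sketch (illustrative network, not the one in the diagram above)
library("igraph")

valued.centrality <- function (g) {
  recip <- function (x) if (x == 0) 0 else 1/x
  f <- function (r) sum(sapply(r, recip)) / (length(r) - 1)
  apply(shortest.paths(g), MARGIN = 1, f)
}

set.seed(7)
g <- sample_gnp(30, 0.1)                  # random network of 30 vertices
scores <- valued.centrality(g)
bins <- cut(scores, breaks = 5, labels = FALSE)
V(g)$color <- rev(heat.colors(5))[bins]   # warmer colour = higher centrality
plot(g, vertex.label = NA)
```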

– Tony


Suicide and the Military

December 18, 2012

A recent (2011) report by Margaret Harrell and Nancy Berglass (“Losing the Battle: The Challenge of Military Suicide”) looks at suicide in the U.S. military.


U.S. military active duty suicide rates, compared to general U.S. population (from Harrell and Berglass)

The U.S. military has seen an increase in suicide among its serving and former personnel, with suicide now killing more troops than enemy fire does. Junior enlisted personnel appear to be most at risk. Among U.S. veterans, it is estimated that there is one suicide death every 80 minutes and, although only 1% of Americans have seen military service, veterans account for 20% of U.S. suicides. The U.S. Army in particular has seen suicides increase markedly since 2004.

From Vung Tau, riding Chinooks, to the dust at Nui Dat,
I’d been in and out of choppers now for months.
But we made our tents a home, VB and pinups on the lockers,
And an Asian orange sunset through the scrub.

And can you tell me, doctor, why I still can’t get to sleep?
And why the Channel Seven chopper chills me to my feet?
And what’s this rash that comes and goes,
Can you tell me what it means?
God help me, I was only nineteen.
– Redgum, I Was Only 19

Harrell and Berglass make a number of recommendations to help address this suicide problem, including that the U.S. Army establish a “unit cohesion period” on return from deployment. Stress factors among soldiers include encountering dead bodies – and this is a stress factor which may also apply to troops on humanitarian relief missions. A 2003 book notes that “there is growing evidence that the stress of peace support operations can be as psychologically damaging as conventional warfare.” Given these stresses, addressing the problem of military suicide will require recognising post-traumatic stress disorder (PTSD) as a “real injury.” Various helpful recommendations for dealing with PTSD have also been made in the past.

According to the Centers for Disease Control and Prevention (CDC) there are over 36,000 suicides in the U.S. each year (and almost half a million cases of self-inflicted injury). Risk factors include stressful life events and feeling alone – two factors common among post-deployment military personnel who have left their unit.

Australian Defence Force (ADF) personnel appear to think about suicide more than the general community does: according to one study, 3.9% of ADF personnel reported suicidal ideation. However, comparing the 8 suicide deaths per year among ADF personnel against the 2,300 suicide deaths per year in the general Australian community suggests that ADF personnel are slightly less likely to die by suicide than their civilian counterparts. This may indicate that ADF suicide prevention strategies are having some positive effect.


Arlington National Cemetery, Virginia, USA

Some of the recommendations from Harrell and Berglass also appeared in a 2010 U.S. Military Task Force report. That report highlighted in particular the need for reducing the stigma of soldiers seeking help, and for removing cultural and organisational barriers to doing so. RAND has also produced a lengthy report on preventing suicide in the U.S. military. In its July 2012 special issue on the topic, Time Magazine printed some helpful advice for wives and husbands of military personnel – see here. The Australian Defence Force has a fact sheet here.

In the end, though, suicide is everyone’s business, for “no man is an island … every man is a piece of the continent, a part of the main … any man’s death diminishes me.”


U.S. Army Suicide Prevention Poster (2011)

This post is dedicated to all my friends who wear, or who have worn, a uniform, and to everyone who has been affected by suicide.

– Tony


Driver Distraction – are we being distracted away from real solutions?

October 29, 2012

Person using cell phone while driving. (Photo credit: Wikipedia)

Driver distraction is a major cause of accidents on our roads.  More research into driver distraction is therefore welcome.

Traditional media available in vehicles, such as radios and entertainment systems, can affect visual attention, and the current use of GPS navigation assistance and personal communication devices has been repeatedly shown to interfere with the primary task of driving vigilance and motor control.  Such research has led road authorities to restrict their use, particularly mobile phone use and texting.

The introduction of Intelligent Transport Systems has the potential to overload the driver if such systems are not tailored to driver workload according to location, traffic density and other ambient conditions.  A more accurate and layered approach to driver workload and attention level can provide a structure upon which information can be conveyed appropriately and distractions reduced or minimised.

Hume Freeway looking south towards Victoria, running parallel to Albury railway station. (Photo credit: Wikipedia)

There are many instances, however, where use of communication devices in vehicles may actually help vigilance (for example, in long distance driving), and the ability to operate a mobile office whilst driving has undoubted productivity benefits.  The problem is that the driving environment can change in milliseconds, overloading the driver’s available perceptual and attentional resources.

The question of appropriate speed limits for long distance driving could also be re-addressed.  There has been a spate of single-driver transport accidents on the Hume Freeway recently, with tragic (and potentially disastrous) consequences.  Apart from driver fatigue and scheduling issues, is the posted speed limit too low to maintain sufficient driver arousal?  Long, boring drives at inappropriately low speeds could encourage further distraction (such as use of mobile devices), perhaps even increasing the potential danger.  The severity of an incident at 100 km/h is, after all, still as catastrophic as one at 110 or 120 km/h.  If speed kills, then surely it follows that the only safe speed is zero.

There is a balance between the efficient movement of goods and the consequences of error.  Does this mean that interstate transport needs to travel at 40 km/h so that any incidents that occur are relatively minor?  Would this result in more incidents, because drivers would be bored out of their minds whilst blowing out transport schedules?  There needs to be an open discussion by professionals in the field, assessing all of the risks, rather than the current assumption that slowing traffic is the only solution.  The posted speed, relative to the road design and condition, could be reducing driver arousal and performance below the optimal range.

60KM/H Speed limit sign in Australia. (Photo credit: Wikipedia)

The Human Machine Interface (HMI) remains central to safe and effective vehicle operation, as the information it provides allows the driver to retain effective control of the vehicle and helps influence, or even determine, their behaviour. Information flow to the driver must be intuitive, readily understood, timely, and responsive to the driver’s attentional and distraction state.  Unfortunately, there are many who have very little understanding of this requirement. For example, the “bells and whistles” mentioned in this proposed level crossing warning system may actually distract a driver at the worst possible time, causing more problems than the technology is trying to solve. Human scientists understand the many facets that determine the best way of presenting information so that it is perceived, recognised, and acted upon in an optimal manner.  If the proponent of this system had engaged human factors expertise in the first place, or heeded their advice, he might not have made such ill-considered comments. Perhaps he will get some guidance on the importance of listening to human factors professionals when he presents his data at the upcoming ITS conference in Vienna. It would be good to have critical assessment, by any human scientists attending that conference, of any actual (rather than derived or contrived) interactions that occurred during the trials of this system.

The area of human interaction with technology is very complex, and simplistic approaches (such as further unrestrained visual or auditory “bells and whistles blaring”) will rarely be the best solution.  All the more reason to design and test proposals from a human science standpoint, heed the results, and incorporate them into any proposed system.  This underscores the importance of the many respected facilities that make human science input a keystone of their research in ITS applications, vehicle design, and driver behaviour.  Hopefully their research findings are weighted appropriately (i.e. seen as valid and reliable) by the governing authorities, as compared to those of the “bell and whistle” variety.


The future world of neuroscience

October 3, 2012


Just read a post on the increasing ability of neuroscientists to image and understand the brain.  The author, Kathleen Taylor, makes a very interesting observation: perhaps in the future brain research will surpass the physical sciences in importance.  She is probably biased, given her neuroscience background, but I feel that she has highlighted some fundamental questions about how humans will interact with the world (or perhaps the universe?) and with each other in the future.

She makes the point that the physical sciences have largely been insulated from how the knowledge gained from their research is used, given that there is no human input into their experimentation.  The research is largely introspective, or governed by mathematics and similarly prescriptive methods.  The potential consequences of the research are not addressed at any time (at least in a formal sense), as there is little input from anyone apart from peers and supervisors with a similar research background.

The difference between the physical and social sciences has been commented on in previous blog posts.  Physical scientists, although brilliant in their own fields, tend to make assumptions about how humans fit into their models and how their research can be applied.  Human behaviour is commonly included as a probability, which then influences the remainder of the postulated model, producing results which do not necessarily reflect what actually happens in the real world.  These discrepancies are either ignored or assumed to be just part of a distribution of human behaviour.  A system is then designed using such flawed thinking, and typically it is the poor old human operators who have to adapt and make up for such sloppy design when they have to make things work.

Alternatively, these operators are seen as the problem when the system is subsequently audited, as the ‘brilliant’ system design is hardly ever tested and/or seen to be at fault.  At last there are glimmers of hope, as safety management systems are identifying that these ‘brilliant’ systems are more often than not the cause of many failings, not just from the operator perspective.  So the ‘human error’ which historically has almost always been attributed as the cause of an accident is sheeted home to where it belongs in the first place – the arrogant human who designed the system, who was either unaware of, or was permitted to ignore, the fact that an inherent part of the design process is to understand how the human operator thinks and acts when interfacing with the system.

On the other hand, social scientists, and human scientists in particular, hold as a core theme that human behaviour is far more complex: determined initially by sensory and perceptual aspects, then modified by cognitive processes which are themselves subject to change.  These factors need to be addressed when modelling how a human operates a machine, or how humans interact amongst themselves to make decisions.  As discussed in Kathleen’s article, the brain is such a complex organ, and subject to such a massive range of inputs, that we are only now becoming aware of how it works and how to manipulate it.  Perhaps in the new millennium neuroscience will see advances similar to those in physics (relativity, quantum mechanics, and the understanding of atomic and sub-atomic structure, for example) during the last.

Kathleen highlights that the ethics of operating on the neural and molecular scale within the human brain, and the resultant impact this may have on the individual concerned, will be a central theme going forward.  This is especially pertinent when commercial or government interests will be in a position to manipulate these factors, and it therefore needs to be addressed well before this particular genie escapes the bottle.

Which leads back to Kathleen’s major point.  She contends that neuroscience may overtake the physical sciences, because the whole experience of consciousness will determine how the human species develops into the future.  The social/psychological/physiological sciences understand these aspects and, most importantly, understand the need for an ethical framework when addressing these matters.  So at least we will be better placed than in the current situation, where the physical scientists, who have neither of these fundamentals, seem to determine how technology develops and is applied.  Perhaps we will then have a more level research field, where social and human scientists are included at the very beginning and can (heaven forbid!) inform how technology is developed and applied to the best advantage of the human user who will ultimately interact with it directly.