Would you trust these eyes?

January 30, 2013

A recent blog post at Science 2.0 reported on a fascinating study by a team of researchers from Charles University in Prague. These researchers published in PLOS One their findings on the relationship between eye colour and perceived trustworthiness of faces.

Which of these two people looks most trustworthy?


Photos by Randen Pederson (L) and “Garrett” (R)

The team (Karel Kleisner, Lenka Priplatova, Peter Frost, and Jaroslav Flegr) showed photographs of 40 male and 40 female faces to their subjects, and asked them to rate how trustworthy the faces looked. They found a significant relationship – brown-eyed faces were perceived as more trustworthy.


Kleisner, Priplatova, Frost, and Flegr found (after controlling for “dominance” and “attractiveness”) a highly significant (p < 0.001) link between eye colour and perceived trustworthiness, in both male and female faces (image from their paper)

The genius of Kleisner et al., however, was not to leave it at that, but to repeat the experiment after altering the eye colours on the photographs. This revealed that eye colour per se had no effect. Rather, the perceived trustworthiness was linked to aspects of facial shape – aspects that normally correlate with eye colour.

They write: “brown-eyed faces tended to have a rounder and broader chin, a broader mouth with upward-pointing corners, relatively bigger eyes, and eyebrows closer to each other. This was also the pattern of a trustworthy face.”

These results are consistent with earlier work by Alexander Todorov, Sean Baron, and Nikolaas Oosterhof, who found not only that certain facial shapes are perceived as trustworthy, but that these faces generate a measurable response in the amygdala, detectable with functional magnetic resonance imaging (fMRI). I’m not sure why this brain response exists, but presumably it is one of the things which confidence tricksters exploit.


Would you trust this brown-eyed man? Frank Abagnale, whose life was portrayed in the film Catch Me If You Can (photo by “Marcus JB”)

– Tony


Emotion and Intelligence

January 14, 2013

A recent blog post in Science 2.0 refers to the 2004 book The First Idea: How Symbols, Language, and Intelligence Evolved from Our Primate Ancestors to Modern Humans by the late Stanley I. Greenspan and Stuart Shanker. Drawing particularly on personal studies of child development, Greenspan and Shanker claim that “our highest level mental capacities, such as reflective thinking, only develop fully when infants and children are engaged in certain types of nurturing learning interactions.”

They go on to argue that the various stages of child development involve an intertwined growth of emotional and cognitive skills, and that these cannot be separated.

This raises the question as to whether (strong) Artificial Intelligence is possible. Can an unemotional thinking entity, like Data in Star Trek, actually exist? Such issues are explored further in a 2002 book edited by Robert Trappl, Paolo Petta, and Sabine Payr.


Unemotional thinkers in fiction, like Star Trek’s Data, actually do display various emotions – if not, the reader/viewer would lose interest

Greenspan and Shanker’s theories also have implications for child-rearing. If they are correct, emotionally rich interactions with caregivers are essential for the development of intelligence. For example, they argue (in contrast to Noam Chomsky and Steven Pinker) that language does not develop “spontaneously,” but is critically dependent on those interactions. Greenspan and Shanker write: “A child’s first words, her early word combinations, and her first steps towards mastering grammar are not just guided by emotional content, but, indeed, are imbued with it.”


Emotionally rich interaction (photo: Robert Whitehead, 2006)

– Tony


School Shootings – Can Potential Shooter Profiles be Identified?

January 8, 2013

In light of the recent shootings in Newtown, Connecticut, new debates have been sparked on gun control and on identifying potentially unstable individuals who could commit such crimes.

An article on the Science Daily website, School Shootings: What We Know and What We Can Do, highlighted some recent research which studies past events and tragedies to build a profile of potential shooters, and asks how these individuals might be identified ahead of time. The article draws on research by Dr. Daniel J. Flannery, explaining that shooters demonstrate similar features, such as depression, low self-esteem, narcissism, and a fascination with death. However, these aspects and similarities across shootings are not strong enough to produce conclusive profiles which could allow the prevention of future tragedies.

Other research has produced similar findings. Leary, Kowalski, Smith and Philips (2003) analysed multiple shootings between 1995 and 2001. They found that depression, low self-esteem and narcissism were all present in the individuals involved in the shootings. However, the individuals also shared one further common attribute: social rejection. Social rejection alone cannot fully explain these acts of violence, as most people are exposed to social rejection at some stage in life. However, social rejection coupled with psychological problems or a fascination with death may lead to acts of violence.

Unfortunately, research in this area remains inconclusive, and specific attributes and characteristics have therefore not been identified that would allow preventative measures to be put in place to reduce the chances of such tragedies occurring again.

– Stefano


Human Sciences, Statistics, and R

January 6, 2013

The use of statistics has long been important in the human sciences. An early example is an analysis by William Sealy Gosset (alias “Student”) of biometric data obtained by Scotland Yard around 1900. The heights of 3,000 male criminals fit a bell curve almost perfectly:


Histogram © A. H. Dekker, produced using R software
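For readers wanting to experiment, a histogram like this one can be sketched in a few lines of R. The mean and standard deviation below are assumed for illustration only, not taken from Gosset’s data:

```r
# Simulate 3,000 male heights (assumed mean 166 cm, sd 6.5 cm -- these
# parameters are illustrative, not Gosset's) and overlay a fitted bell curve
set.seed(1)
heights <- rnorm(3000, mean = 166, sd = 6.5)
hist(heights, breaks = 30, freq = FALSE,
     main = "Heights of 3,000 males", xlab = "Height (cm)")
curve(dnorm(x, mean = mean(heights), sd = sd(heights)),
      add = TRUE, lwd = 2)
```

With 3,000 samples, the fitted curve hugs the histogram closely, just as in the Scotland Yard data.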

Standard statistical methods allow the identification of correlations, which mark possible causal links:


XKCD teaches us that “Correlation doesn’t imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing ‘look over there.’”

Newer, more sophisticated statistical methods allow the exploration of time series and spatial data. For example, this project looks at the spatial distribution of West Nile virus (WNV) – which disease clusters are significant, and which are merely tragic coincidence:


Distribution of significant clusters of human WNV in the Chicago region, from Ruiz et al.

SPSS has been the mainstay of statistical analysis in the human sciences, but many newer techniques are better supported in the free R toolkit. For example, this paper discusses detecting significant clusters of diseases using R. The New York Times has commented on R’s growing popularity, and James Holland Jones points out that R is used by the majority of academic statisticians (and hence includes the newest developments in statistics), R has good help resources, and R makes really cool graphics.


A really cool graph in R, using the ggplot2 R package (from Jeromy Anglim’s Psychology and Statistics Blog)

An increasing quantity of human-science-related instructional material is available for R.

Through the igraph, sna, and other packages (and the statnet suite), R also provides easy-to-use facilities for social network analysis, a topic dear to my heart. For example, the following code defines the valued centrality measure proposed in this paper:

library("igraph")

# Valued centrality: for each vertex, the average reciprocal of its
# shortest-path distance to every other vertex. Unreachable vertices
# have distance Inf, and 1/Inf is 0 in R, so disconnected components
# are handled gracefully.
valued.centrality <- function (g) {
  recip <- function (x) if (x == 0) 0 else 1/x   # distance 0 (self) counts as 0
  f <- function (r) sum(sapply(r, recip)) / (length(r) - 1)
  apply(shortest.paths(g), MARGIN = 1, f)
}

This definition has the advantage of allowing disconnected network components, so that we can use these centrality scores to add colour to a standard plot (using the igraph package within R):


Social network diagram, produced using R software, coloured using centrality scores
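To see why the definition tolerates disconnection, here is the same reciprocal-distance calculation applied to a hand-coded shortest-path matrix (base R only; the four-node network below, with node 4 isolated, is invented for illustration). The scores can then be binned into a colour palette for plotting:

```r
# The same reciprocal-distance calculation as valued.centrality above,
# applied directly to a shortest-path matrix rather than an igraph object
recip <- function (x) if (x == 0) 0 else 1/x
centrality.from.distances <- function (d) {
  f <- function (r) sum(sapply(r, recip)) / (length(r) - 1)
  apply(d, MARGIN = 1, f)
}

# Invented example: nodes 1-3 form a path, node 4 is unreachable (Inf)
d <- rbind(c(0,   1,   2,   Inf),
           c(1,   0,   1,   Inf),
           c(2,   1,   0,   Inf),
           c(Inf, Inf, Inf, 0))
scores <- centrality.from.distances(d)   # 0.5, 0.667, 0.5, 0 -- isolate scores zero

# Bin scores into five colours, ready for vertex.color in plot()
cols <- heat.colors(5)[cut(scores, breaks = 5, labels = FALSE)]
```

Because 1/Inf is 0, unreachable pairs simply contribute nothing, rather than breaking the calculation as ordinary closeness centrality would.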

– Tony


Suicide and the Military

December 18, 2012

A recent (2011) report by Margaret Harrell and Nancy Berglass (“Losing the Battle: The Challenge of Military Suicide”) looks at suicide in the U.S. military.


U.S. military active duty suicide rates, compared to general U.S. population (from Harrell and Berglass)

The U.S. military has seen an increase in suicide among its serving and former personnel, with suicide now killing more troops than enemy fire does. Junior enlisted personnel appear to be most at risk. Among U.S. veterans, it is estimated that there is one suicide death every 80 minutes and, although only 1% of Americans have seen military service, veterans account for 20% of U.S. suicides. The U.S. Army in particular has seen suicides increase markedly since 2004.

From Vung Tau, riding Chinooks, to the dust at Nui Dat,
I’d been in and out of choppers now for months.
But we made our tents a home, VB and pinups on the lockers,
And an Asian orange sunset through the scrub.

And can you tell me, doctor, why I still can’t get to sleep?
And why the Channel Seven chopper chills me to my feet?
And what’s this rash that comes and goes,
Can you tell me what it means?
God help me, I was only nineteen.
– Redgum, I Was Only 19

Harrell and Berglass make a number of recommendations to help address this suicide problem, including that the U.S. Army establish a “unit cohesion period” on return from deployment. Stress factors among soldiers include encountering dead bodies – and this is a stress factor which may also apply to troops on humanitarian relief missions. A 2003 book notes that “there is growing evidence that the stress of peace support operations can be as psychologically damaging as conventional warfare.” Given these stresses, addressing the problem of military suicide will require recognising post-traumatic stress disorder (PTSD) as a “real injury.” Various helpful recommendations for dealing with PTSD have also been made in the past.

According to the Centers for Disease Control and Prevention (CDC) there are over 36,000 suicides in the U.S. each year (and almost half a million cases of self-inflicted injury). Risk factors include stressful life events and feeling alone – two factors common among post-deployment military personnel who have left their unit.

Australian Defence Force (ADF) personnel appear to think about suicide more than the general community. According to one study, 3.9% of the ADF had suicidal ideation. However, comparing the 8 suicide deaths per year of ADF personnel against the 2,300 suicide deaths per year in the general Australian community – allowing for the relative sizes of the two populations – shows that ADF personnel are slightly less likely to die by suicide than their civilian counterparts. This may indicate that ADF suicide prevention strategies are having some positive effects.


Arlington National Cemetery, Virginia, USA

Some of the recommendations from Harrell and Berglass also appeared in a 2010 U.S. Military Task Force report. That report highlighted in particular the need for reducing the stigma of soldiers seeking help, and for removing cultural and organisational barriers to doing so. RAND has also produced a lengthy report on preventing suicide in the U.S. military. In its July 2012 special issue on the topic, Time Magazine printed some helpful advice for wives and husbands of military personnel – see here. The Australian Defence Force has a fact sheet here.

In the end, though, suicide is everyone’s business, for “no man is an island … every man is a piece of the continent, a part of the main … any man’s death diminishes me.”


U.S. Army Suicide Prevention Poster (2011)

This post is dedicated to all my friends who wear, or who have worn, a uniform, and to everyone who has been affected by suicide.

– Tony


Driver Distraction – are we being distracted away from real solutions?

October 29, 2012
Person using cell phone while driving. (Photo credit: Wikipedia)

 

Driver distraction is a major cause of accidents on our roads.  More research into driver distraction is therefore welcome.

 

Traditional media available in vehicles, such as radios and entertainment systems, can affect visual attention, and the current use of GPS navigation assistance and personal communication devices has been repeatedly shown to interfere with the primary driving vigilance and motor control task.  Such research has led road authorities to restrict their use, particularly with regard to mobile phone use and texting.

 

The introduction of Intelligent Transport Systems has the potential to overload the driver if such systems are not tailored to driver workload according to location, traffic density and other ambient conditions.  A more accurate and layered approach to driver workload and attention level can provide a structure upon which information can be conveyed appropriately and distractions reduced or minimised.

Hume Freeway looking south towards Victoria, running parallel to Albury railway station. (Photo credit: Wikipedia)

 

There are, however, many instances where the use of communication devices in vehicles may actually help vigilance (for example, in long distance driving), and the ability to operate a mobile office whilst driving has undoubted productivity benefits.  The problem is that the driving environment can change in milliseconds, potentially overloading the driver’s available perceptual and attentional resources.

 

The question of appropriate speed limits in long distance driving could also be re-addressed.  There has been a spate of single-driver transport accidents on the Hume Freeway recently which have had tragic (and potentially disastrous) consequences. Apart from driver fatigue and scheduling issues, is the posted speed limit too low to maintain sufficient driver arousal?  Long, boring drives at inappropriately low speeds could encourage further distraction (such as use of mobile devices), perhaps even increasing the danger.  The severity of an incident at 100kph is, after all, still as catastrophic as one at 110 or 120kph.

If speed kills, then surely it follows that the only safe speed is zero. There is a balance between the efficient movement of goods and the consequences of error.  Does this mean that interstate transport needs to travel at 40kph, so that any incidents that occur are relatively minor? Would this result in more incidents, because drivers would be bored out of their minds whilst blowing out transport schedules? There needs to be an open discussion by professionals in the field, assessing all of the risks, rather than the current assumption that slowing traffic is the only solution.  The posted speed, relative to the road design and condition, could be reducing driver arousal and performance below the optimal range.

 

60KM/H Speed limit sign in Australia. (Photo credit: Wikipedia)

The Human Machine Interface (HMI) remains central to safe and effective vehicle operation, as the information it provides allows the driver to retain effective control of the vehicle and helps influence, or even determine, their behaviour. Information flow to the driver must be intuitive, readily understood, timely, and responsive to the driver’s attentional and distraction state.  Unfortunately, many have very little understanding of this requirement. For example, presentation of the “bells and whistles” mentioned in this proposed level crossing warning system may actually distract a driver at the worst possible time and cause more problems than the technology is trying to solve. Human scientists understand the many facets that determine the best way of presenting information so that it is perceived, recognised and acted upon in an optimal manner.  If the proponent of this system had engaged human factors expertise in the first place, or heeded their advice, he might not have made such ill-considered comments. Perhaps he will get some guidance on the importance of listening to human factors professionals when he presents his data at the upcoming ITS conference in Vienna. It would be good to have critical assessment, by any human scientists attending that conference, of any actual (rather than derived or contrived) interactions that occurred during the trials of this system.

 

The area of human interaction with technology is complex, and simplistic approaches (such as further unrestrained visual or auditory “bells and whistles”) will rarely be the best solution.  All the more reason to design and test proposals from a human science standpoint, heed the results, and incorporate them into any proposed system.  This underscores the importance of the many respected facilities that make human science input a keystone of their research into ITS applications, vehicle design and driver behaviour.  Hopefully their research findings are weighted appropriately (i.e. seen as valid and reliable) by the governing authorities, as compared to those of the “bell and whistle” variety.

The future world of neuroscience

October 3, 2012

Drawing of the human brain

I have just read a post on the increasing ability of neuroscientists to image and understand the brain.  The author, Kathleen Taylor, makes the very interesting observation that research into the brain may in future surpass the physical sciences in importance.  She is probably biased, given her neuroscience background, but I feel that she has highlighted some fundamental questions about how humans will interact with the world (or perhaps the universe?) and with each other in the future.

She makes the point that the physical sciences have largely been insulated from how the knowledge gained from their research is used, given that there is no human input into their experimentation.  The research is largely introspective, or governed by mathematics or similarly prescriptive methods.  The potential consequences of the research are not addressed at any point (at least in a formal sense), as there is little input from anyone apart from peers and supervisors with a similar research background.

The difference between the physical and social sciences has been commented on in previous blog posts.  Physical scientists, although brilliant in their own field, tend to make assumptions as to how humans fit into their models and how their research can be applied.  Human behaviour is commonly included as a probability, which then influences the remainder of the postulated model, providing results which do not necessarily reflect what actually happens in the real world.  These discrepancies are either ignored or assumed to be just part of a distribution of human behaviour.  A system is then designed using such flawed thinking, and typically it is the poor old human operators who have to adapt and make up for such sloppy design when they have to make things work.

Alternatively, these operators are seen as the problem when the system is subsequently audited, as the ‘brilliant’ system design is hardly ever tested or seen to be at fault.  At last there are glimmers of hope, as safety management systems are identifying that these ‘brilliant’ systems are more often than not the cause of many failings, not just from the operator perspective.  So the ‘human error’ which historically has almost always been cited as the cause of an accident is sheeted home to where it belongs in the first place: the arrogant human who designed the system, who was either unaware of, or was permitted to ignore, the fact that an inherent part of the design process is understanding how the human operator thinks and acts when interfacing with the system.

On the other hand, social scientists, and human scientists in particular, hold as a core theme that human behaviour is far more complex: determined initially by sensory and perceptual aspects, then modified by cognitive processes which are themselves subject to change.  These factors need to be addressed when modelling how a human operates a machine, or how humans interact amongst themselves to make decisions.  As discussed in Kathleen’s article, the brain is such a complex organ, subject to such a massive range of inputs, that we are only now becoming aware of how it works, and how to manipulate it.  Perhaps in the new millennium neuroscience will see advances similar to those in physics (relativity, quantum mechanics, and the understanding of atomic and sub-atomic structure, for example) during the last.

Kathleen highlights that the ethics of operating on the neural and molecular scale within the human brain, and the resultant impact this may have on the individual concerned, will be a central theme going forward.  This is especially pertinent when entities such as commercial or government interests will be in a position to manipulate these factors; it therefore needs to be addressed well before this particular genie escapes the bottle.

Which leads back to Kathleen’s major point.  She contends that neuroscience may overtake the physical sciences, as the whole experience of consciousness will determine how the human species develops into the future.  The social/psychological/physiological sciences understand these aspects and, most importantly, understand the need for an ethical framework when addressing these matters.  So at least we will be better placed than in the current situation, where the physical scientists, who have neither of these fundamentals, seem to determine how technology develops and is applied.  Perhaps we will then have a more level research field, where social and human scientists are included at the very beginning and can (heaven forbid!) inform how technology is developed and applied to best advantage for the human user who will ultimately interact with it.


Epistemology and Simulation

September 28, 2012

Epistemology is the study of knowledge. What is knowledge exactly? Well, I’m happy (for reasons argued elsewhere) to use the definition going back to Plato, that of justified true belief. For example, I know – or at least believe – that there’s a tree growing outside my window.


The tree outside my window (my photo)

Ultimately, this belief is grounded in the way the human visual system works, and on the way in which my perception of the tree triggers remembrance of trees past. All this falls within the scope of cognitive psychology, and experimental work in this area has told us a great deal about how human perception works.


The human visual system, from Gray’s Anatomy, 1918

Is my belief true? You’ll have to judge that for yourself (although the photograph may help convince you). Is it justified? Well, that’s the domain of philosophy – am I justified in trusting my senses?

In his Confessions, Saint Augustine takes “seven and three are ten” as a touchstone of truth, and in his City of God, he writes “the man who says that seven and three are eleven, says what cannot be true under any circumstances.” I agree with him. Here again, my belief falls within the scope of cognitive psychology (and developmental psychology, since Cuisenaire rods helped convince me of this back in kindergarten).


Seven and three are ten

Is my belief true? Once again, judge that for yourself. Is it justified? Well, that’s the domain of mathematics this time (and, as an older child, I learned to prove 7 + 3 = 10 mathematically).
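For the curious, that schoolroom fact is also machine-checkable. In the Lean proof assistant, for instance, the claim reduces to a computation on the natural numbers:

```lean
-- 7 + 3 = 10 follows by unfolding the definition of addition
-- on the natural numbers; `rfl` checks it by computation.
example : 7 + 3 = 10 := rfl
```

In a sense, `rfl` is the grown-up version of the Cuisenaire rods: both justify the belief by construction rather than by authority.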

Shortly, I hope to attend the 5th Epistemological Perspectives on Simulation (EPOS) Conference at Trinity University (San Antonio, Texas). We will be exploring whether it is possible to know things (especially things about social phenomena) as the result of a simulation (and, if so, how). It promises to be an interesting event. Papers from two of the four past instances of this conference series can be found here (2004) and here (2008).


(Public domain photo)

– Tony


(Religion) Lost in Space

September 24, 2012
Luna 9, denomination 2 Forint (Photo credit: Wikipedia)

The concept that any future interstellar exploration should be free of organised religion has recently been discussed.  Some have expressed the view that religion is toxic to human interaction and cooperation, as evidenced by many unsavoury incidents throughout history, and as is currently being witnessed in the reaction to the YouTube video denigrating the prophet Mohammed.

Humans have many attributes which may be positive or negative depending on the context.  Adaptability and imagination are very valuable human abilities, but these skills are neither required nor perhaps desirable in a situation that calls for heuristic thinking.  Conversely, applying a flawed or inappropriate heuristic can have disastrous consequences, or prevent a more appropriate paradigm from being developed.

A human will always be influenced in how they act and think by their prior experience. Even the application of the scientific method cannot eliminate these influences. The ability to assess complex data, for example, can be affected by education, training, aptitude and a host of other factors, which can vary according to the information being assessed.  This probably explains the range of specialities within disciplines such as medicine, physics, chemistry, engineering and psychology.  With regard to future space exploration, TV depictions such as Star Trek portray a range of specialists in the crew, on the assumption that all of these skills will be required to fully comprehend the magnitude and complexity of space.

Given that previous experience or belief systems are an inherent part of the human condition, it seems logical that a religious aspect will also then be represented within the crew of any intergalactic mission if it is to be truly representative of the human species.  And as bigoted or fundamentalist religious views are by definition extreme values within a normal population, it is highly unlikely that these would be represented to any significant statistical level.

With regard to positive and negative attributes, religion has been blamed for many ills, and much of this blame is justified.  However, religion should also be recognised for its many positive aspects, such as altruistic value systems, beneficence, the intrinsic value of individuals regardless of race, social standing or wealth, and the existence and importance of a fundamental natural order, with the concepts of stewardship and responsible use of resources that derive from it. Many advances in science, in astronomy and mathematics for example, were made possible by the religious systems of the time, although some of the authorities subsequently disputed the findings.  Is this so different from what is currently occurring, where the evidence supporting climate change is disputed by certain sections of a secular society without any obvious underlying religious or philosophical rationale?  It seems that belief systems generally, not just religious ones, are the root cause of disagreement.  This can be beneficial in the search for scientific truth and the progression of understanding – perhaps conflict is a positive human attribute, as long as it is confined within an intellectual framework.

So the discussion regarding the crew mix for future space exploration missions should expand to include all human experience and belief systems.  Perhaps religion can help unlock the mysteries of the human mind and the continuing quest of the species to explore and understand the universe.  All of which relates back to human science.


The rise of the drones

September 17, 2012
Kawasaki KAQ-1 Drone at Castle Air Museum (Photo credit: Wikipedia)

The ever-increasing capability and availability of drone technology will have a major impact on defence and law-enforcement operations in the future.  Where the capabilities of platforms used to be defined by the class or type of vessel able to deploy various assets (in naval operations, for example, a carrier versus an offshore combatant vessel), smaller platforms may soon be able to deploy an aerial capability which could previously only be provided by larger and far more expensive vessels.

As noted in the article, this can have far-reaching consequences for navies, where the majority of tasks are routine patrol and ‘constabulary’ operations protecting sea lanes and territorial integrity.  With drone technology, a single smaller platform could perform tasks which currently require several more capable and expensive assets (for example, patrolling a shipping route subject to piracy).

A telling point made in the article is the close involvement of the human in the loop.  As with other advances in automation, the command and control function remains within the human domain.  As stated: “…there will inevitably be a human in the operational Observe, Orient, Decide and Act (OODA) loop – be they a remote operator, data handler or decision maker within any of a number of mission sets.”

So the design of the HMI will determine how successful this shift in technology will be.  As has been seen in Afghanistan, the ability of the remote operator of a drone aircraft to gain and maintain situational awareness, and to perform their mission without unintended consequences, will greatly depend on the amount, type and quality of information available to them, and on the range of tasks they need to perform.  Many combat aircraft have a crew of two, due to the separate and demanding piloting and situational-awareness/engagement tasks, and military drone strike operations seem to reflect this crewing model. Perhaps this model is a historical legacy which may change in the future, as drones dispense with the constraint of having to fit the aircrew into the platform.

This may cause a shift in emphasis within the Human Factors and Ergonomics discipline.  A lot of effort was traditionally expended on the physical, anthropometric ergonomics of the human in the loop, probably because it was so obvious. For example, range of movement, posture and so on within a cockpit could be calculated, and the 95th percentile adopted as a standard which could then be used to determine interaction within the crew space available in the airframe. As we all know, engineers love standards, so perhaps this aspect was pursued to the detriment of equally or possibly more important aspects of the human/machine interface.

Computer Workstation Variables

Standards cannot be so readily applied to the much more esoteric aspects of neurological interaction with a system. For example, although Multiple Resource Theory provides a very good framework for predicting and testing how an operator will interface with an HMI, it doesn’t provide the level of certainty available from physical ergonomic models. Each aspect needs to be tested according to the many variables which could arise, and to the neural adaptability inherent in humans which makes them so important to the command and control function.  That’s why the non-physical human interaction field is so interesting to us practitioners (and perhaps perplexes many physical scientists, who cannot seem to grasp the notion that humans cannot be treated as a simple probability or linear contributor in their decision models).

So while drone technology will enhance capability, it will only do so effectively if there is a corresponding paradigm shift in how the interface is designed, incorporating the more difficult ‘neural ergonomic’ aspects described above.  Perhaps we can finally move away from the tyranny of standards which are sometimes adopted without further thought for the equally important sensory, perceptual and cognitive aspects which we pesky Human Factors types are constantly trying to highlight to our seemingly dullard peers in other fields – sometimes with, but unfortunately many times without, success.