Simulation to help driver training?

October 10, 2012

Researchers are using driving simulators to inform new drivers of the pitfalls of texting whilst driving.  Use of mobile communication devices is probably approaching the danger level of driving under the influence of drugs, alcohol or fatigue, and is fast becoming one of the major problems on our roads.

I have come across patents for devices which display SMS, Facebook and Twitter feeds on the windscreen of a vehicle.  I am amazed that the inventors have no perception of the dangers they are advocating in their quest for a buck.

The main problem is that we can multi-task within the cognitive capacity we possess, and in many driving situations we are able to ‘get away with it’.  We are not aware of how this affects our driving ability until an emergency situation arises which approaches or exceeds our sensory, perceptual, cognitive or motor control limits.  This can happen in milliseconds.

Unfortunately, when allied to the design limitations of the vehicles, this can be deadly.  For example, a major manufacturer has adopted an LCD head-down display for all of their vehicles.  As I have posted previously, this requires the driver to take their eyes off the road and use focal vision to control the fine motor movement needed to touch the right portion of the screen.  Pretty poor design if you ask me – more likely a cost control measure without any idea of the consequences.  Obviously no human science input there, similar to BMW’s original iDrive. Allied to this is a curious standard that has been adopted: that a function should not require a driver to take their attention away from the driving task for more than 3 seconds.  Wouldn’t a better design allow the driver to control things without having to take their eyes off the road at all? It is telling that BMW’s current system actually incorporates haptic feedback to achieve this result.
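To put that 3-second figure in perspective, here is a back-of-the-envelope sketch of how far a vehicle travels while the driver’s eyes are off the road. The speeds are assumed values for illustration only.

```python
# Distance travelled "blind" during an eyes-off-road glance.
# The speeds below are illustrative assumptions, not taken from any standard.

def blind_distance_m(speed_kmh: float, glance_s: float) -> float:
    """Metres covered at speed_kmh during glance_s seconds of inattention."""
    return speed_kmh / 3.6 * glance_s

for speed in (50, 100):                        # assumed urban and highway speeds
    d = blind_distance_m(speed, glance_s=3.0)  # the 3-second criterion
    print(f"{speed} km/h for 3 s -> {d:.0f} m travelled unsighted")
# 50 km/h -> ~42 m; 100 km/h -> ~83 m
```

At highway speed the vehicle covers most of a football field before the driver looks up again, which makes the 3-second allowance look generous indeed.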

Perhaps we need some special driver training and further reinforcement on the dangers of texting whilst driving.  Or perhaps training should be informed by the latest research in human behaviour, arousal and performance monitoring to help determine when the driver can safely perform secondary tasks.  Given that vehicles now communicate with mobile devices using Bluetooth or similar protocols, any SMS, Facebook or Twitter feeds could be disabled whilst the vehicle is moving, or tailored to operate only when the vehicle is stationary, thereby reducing distraction, which is acknowledged as a major cause of accidents. This is where ITS can assist if it is designed to integrate with the human, who will remain in executive control of the vehicle for some time yet, despite the advances in automation.
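As a rough illustration of the sort of logic involved – a minimal sketch only, in which the speed threshold, message source and functions such as deliver() are invented placeholders rather than any real vehicle or phone API:

```python
# Sketch: hold back SMS/social notifications while the vehicle is moving.
# The speed feed and delivery hooks are hypothetical stand-ins for whatever
# the paired phone and vehicle actually expose over Bluetooth.

STATIONARY_THRESHOLD_KMH = 5.0   # assumed; treat creeping in traffic as "moving"

pending: list[str] = []          # messages deferred until the vehicle stops

def deliver(message: str) -> None:
    print(f"Notify driver: {message}")          # placeholder for the real HMI output

def on_incoming_message(message: str, vehicle_speed_kmh: float) -> None:
    """Deliver a message only when the vehicle is effectively stationary."""
    if vehicle_speed_kmh > STATIONARY_THRESHOLD_KMH:
        pending.append(message)                 # defer rather than distract
    else:
        deliver(message)

def on_vehicle_stopped() -> None:
    """Flush deferred messages once the vehicle is parked or stopped."""
    while pending:
        deliver(pending.pop(0))

on_incoming_message("Lunch?", vehicle_speed_kmh=60.0)   # deferred
on_vehicle_stopped()                                    # -> "Notify driver: Lunch?"
```

The point is not the code itself but where the decision sits: the system, not the driver, arbitrates when a secondary task is allowed to compete for attention.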

All of which demonstrates that human scientists should be involved at the outset in the design and development of the HMI, and inform the powers that be on how the various technologies should be managed to reduce the dangers on our roads.  It would certainly be an improvement on the current situation.


The rise of the drones

September 17, 2012

Kawasaki KAQ-1 Drone at Castle Air Museum (Photo credit: Wikipedia)

The ever-increasing ability and availability of drone technology will have a major impact on defence and law-enforcement operations in the future.  Where the capability of a platform used to be defined by the assets it could deploy within its class or type of vessel (in naval operations, for example, a carrier versus an offshore combatant vessel), smaller platforms may soon be able to deploy an aerial capability which could previously only be provided by far larger and more expensive vessels.

As noted in the article, this can have far-reaching consequences for navies where the majority of tasks are routine patrol and ‘constabulary’ operations of protecting sea lanes and territorial integrity.  With drone technology, a single smaller platform could perform tasks which currently require several more capable and expensive assets (for example, the patrol of a shipping route subject to piracy).

A telling point made in the article is the close involvement of the human in the loop.  Similar to other advances in automation, the command and control function remains within the human domain.  As stated, “…there will inevitably be a human in the operational Observe, Orient, Decide and Act (OODA) loop – be they a remote operator, data handler or decision maker within any of a number of mission sets.”

So the design of the HMI will determine how successful this shift in technology will be.  As has been seen in Afghanistan, the ability of the remote operator of a drone aircraft to gain and maintain situational awareness and perform their mission without unintended consequences will depend greatly on the amount, type and quality of information available to them and the range of tasks they need to perform.  Many combat aircraft have a crew of two because of the separate and demanding piloting and SA/engagement tasks, and military drone strike operations seem to reflect this crewing model. Perhaps this model is a historical legacy which may also change in the future as drones dispense with the constraint of having to fit the aircrew into the platform.

This may cause a shift in emphasis within the Human Factors and Ergonomics discipline.  A lot of effort was traditionally expended on the physical, anthropometric ergonomics of the human in the loop, probably because it was so obvious. For example, range of movement, posture and so on within a cockpit could be calculated and the 95th percentile adopted as a standard, which could then be used to determine interaction within the crew space available in the airframe. As we all know, engineers love standards, so perhaps this aspect was pursued to the detriment of equally or possibly more important aspects of the human/machine interface.

Computer Workstation Variables
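As a concrete illustration of how such a physical standard is derived – assuming, purely for the sake of the example, that stature is normally distributed with invented population figures:

```python
# Sketch: deriving 5th/95th percentile design values from anthropometric data.
# The mean and standard deviation are illustrative assumptions only.
from statistics import NormalDist

stature = NormalDist(mu=175.0, sigma=7.0)   # assumed stature distribution, cm

p05 = stature.inv_cdf(0.05)   # lower bound, often used for reach
p95 = stature.inv_cdf(0.95)   # upper bound, often used for clearance

print(f"5th percentile:  {p05:.1f} cm")   # ~163.5 cm
print(f"95th percentile: {p95:.1f} cm")   # ~186.5 cm
```

A cockpit or workstation dimension derived this way is easy to write into a standard, which is exactly why engineers are so fond of it.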

Similar adoption of standards cannot be readily applied to the much more esoteric aspects of neurological interaction with a system. For example, although it provides a very good framework to predict and test how an operator will interface with an HMI, Multiple Resource Theory doesn’t provide the level of certainty available from physical ergonomic models. Each aspect needs to be tested according to the many variables which could arise and the neural adaptability inherent in the human – the very adaptability which makes humans so important to the command and control function.  That’s why the non-physical side of human interaction is so interesting to us practitioners (and perhaps perplexes many physical scientists who cannot seem to grasp the notion that humans cannot be treated as a simple probability or linear contributor in their decision models).
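For readers unfamiliar with the theory, here is a toy sketch of the flavour of a Multiple Resource Theory interference estimate. It is not Wickens’ actual computational model, and the weights are invented; it simply shows that overlap in modality, code and stage is the thing being scored.

```python
# Toy sketch of Multiple Resource Theory-style interference scoring.
# Each task is described by the resources it loads; overlap on a dimension
# adds to a conflict score. The weights below are invented for illustration.
from typing import NamedTuple

class Task(NamedTuple):
    modality: str   # "visual" or "auditory"
    code: str       # "spatial" or "verbal"
    stage: str      # "perception" or "response"

def conflict(a: Task, b: Task) -> float:
    """Higher score = more predicted interference when the tasks are time-shared."""
    score = 0.0
    if a.modality == b.modality:
        score += 1.0    # competing for the same sensory channel
    if a.code == b.code:
        score += 0.5
    if a.stage == b.stage:
        score += 0.5
    return score

driving = Task("visual", "spatial", "response")
texting = Task("visual", "verbal", "response")
voice_prompt = Task("auditory", "verbal", "perception")

print(conflict(driving, texting))       # 1.5 - heavy visual/response overlap
print(conflict(driving, voice_prompt))  # 0.0 - little predicted overlap
```

Even a toy like this shows why a visually demanding control competes with driving far more than an auditory or haptic channel does; the real theory is far richer, which is precisely why it resists being reduced to a single pass/fail number.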

So while drone technology will enhance capability, it will only do so effectively if there is a requisite paradigm shift in how the interface is designed to incorporate the more difficult ‘neural ergonomic’ aspects described above.  Perhaps we can finally move away from the tyranny of standards that are sometimes adopted without further thought for the equally important sensory, perceptual and cognitive aspects which we pesky Human Factors types are constantly trying to highlight to our seemingly dullard peers in other fields – sometimes with, but unfortunately many times without, success.


Vehicle Displays

September 10, 2012

Driver in a Mitsubishi Galant using a hand held mobile phone violating New York State law. (Photo credit: Wikipedia)

I went to a workshop the other day and heard that a major vehicle manufacturer had adopted touch screen control panels for all of their products.  The speaker had been employed to study the human factors associated with these. His talk was very disturbing – drivers needed fine motor control, and therefore considerable visual attention, to press the correct area of the screen for their selection, especially if the vehicle was pitching due to the road surface or other conditions.  It made me wonder what bright spark in the company had decided that these displays were a good way to go.  At a time when we are trying so hard to reduce mobile phone and texting use because of the clear and significant problems they pose to road safety, we have a vehicle manufacturer deciding, through a lack of understanding of the issues involved, to integrate something into the vehicle which will undo everything that road safety authorities have so far achieved in this area.
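Fitts’s law gives a feel for why small touch targets demand so much attention. The sketch below uses the standard Shannon formulation, but the coefficients and target geometry are invented, so treat the numbers as purely illustrative.

```python
# Sketch: Fitts's law (Shannon formulation) applied to in-vehicle touch targets.
# MT = a + b * log2(D/W + 1). The coefficients a, b and the geometry are
# assumed values for illustration, not measured in-vehicle data.
from math import log2

def movement_time_s(distance_mm: float, width_mm: float,
                    a: float = 0.2, b: float = 0.15) -> float:
    """Estimated time to acquire a target of width_mm at distance_mm."""
    return a + b * log2(distance_mm / width_mm + 1)

large_control = movement_time_s(distance_mm=300, width_mm=40)  # knob-sized target
small_icon = movement_time_s(distance_mm=300, width_mm=8)      # dense touch-screen icon

print(f"Large target: {large_control:.2f} s")   # ~0.66 s
print(f"Small icon:   {small_icon:.2f} s")      # ~0.99 s
```

On a moving, pitching vehicle the effective target width shrinks further, which was the speaker’s point: every extra fraction of a second of target acquisition is time spent with focal vision off the road.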

I thought that we had learned from the initial BMW iDrive that technology for its own sake is not necessarily the way to go.  It speaks volumes that BMW now have a much-enhanced vehicle control system which includes haptic feedback, so that the impact on visual resources is nowhere near that of the original design.  And that’s good: the more visual attention is focused on the road, the safer all users will be (put it this way – if a driver is not looking at the external visual field, there is no way that they can perceive and react to a potentially dangerous situation).

It also called to mind a conversation with an engineer who was working on electric vehicles.  He said that they would incorporate noise into the car to emulate the typical sound of current cars, and insisted that it was the only way to retain safety for pedestrians. It reminded me of the man with a red flag who used to signal the approach of ‘horseless carriages’ when they were first introduced in the late nineteenth century.  Why would you introduce noise into the environment when it may not be necessary – surely quietness is one of the advantages of electric vehicles?  Imagine a city with substantially less road noise – and perhaps one that is more liveable as a result.

One disadvantage, of course, is that the auditory warning provided to pedestrians and other road users would not be present, but I’m sure that we have the technology to overcome this.  The almost ubiquitous use of entertainment devices by commuters effectively attenuates these auditory cues in any case, as has been tragically illustrated by pedestrians killed after stepping out in front of approaching vehicles whilst listening to music on their iPods. However, DSRC network technology could readily provide warning information to pedestrians if it is set up correctly and integrated with the mobile communication networks.  Of course, there would need to be considerable human factors input so that any such system is designed properly.
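By way of illustration, here is a minimal sketch of the kind of proximity alert a DSRC-style vehicle-to-pedestrian link might drive. The message handling, threshold and alert_pedestrian() hook are all invented placeholders; real V2P systems are considerably more involved.

```python
# Sketch: vehicle-to-pedestrian proximity warning over a DSRC-style link.
# Positions are in metres and velocities in metres per second; the threshold
# and hooks are assumptions for illustration only.
from math import hypot

WARN_TIME_S = 4.0   # assumed: warn if the vehicle could arrive within 4 seconds

def time_to_conflict(veh_xy, veh_v_xy, ped_xy) -> float:
    """Very crude time until the vehicle reaches the pedestrian's position."""
    dx, dy = ped_xy[0] - veh_xy[0], ped_xy[1] - veh_xy[1]
    distance = hypot(dx, dy)
    speed = hypot(*veh_v_xy)
    return distance / speed if speed > 0 else float("inf")

def alert_pedestrian(text: str) -> None:
    print(text)   # stand-in for pausing the music and vibrating the handset

def on_position_broadcast(veh_xy, veh_v_xy, ped_xy) -> None:
    """Called for each vehicle position report received by the pedestrian's phone."""
    if time_to_conflict(veh_xy, veh_v_xy, ped_xy) < WARN_TIME_S:
        alert_pedestrian("Vehicle approaching - look up!")

on_position_broadcast(veh_xy=(0, 0), veh_v_xy=(15, 0), ped_xy=(50, 0))  # ~3.3 s away -> warns
```

The interesting human factors question is how such a warning should be presented so that it cuts through the very entertainment devices that attenuated the auditory cues in the first place.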

I suppose that all of these examples illustrate the importance of that human factors input.  It would have been great if the vehicle manufacturer described at the beginning of this post had taken the step of actually testing their idea from a human perspective prior to making such a retrograde decision.  We now have vehicles which inherently create a problem similar to the mobile phone and texting problems we are trying so hard to overcome – a safety time bomb in each of the vehicles produced by this company, not unlike faulty brakes or steering in its potential effect on road safety.

One can only hope that the engineers, accountants and marketers who seem to rise to the top of these companies will realise the importance of fundamental human factors in their future products. Not just aspects such as usability testing, but the integral way that humans sense, perceive and process information.  Perhaps we can then apply a safety systems approach to road use and reap the benefits of eliminating the contributors to potential incidents (such as poor vehicle controls) before they occur.


The passing of Neil Armstrong

August 28, 2012


The passing of Neil Armstrong has brought back memories of the historic Moon landing in 1969.  Many who are old enough will remember the awe and excitement of seeing the Apollo 11 mission on the grainy TVs of the day.  I remember as a young boy that my school let the kids out early to view it and I recall peering through shop windows at the landing on the display TVs which were everywhere due to the significance of the event.  Many people had the same idea – it was standing room only outside the local store.

The various articles praise President Kennedy for the original vision, and the scientists and engineers who made it happen. But, as always, it was the human astronauts who carried out the mission, and I recall reading that Armstrong actually took control of the Lunar lander to steer it to its safe resting position after realizing that the planned location was unsuitable.

This poses the question of what role humans will play in future manned missions into space. Will they just be cargo, with the autonomous spaceship taking them where it is programmed to go? Or will the specially trained commander of the mission and their crew do a similar, if higher-tech, version of what Armstrong did and have the final say in where they go?

A few weeks ago, I posted a comment on the issues around pilots becoming flight managers rather than retaining ultimate control of the aircraft.  There is a current discussion about the ramifications of this for the skills of pilots and their ability to recover a situation if the automation fails for whatever reason. It seems that there is always a problem in striking a balance between automation and human control.  In many circumstances we get it right, but there is still a view that humans should be excluded from decision processes.  I think a better way to go is to provide the humans in executive control with the information they require to make the right decisions.  Robbing them of this basic situational awareness is a typical error in automation, and the ramifications can be catastrophic.

So, as always, we need to ensure that we provide the right information to the human in the loop, at the right time, and in the right format.  That’s where human factors professionals can help.


Neil Armstrong, Apollo 11 (Photo credit: Wikipedia)


Are Security Questions a Joke? Or is the way the Systems are Designed the Real Joke?

August 9, 2012

Security questions (Photo credit: janetmck)

I read a great article the other day on the threat that password security questions pose to computer security.

I too have been quite amused by the poorly designed questions which purport to help you if you forget your login information for a site.  Frank Voisin suggests a few ideas to make them more applicable.

However, the second item jarred with me – “Applicable: the question should be possible to answer for as large a portion of users as possible (ideally, universal).”

Why?

I would have thought that the primary (and only) function was to have something which was individual to the person involved.

Now I’m only a human factors scientist, but my training suggests that we should ask the individual to design their own questions.  Sure, give them some advice and make the process as intuitive as possible, but give them the ability to make it as individual as they like – surely that’s the whole point!  After all, this information is only kept in a secure database to be accessed as the need arises.
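As a minimal sketch of what letting users define their own question might look like behind the scenes – the schema is invented, the ‘database’ is just a dictionary, and SHA-256 is used only for brevity where a real system would use a deliberately slow password hash with rate limiting:

```python
# Sketch: storing a user-defined security question with a salted, hashed answer.
# Standard library only; the dict stands in for the secure database.
import hashlib
import secrets

users: dict[str, dict[str, str]] = {}

def set_security_question(user: str, question: str, answer: str) -> None:
    """Store the user's own question; keep only a salted hash of the answer."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + answer.strip().lower()).encode()).hexdigest()
    users[user] = {"question": question, "salt": salt, "hash": digest}

def verify_answer(user: str, answer: str) -> bool:
    record = users[user]
    digest = hashlib.sha256(
        (record["salt"] + answer.strip().lower()).encode()).hexdigest()
    return secrets.compare_digest(digest, record["hash"])

set_security_question("alice", "Name of my first boat?", "Sea Biscuit")
print(users["alice"]["question"])              # prompt shown at recovery time
print(verify_answer("alice", "sea biscuit"))   # True - case-insensitive match
```

Nothing about this requires the question to be ‘universal’; the system simply stores whatever prompt the individual chose and only ever compares hashes of the answer.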

Is it more that the systems designer was trying to make his or her job easier?  Sort of fitting the human to the system rather than designing it to the individual’s explicit needs?  Did this save them a few lines of code?

Obviously some human science input into this area is sorely needed.  This raises the question of whether someone who is a computer scientist first and has cross-trained into the human interface is the best person for this role, or someone with a psychology or social science background.
My suggestion is that, in this case, you really need some cross-disciplinary interaction to arrive at an optimal solution.