Back to visual perception. Initially, it may not seem right to focus on one human sense and not discuss the others. We are multifaceted humans. The brain takes advantage of all its senses, when they are available. We’ve evolved with amazing capabilities.
The interesting notion that certain wines taste better when accompanied by certain music is a wonderful example of how interactive our systems can be. That’s without us having any conscious control over their immediate intimate workings. Parts may be hardwired and others soft-wired and adaptable.
Vision plays a dominant part in enabling us to move around. We haven’t yet evolved echo sounding, like bats and dolphins. This is not to say that those who lose vision can’t compensate to some extent, but they don’t fly aircraft or drive fast cars or become astronauts.
My thoughts arise from exposure to several aspects of our dependency on seeing the world around us. To begin with, in the early part of my career, my work was indeed the process of taking sound imaging and making it usable for recognising objects. Converting the information that comes back from sending sound pulses through water into an image must deal with a dynamic environment. Interpretation of such electronic images can be the difference between hitting an object at sea and avoiding it.
Later, my design work concentrated on information presented to a pilot and what happens next. That whole arena of the aircraft cockpit is one big interface. The link between the senses and the decision maker. I’m not straying into the interminable debates about human factors.
Let’s stay with the trend that’s in front of us in every walk of life. That’s the dependence on recognising and acting on information that is presented to us on a nearby screen. As far as I know, humans didn’t evolve with this need to relate acutely to closely presented information so much as to react to distant stimuli. After all, if a hostile animal or dangerously armed person was heading towards me at speed, I wouldn’t sit around debating the subject.
Aeronautics has experience of this shift of attention. At the start of my career aircraft cockpits were mostly knobs and dials. Mechanical indicators and filament bulbs. Sometimes unreliable. Still, the idea of flying by the “seat of the pants” prevailed. That centred around situation awareness, predominantly guided by looking out of the window. At the outside world. Distant vision was equally, if not more, important than looking two feet ahead at a panel. Over the last five decades this has changed radically. Instruments are large flat screens dotted with an array of colourful symbols offering every aspect of “situation awareness”.
Now, this is happening to cars. Most new cars have electronic screens. The expectation is that we humble humans have transitioned from simple mechanical dials to a fascinating world of colourful animated markers and whizzy logos. Despite the glorious technology, the basic function remains the same. That is the link between the senses and the decision maker.
Adequate visual perception remains the number one attribute a pilot or driver is expected to maintain. This continues to be true as automation does more and more. What may be a long-term trend in human evolution is the shift in importance between what’s a couple of feet away and what’s in our surroundings. Will we become less sensitive to a personal experience of what’s more than two feet away? I wonder.