Reverse-engineering the honey bee visual system

by Adrian Horridge

Image caption: The human eye projects the environment through a lens onto the retina as an inverted image, but honey bees have a compound eye composed of many ommatidia, each of which captures light from a small solid angle. Because the honey bee brain is small, the array of small visual fields is not recombined into an internal image. Instead, edges are detected by the six green-sensitive receptors, while the strength of the blue stimulus is measured by the blue-sensitive receptors, of which there is one in each ommatidium. In the green channel, motion and its direction are detected by deeper processing of the responses of pairs of adjacent ommatidia. Bees do not see pattern; they detect simple features, such as signal modulation, the average local orientation of edges, the position of a vertical edge relative to a patch of colour, the width between outer edges, and others, and add together coincidences to make simple cues. The bee brain is not capable of detecting, processing or recognizing anything more, but it can process cues in a sequence over a route. Nor do bees see colour, despite the belief based on the 1914 publication of von Frisch. In RSBS, we showed that bees detect, locate, and measure the blue content of any area that looks coloured, white or grey to humans. Black emits no stimulus, and is not seen.
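
To make the caption's scheme concrete, here is a minimal sketch in Python of the three measurements it describes, for a single row of ommatidia sampled at two successive instants. The function names and the delay-and-correlate motion step (a standard Reichardt-style correlator) are illustrative assumptions, not the bee's actual circuitry.

# Illustrative sketch only: simple cues from one row of ommatidia at two instants.
def green_modulation(green):
    # Edge signal: total modulation (contrast between neighbouring ommatidia).
    return sum(abs(a - b) for a, b in zip(green, green[1:]))

def blue_content(blue):
    # Strength of the blue stimulus summed over the ommatidial array.
    return sum(blue)

def motion_sign(green_t0, green_t1):
    # Direction of motion from pairs of adjacent ommatidia: correlate each
    # receptor's earlier response with its neighbour's later one (assumed
    # Reichardt-style stand-in for the deeper processing named in the caption).
    rightward = sum(a * b for a, b in zip(green_t0[:-1], green_t1[1:]))
    leftward = sum(a * b for a, b in zip(green_t0[1:], green_t1[:-1]))
    return rightward - leftward   # >0 rightward, <0 leftward, ~0 no net motion

# Example: a bright bar shifting one ommatidium to the right between samples.
g0 = [0, 0, 1, 1, 1, 0, 0]
g1 = [0, 0, 0, 1, 1, 1, 0]
b = [0.2] * 7
print(green_modulation(g0), blue_content(b), motion_sign(g0, g1))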

The timing was very important. In 1985, two departments in RSBS were reviewed, and two new departments, Molecular Biology and Evolutionary Biology, were established, but no professors were appointed, for lack of suitable applicants and lack of funds. The intention was to go molecular, but they were too late: there was a world shortage of candidates and no money for an expensive idea. John Shine left his tenured position in RSBS. Next, in 1986, two more departments were reviewed, Behavioural Biology and Neurobiology. The reviewers suggested that collaboration should be encouraged between Neurobiology, Behavioural Biology, and the residue of the JCSMR department left by the departure of Prof Bishop, FRS, who had retired after a distinguished career in the neurophysiology of mammalian vision. Forced by the shortfall in salary money from 1980 onwards, the university's policy was to reduce the number and size of departments and to reduce the power of professors to determine their own agendas. However, there really was common ground among the many people already working on vision in JCSMR, in Neurobiology and Behavioural Biology in RSBS, and in Applied Maths in RS Physics.

In response to these developments, three professors, Levick, Snyder and Horridge, proposed directly to the Vice-Chancellor the establishment of a Centre of Visual Sciences, which would bring together those working on vision in the three Schools. This fitted very well with the recommended policy, though the three Directors did not approve. The proposal was accepted and used by the V-C to obtain funds for a new two-storey building as an extension of RSBS, the smallest of the three Schools. A small annual budget was intended for collaborative ventures. This new development had an electric effect from 1985 onwards. There was a great deal of new collaborative work between RSBS and Physics, but nothing from JCSMR, which decided not to join in but to keep its share of the funds in its own building.

There were several programs in my Department of Neurobiology in RSBS, led by Allan Snyder and Simon Laughlin, with several students and distinguished visitors, on the physics of eyes, sampling, optical resolution, and field size, leading to many papers, a book, and a conference. Laughlin was later elected to the Royal Society as a direct result of that work. Allan Snyder's earlier discovery of long-distance transmission in fibre light guides led to their full-scale use in the intercontinental connections of the World Wide Web. As a result of this work, Allan too was elected to the Royal Society.

My own topic was the measurement of range by insects in flight: how they avoid collision with objects, and how they know where to step out or jump exactly across a small gap. In RSBS, starting with the praying mantis and locusts in 1985, I showed that mantids make a small lateral movement of the head and use the resulting feedback of environmental angular motion to measure the ranges of nearby objects. This was published in 1986. Meanwhile, using the new space and money from the Centre as a lever, I was able to get a tenured appointment for Mandyam Srinivasan, who returned in 1986 from a two-year temporary sojourn in Zurich. Srini brought a Swiss visitor, Miriam Lehrer, who taught us all how to train bees, and she returned every summer. Within a year, we worked out exactly how trained bees in flight measured cues to obtain range. It was angular velocity. It was not the motion of a nearer object against a background, and not the change in pattern at the edge of a moving object (closing or opening parallax). The mechanism was therefore suitable for control of free flight or for stepping across gaps in a visually busy environment, and it had nothing to do with colour or pattern perception. Using their own motion as they flew along, insects detected the optic flow from their surroundings to control piloting and avoid crashing. There was no need to identify things. The practical applications were obvious.
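
The underlying geometry is simple: for an eye translating at a known speed past a stationary object viewed roughly at right angles to the motion, the image sweeps across the retina at angular velocity omega = v / R, so range follows as R = v / omega. Here is a minimal Python sketch of that relation; the names and numbers are illustrative assumptions, not the original apparatus or code.

# Illustrative sketch: range from self-motion speed and measured angular velocity.
def range_from_peering(translation_speed_m_s, angular_velocity_rad_s):
    # Assumes sideways translation with the object roughly perpendicular to
    # the direction of motion, so omega = v / R and therefore R = v / omega.
    if angular_velocity_rad_s <= 0:
        raise ValueError("No image motion means no range estimate.")
    return translation_speed_m_s / angular_velocity_rad_s

# Example: a 0.05 m/s lateral head movement that makes an object drift across
# the eye at 0.5 rad/s implies the object is about 0.1 m away.
print(range_from_peering(0.05, 0.5))  # -> 0.1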

I saw an advertisement in the Canberra Times saying that federal grants were available for collaborative applied projects shared between a company and a university department. Coincidentally, a lady who worked for "Guide Dogs" called on me to satisfy her curiosity about what we were doing on 'vision'. I responded by going to the "Guide Dogs" headquarters in Melbourne, where I gave a lecture on our ideas about using induced relative motion to make gadgets to help people with impaired vision. There I was approached by their electronic technical expert, Dr Tony Heyes, who modified computers to read and produce Braille text. By collaborating, we secured $350,000 for our joint project and very quickly made a thimble with an eye, worn on a finger, that could read the range of surrounding objects. The device was intended to help blind people learn the locations of items in their immediate environment. We coded range as colour on a monitor screen. Like bee vision, it relied on neither recognition of pattern nor of colour.
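
A tiny sketch of what coding range as colour could look like; the particular mapping (near objects red, far objects blue) and the parameter values are assumptions for illustration, not the original device's scheme.

# Illustrative sketch: map a measured range in metres to an (R, G, B) triple.
def range_to_rgb(range_m, max_range_m=5.0):
    t = min(max(range_m / max_range_m, 0.0), 1.0)   # 0 = near, 1 = at/beyond max
    return (int(255 * (1 - t)), 0, int(255 * t))

print(range_to_rgb(0.5))   # nearby object -> mostly red
print(range_to_rgb(4.5))   # distant object -> mostly blue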

My team had made a gadget based on the bee. At the time, however, no interest could be raised in Australia to produce such a device commercially, but after the Chernobyl disaster in 1986, the Japanese government was desperate for the development of robots that could help with disaster recovery after a radioactive incident. A group of Japanese engineers was searching the world for ways to make mobile robots fitted with useful vision. Quite by chance, they arrived at ANU and saw our gadget, but learned nothing about how it worked. Very soon, the Fujitsu Computer Co gave ANU $10 million for our know-how, took it to Japan, and put 20 engineers on an improved version, done their way. They sold a few for mobile vehicles with a computer on board (almost driverless cars), and also used our circuitry, implemented in reverse, to drive virtual-reality machines. When they needed these machines in their own disaster, they were not available.

Very quickly, the US Air Force learned about our gadget and suggested a collaborative scheme to put our system under the belly of seaborne helicopters to assist their landing on heaving ships at sea, and to install it on pilotless aeroplanes to assist landing and avoid crashes of remotely controlled drone aircraft. The insect visual feedback mechanism was extremely simple; its time had arrived; but no one had thought of it. The idea could not be patented. By the time I retired in 1992, Srini was running the whole project with a team supported by large US grants, first from the USAF, then NASA, then DARPA. He eventually replaced me as Professor, was elected to the Royal Society for his work, and later won the Prime Minister's Prize for Science.

All of this was the result of making unexpected discoveries and following them to a new conclusion. None of it came out of a proposal within the university system, and certainly none of it was considered by a grant-giving body. All had to be done without letting out the key new principles we had discovered. We did the basic research at little cost; we made a gadget and the Japanese saw it work, so the cash was offered. By 1992, however, it was hardly a secret; the USA was flying drones piloted by their version of our simple system. When the story emerged, the view was taken that a highly secret project should not be done in Biological Sciences by staff not vetted for security risks. But by 1992 all the science had been done; no further advance along these lines was possible in RSBS unless new ideas appeared, which did not happen. Eventually, about 2008, DSTO took the project to Adelaide, where our ideas, our hardware, and those staff willing to leave disappeared into the Department of Defence. They left behind our expert on helicopter stability, who took a teaching appointment at Duntroon, and our best chip technician, who also went to Duntroon. Srini took up an offer to move to Brisbane.

Research excellence grows out of trust. When politicians talk of innovation as essential to increasing productivity and national wealth, they usually think in terms of guided research, top-down management, limited short-term budgets, progress reports (to whom, and why?), short-term appointments and, especially, research on previously defined topics. A better way is to appoint a few first-class people with the appropriate skills and trust them to build their own team and find their own way forward.
