Saturday, November 30, 2019
NEW EYE TRACKING TECHNIQUES IMPROVE REALISM OF AIRCRAFT SIMULATORS
A simulated flight environment for pilot training may soon be made more realistic through the use of eye-tracking technology developed by researchers at the University of Toronto's Institute of Biomedical Engineering (IBME).

Many safety and cost benefits are obtained by training aircraft pilots under simulated conditions, but to be effective the simulation must be convincingly realistic. At present, training facilities use large domes and gimballed projectors, or an array of video screens, to display computer-generated images. But these installations are very expensive, and image resolution is low. Further, it would take an enormous amount of additional computing power to improve image quality significantly throughout the whole viewed scene.

However, because of the visual properties of the eye, realism can be obtained by providing a high-resolution 'area of interest' insert within a large, low-resolution field of view. If the image-generating computer 'knows' where the pilot's fixation is, it can concentrate the high-resolution image there.
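This 'area of interest' scheme is, in effect, a gaze-contingent (foveated) display. As a rough illustration only, and not the CAE/IBME implementation, the sketch below pastes a small high-resolution patch into a coarsely rendered full-size frame at the reported fixation point. The array sizes, the NumPy compositing, and the name composite_area_of_interest are all invented for the example.

import numpy as np

def composite_area_of_interest(frame_low, patch_high, fixation_xy):
    """Paste a high-resolution 'area of interest' patch into a coarsely
    rendered full-size frame, centred on the pilot's fixation point."""
    frame = frame_low.copy()
    h, w = patch_high.shape[:2]
    fx, fy = fixation_xy
    # Clamp the patch position so the insert stays entirely inside the frame.
    x0 = int(np.clip(fx - w // 2, 0, frame.shape[1] - w))
    y0 = int(np.clip(fy - h // 2, 0, frame.shape[0] - h))
    frame[y0:y0 + h, x0:x0 + w] = patch_high
    return frame

# Example: a 1024x1024 frame rendered with coarse detail, plus a 256x256
# high-detail insert wherever the eye tracker reports the fixation point.
coarse_frame = np.zeros((1024, 1024, 3), dtype=np.uint8)
detail_patch = np.full((256, 256, 3), 255, dtype=np.uint8)
frame = composite_area_of_interest(coarse_frame, detail_patch, (600, 380))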
The technology to make this possible was developed by a research team headed by Professor Richard Frecker and Professor Moshe Eizenman. The work was carried out in collaboration with CAE Electronics Ltd. of Montreal, with financial support from the Natural Sciences and Engineering Research Council of Canada.

Their eye tracker can accurately record and analyze up to 500 eye positions per second. The system works by capturing and processing the reflections of a low-level beam of invisible infra-red light shone onto the eye.

Multi-element arrays capture the image of the eye and digitize the information, which is then processed in real time by a fast, dedicated signal-processing unit. The difference in position between the reflection of the light and the centre of the pupil reveals the instantaneous direction of gaze.

Developments by the IBME team have significantly increased the speed of signal processing and enhanced the accuracy of eye-position estimates. Eizenman believes that "these improvements make our eye tracker very effective in monitoring the large G-force environment, where the pilot tends to make larger eye movements because of constraints which exist on movements of his head".
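As a rough illustration of the pupil-centre/corneal-reflection principle described above, the sketch below maps the pixel offset between the pupil centre and the infra-red glint to approximate gaze angles through a simple linear calibration. The gain values, the coordinates, and the name gaze_angles are invented for the example; an actual tracker would be calibrated for each subject and would run on dedicated signal-processing hardware rather than in Python.

import numpy as np

def gaze_angles(pupil_centre, corneal_reflection, gain_deg_per_px=(0.05, 0.05)):
    """Convert the pixel offset between the pupil centre and the infra-red
    corneal reflection (glint) into approximate gaze angles in degrees,
    using a simple linear calibration."""
    dx = pupil_centre[0] - corneal_reflection[0]   # horizontal offset, pixels
    dy = pupil_centre[1] - corneal_reflection[1]   # vertical offset, pixels
    return np.array([gain_deg_per_px[0] * dx, gain_deg_per_px[1] * dy])

# One digitized frame from the imaging array: pupil centre and glint located
# at these (arbitrary) pixel coordinates.
azimuth, elevation = gaze_angles(pupil_centre=(318, 242),
                                 corneal_reflection=(305, 250))
print(f"gaze angles (deg): azimuth={azimuth:+.1f}, elevation={elevation:+.1f}")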
In a new generation of aircraft simulators under development by CAE Electronics, a head tracker that tells the direction of the pilot's head is mounted on top of the helmet. The eye tracker is mounted on the front of the helmet and tells exactly where the pilot's eye is fixating.

Frecker said that "successful integration of our eye tracker into the novel helmet-mounted CAE flight simulator would result in a new generation of simulators that would likely replace the current large domes and cumbersome video display units."

Initial tests of the integrated system will be carried out in collaboration with CAE Electronics at Williams Air Force Base in Arizona later this year.

Contact:
Moshe Eizenman (416) 978-5523
Richard Frecker (416) 978-2236