Accurate and efficient measurement of visual function is difficult in infants and young children because of limited cooperation, the inability to provide verbal responses, and the lack of efficient behavioral methods. This matters in both clinical and research contexts, where the detection and treatment of eye conditions in infancy depend on measuring visual function. Visual deprivation in infancy disrupts normal visual development and affects multiple visual functions that underpin brain-based, visually guided behavior in everyday life, such as contrast sensitivity, motion perception, contour integration, and face recognition. At present there are no reliable, automated, objective methods for measuring visual functions in infants and young children below the age of 3 years.
This new project will address these limitations. It involves developing an application with a suite of visual stimuli that probe multiple visual functions. The application will combine an adaptive staircase procedure with a preferential-looking behavioral paradigm and eye tracking. It will measure the sensory threshold of each visual function as well as the response path to that threshold (for example, measures of uncertainty), providing additional indicators that together create an individual, disease-specific profile of visual loss. The application can then be extended to establish visual-function norms, profile visual loss to enable targeted intervention, and monitor the effects of treatment. A minimal sketch of the core adaptive-staircase loop is given below.
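To make the core loop concrete, the sketch below pairs a simple 2-down/1-up staircase with a simulated preferential-looking trial. This is an illustration only: the 2-down/1-up rule, the contrast grid, and the `present_trial` placeholder (standing in for stimulus presentation plus an eye-tracking readout of looking direction) are assumptions for this sketch, not the project's chosen design.

```python
# Minimal sketch: a 2-down/1-up adaptive staircase driving a
# preferential-looking trial. All names (Staircase, present_trial, the
# contrast grid) are hypothetical placeholders, not project code.
import random
from dataclasses import dataclass, field


@dataclass
class Staircase:
    """2-down/1-up staircase over a discrete grid of stimulus levels."""
    levels: list                     # e.g. contrast values, easiest first
    index: int = 0                   # current position in `levels`
    correct_streak: int = 0
    reversals: list = field(default_factory=list)
    history: list = field(default_factory=list)   # (level, correct) per trial
    _last_direction: int = 0         # +1 = made harder, -1 = made easier

    def current_level(self):
        return self.levels[self.index]

    def update(self, correct: bool):
        """Record one trial outcome and step the staircase."""
        self.history.append((self.current_level(), correct))
        if correct:
            self.correct_streak += 1
            if self.correct_streak >= 2:          # two correct -> harder
                self.correct_streak = 0
                self._step(direction=+1)
        else:
            self.correct_streak = 0               # one wrong -> easier
            self._step(direction=-1)

    def _step(self, direction: int):
        # A change of direction counts as a reversal, used for thresholding.
        if self._last_direction and direction != self._last_direction:
            self.reversals.append(self.current_level())
        self._last_direction = direction
        self.index = max(0, min(len(self.levels) - 1, self.index + direction))

    def threshold(self, n_last: int = 4):
        """Estimate threshold as the mean of the last few reversal levels."""
        tail = self.reversals[-n_last:] or [self.current_level()]
        return sum(tail) / len(tail)


def present_trial(level):
    """Placeholder for one preferential-looking trial: show the stimulus on a
    random side, read gaze from the eye tracker, and return True if the infant
    looked toward the stimulus side. Simulated here with a toy observer."""
    return random.random() < (0.5 + 0.5 * level)


if __name__ == "__main__":
    contrasts = [0.8, 0.4, 0.2, 0.1, 0.05, 0.025]   # easiest -> hardest
    stair = Staircase(levels=contrasts)
    for _ in range(40):                              # fixed trial budget
        stair.update(present_trial(stair.current_level()))
    print("Estimated contrast threshold:", stair.threshold())
```

In the real application, the per-trial history kept in `history` is what would feed the "response path" indicators described above, alongside the final threshold estimate.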
Planned effort: 350 hours
Skill level: Intermediate/Advanced
Pre-requisite skills: Comfortable with at least one programming language (e.g. Python). Experience with app development and basic image/video processing. Ideally, comfortable with Android/iOS app development.
Mentors: Arvind Chandna @arvindchandna (lead) and Suresh Krishna @suresh.krishna (co-mentor)
Tech keywords: App development, iOS/Android, Python/equivalent