GSoC 2023 Project Idea 4.2 Virtual Reality for Distributed Research (Orthogonal Research and Education Lab) (175 h)

Virtual and Augmented Reality (collectively known as XR) is a perpetually emerging area at the intersection of cognitive neuroscience and computing. Tools introduced in the past few years have made this technology accessible to a wide range of applications. This project will focus on building applications for research or science communication. The applicant will propose a project based on one of the following templates.

● Replicate a famous experimental paradigm (e.g., the Morris Water Maze) in which the subject can interact with both the environment and stimuli. The environment should allow participant data to be collected and should exhibit a high degree of environmental realism.

● Represent scientific knowledge or biological processes in a novel way. Possible applications include a navigable brain atlas, an immersive version of the OpenWorm browser, or an animation that displays a biological process from multiple points of view.

● Visualize agent-based models (ABMs) from both first-person and third-person perspectives. This involves integrating simulations from the NetLogo platform with a 3-D physics and graphics engine.
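To make the integration idea concrete, here is a minimal sketch of the bridge pattern in Python: a toy stand-in for a NetLogo tick (random-walk agents) whose per-tick positions are serialized as JSON for a 3-D engine to consume. The agent model, field names, and JSON layout are all illustrative assumptions; a real project would pull coordinates from NetLogo itself (e.g., via a Python-NetLogo bridge) rather than simulate them here.

```python
import json
import random

def step(agents, step_size=1.0):
    """Advance every agent one tick with a random 2-D step.
    Stand-in for one NetLogo tick; a real bridge would query NetLogo."""
    for a in agents:
        a["x"] += random.uniform(-step_size, step_size)
        a["y"] += random.uniform(-step_size, step_size)
    return agents

def export_frame(agents, tick):
    """Serialize one tick of agent positions as JSON.
    A Unity (or other engine) script could deserialize this to
    place and move game objects each frame."""
    return json.dumps({"tick": tick, "agents": agents})

# Five agents starting at the origin, advanced for ten ticks.
agents = [{"id": i, "x": 0.0, "y": 0.0} for i in range(5)]
frames = [export_frame(step(agents), t) for t in range(10)]
```

On the engine side, each frame would be parsed and mapped onto object transforms, which is what makes both the third-person (overview camera) and first-person (camera attached to one agent) perspectives possible from the same data stream.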

The outcome of your project should be an XR application published under an open-source license that runs in a widely available format (360 video, smartphone headset, or Meta Quest). This will allow learners and researchers from around the world to access and interact with these models. We would also prefer projects that support haptic feedback and capture movement inputs.

What can I do before GSoC?

You can join the Orthogonal Lab Slack and GitHub, as well as attend our Saturday Morning NeuroSim meetings. You will work within our Principles of Bits to Matter to Mind and PhASiC initiatives, but interaction with your colleagues across the organization is key. You will also want to become familiar with a scientific programming language (such as Python or Julia) for constructing your cybernetic model, as well as with the NetLogo platform for building agent-based models.
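As a warm-up for the "cybernetic model" mentioned above, here is a minimal illustrative sketch in Python of the simplest cybernetic building block, a negative-feedback loop that nudges a state toward a setpoint each tick. The function names, setpoint, and gain are illustrative choices, not lab conventions; the point is only the loop structure you would later wire to NetLogo agents or a VR environment.

```python
def feedback_step(state, setpoint, gain=0.5):
    """Negative feedback: move `state` toward `setpoint`
    in proportion to the current error."""
    error = setpoint - state
    return state + gain * error

# Iterate the loop: the error shrinks geometrically each tick,
# so the state converges on the setpoint.
state = 0.0
trajectory = [state]
for _ in range(20):
    state = feedback_step(state, setpoint=10.0)
    trajectory.append(state)
```

The same loop shape generalizes: replace the scalar state with an agent's position or an environmental variable, and the controller becomes the part of your model that reacts to participant input.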

Orthogonal Research and Education Lab:

Skill level: Beginner/intermediate

Required skills: Knowledge of either Unity or Blender is essential, along with the ability to convert models into formats such as 360 video or Meta Quest applications. A willingness to embrace open-source development practices and an interest in interdisciplinary research are also important.

Time commitment: Half-time (175 h)

Lead mentors: Bradly Alicea, Jesse Parent

Project website:

Backup mentors: TBA

Tech keywords: Virtual Reality, Simulation, Blender/Unity, 3-D Graphics

Hey! @arnab1896, @b.alicea, @jparent
I'd be super interested in this project. I know the AR/VR domain will be a new experience for me, since I haven't made any significant contributions in it yet, but I can assure you I'll learn everything needed to make the project work.
I have basic skills in Python, Unity 3D, Blender, and Unreal Engine. I'm pursuing a Bachelor's in Data Science and Programming, so I'm open to working across domains and love creating new interactive things.

Just thought I’d give my official shoutout of interest here - also happy to talk on Slack!

Thanks for the message. Talk to you in Slack!

Hey, I’m Manish, an open-source enthusiast and web developer. I have experience with Unity and some AR/VR tools, and I’d love to brush up and build on those skills while working on such a great project. Looking forward to contributing and helping make it a success!

Hello @b.alicea,
I am Chandraprakash, a React developer interested in AR/VR/XR. I would like to contribute to this project and learn about AR/XR along the way. I am very passionate about learning more and expanding my skills.
I would love to contribute and learn more.

Thanks for your interest. Please join our lab Slack to get onboarded.

Please join our lab Slack to start interacting with our group!