Mentors: Alberto Acquilino <alberto.acquilino@mail.mcgill.ca>, Mirko d’Andrea <mirko.dandrea@gmail.com>, and Keerthi Reddy Kambham <reddykkeerthi@gmail.com>
Skill level: Advanced
Required skills: Proficiency in Python and JavaScript (experience with Angular and Ionic is a plus); familiarity with generative AI frameworks (e.g., OpenAI GPT, Hugging Face); experience with web development and UI/UX design; knowledge of audio processing libraries, such as Essentia (optional); basic understanding of music theory and notation (optional).
Time Commitment: Full time (350 hours)
About: HarmonyHub is an ongoing open-source project that aims to make music education more accessible by developing a modular, web-based learning platform. By integrating generative AI, HarmonyHub will empower music educators to create adaptive, personalized learning experiences for students, making music education more inclusive, diverse, and effective.
The platform is built with the Ionic framework and Angular, enabling a hybrid app that runs directly in the browser or can be deployed as a native mobile app on iOS and Android. Its modular architecture of reusable components and services simplifies customization and scaling.
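As a rough illustration of this component-based setup, the sketch below shows a minimal Ionic/Angular standalone component that renders an exercise. The component name, the `Exercise` interface, and its fields are illustrative assumptions and not taken from the actual HarmonyHub codebase.

```typescript
// Illustrative sketch only: a reusable exercise card in the Ionic + Angular
// style the platform uses. ExerciseCardComponent and the Exercise interface
// are hypothetical names, not part of the HarmonyHub repository.
import { Component, Input } from '@angular/core';
import { CommonModule } from '@angular/common';
import { IonicModule } from '@ionic/angular';

interface Exercise {
  title: string;
  tempoBpm: number;
  notes: string[]; // e.g. ['C4', 'D4', 'E4']
}

@Component({
  selector: 'app-exercise-card',
  standalone: true,
  imports: [CommonModule, IonicModule],
  template: `
    <ion-card>
      <ion-card-header>
        <ion-card-title>{{ exercise.title }}</ion-card-title>
        <ion-card-subtitle>{{ exercise.tempoBpm }} BPM</ion-card-subtitle>
      </ion-card-header>
      <ion-card-content>
        <ion-chip *ngFor="let note of exercise.notes">{{ note }}</ion-chip>
      </ion-card-content>
    </ion-card>
  `,
})
export class ExerciseCardComponent {
  @Input() exercise!: Exercise;
}
```

Because the component is standalone, it can be dropped into any page of the hybrid app and reused across browser and mobile builds without extra module wiring.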
Generative AI will be used to:
- Assist music teachers in defining exercises for their students (see the sketch below)
- Enable a code-free interface for creating and compiling custom web applications for music education
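As a rough sketch of the first use case, the snippet below asks an LLM to draft a short practice exercise as structured JSON using the OpenAI Node SDK. The model name, prompt wording, and exercise schema are illustrative assumptions, not part of the existing project.

```typescript
// Hypothetical sketch: generate a practice exercise as structured JSON with
// the OpenAI Node SDK. Model, prompt, and schema are assumptions.
import OpenAI from 'openai';

interface GeneratedExercise {
  title: string;
  tempoBpm: number;
  notes: string[]; // e.g. ['C4', 'D4', 'E4']
}

async function draftExercise(level: string, instrument: string): Promise<GeneratedExercise> {
  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const completion = await client.chat.completions.create({
    model: 'gpt-4o-mini', // assumed model; any chat-capable model would do
    response_format: { type: 'json_object' },
    messages: [
      {
        role: 'system',
        content:
          'You are a music teacher. Reply only with JSON of the form ' +
          '{"title": string, "tempoBpm": number, "notes": string[]}.',
      },
      { role: 'user', content: `Create a short ${level} exercise for ${instrument}.` },
    ],
  });
  return JSON.parse(completion.choices[0].message.content ?? '{}') as GeneratedExercise;
}

// Example use: draftExercise('beginner', 'trumpet').then(console.log);
```

Returning structured JSON rather than free text is what would let a code-free builder validate the output and render it directly as an in-app exercise.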
Aims: This project envisions integrating large language models (LLMs) and generative AI to empower music educators with tools for dynamic, adaptive learning. This will enhance student engagement, improve learning outcomes, and allow educators to tailor instruction to individual needs. The project will also contribute to the broader field of adaptive learning by demonstrating how generative AI can be applied to domains with complex, non-textual data such as music.
Objectives:
- Extend HarmonyHub to incorporate Generative AI for adaptive learning, focusing on personalized exercises tailored to each student's technical level, age, and pace of progression
- Develop a user-friendly, code-free interface for teachers to customize and deploy exercises
- Leverage Generative AI to analyze and enhance existing exercises, making recommendations for refinement based on student performance
- Enhance inclusivity and diversity in music education by providing tailored learning experiences for students from varied backgrounds and abilities
What can I do before GSoC?
- Familiarize yourself with the existing HarmonyHub codebase and its modular architecture
- Explore generative AI frameworks (e.g., OpenAI GPT, Hugging Face) and their applications in adaptive learning
- Experiment with audio processing libraries (e.g., Essentia, MIDI.js) to understand their potential for music education (see the sketch below)
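Even before picking a dedicated library, a few lines of the standard Web Audio API are enough to play notes from MIDI numbers and get a feel for pitch handling. The sketch below is a simple smoke test; the arpeggio, gain, and timing values are arbitrary choices, not project code.

```typescript
// Browser-only sketch using the standard Web Audio API (no external library).
// Converts MIDI note numbers to frequencies and plays a short arpeggio.
const midiToHz = (note: number): number => 440 * Math.pow(2, (note - 69) / 12);

function playNote(ctx: AudioContext, midiNote: number, start: number, duration: number): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = midiToHz(midiNote);
  gain.gain.setValueAtTime(0.2, ctx.currentTime + start);
  gain.gain.linearRampToValueAtTime(0, ctx.currentTime + start + duration);
  osc.connect(gain).connect(ctx.destination);
  osc.start(ctx.currentTime + start);
  osc.stop(ctx.currentTime + start + duration);
}

// Play a C major arpeggio (C4, E4, G4) as a quick smoke test.
const ctx = new AudioContext();
[60, 64, 67].forEach((note, i) => playNote(ctx, note, i * 0.5, 0.45));
```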
Websites: Music Education Interface | Alberto Acquilino; GitHub: albertoacquilino/music-education-interface-ionic
Tech keywords: Generative AI, Adaptive Learning, Music Education, Angular, Ionic, Python, JavaScript, Audio Processing, MIDI