Making their marks in medicine
Two Faculty of Arts and Science students earned top honours in the recent Next Generation of Medical Simulation Hackathon. During the event, students were tasked with designing innovative solutions to enhance professional skills competencies in real-world healthcare scenarios.
Siobhan Lee (Biology) and Akshay Laxman Desale (School of Computing) took those honours for their respective projects. Lee was joined by team members Spencer Osborn (Smith Engineering), Coleman Farvolden (Smith Engineering), and Ryan McGillivray (Smith Engineering). Laxman Desale, whose project was called Med Vue, was joined by Brian Pereira (Smith Engineering) and Imogen Lawford-Wickman (Smith Engineering).
In the Challenge Statement, participants were encouraged to explore intelligent, innovative approaches, technologies, and methodologies to create designs that closely mimic the challenges faced by healthcare professionals in leadership roles and task-oriented environments.
Desale’s project, Med Vue, is a cutting-edge, two-tiered system engineered to quantitatively evaluate the stress and anxiety surgeons experience during simulated surgical procedures.
“Stage 1 involves the real-time monitoring of a surgeon's physiological markers, including heart rate variability and skin conductance, through advanced wearable technologies,” Desale explains. “Stage 2 introduces an innovative application of computer vision and machine learning algorithms to track the surgeon's hand movements and eye patterns meticulously.”
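The article does not describe how Med Vue computes its measurements, but as a purely illustrative sketch of the Stage 1 idea, the Python snippet below summarizes heart rate variability with the standard RMSSD statistic and combines it with a normalized skin-conductance reading into a simple composite score. All function names, baselines, and weightings here are hypothetical assumptions, not the team's actual method.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats (ms).
    Lower values generally correspond to suppressed heart rate variability."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_score(rr_intervals_ms, skin_conductance_uS,
                 rmssd_baseline_ms=50.0, sc_baseline_uS=5.0):
    """Toy composite score in [0, 1]: suppressed HRV and elevated skin
    conductance both push the score upward. Baselines and weights are
    illustrative placeholders, not clinically validated values."""
    hrv_component = max(0.0, 1.0 - rmssd(rr_intervals_ms) / rmssd_baseline_ms)
    sc_component = min(1.0, skin_conductance_uS / (2.0 * sc_baseline_uS))
    return 0.5 * hrv_component + 0.5 * sc_component

# Example: simulated beat-to-beat intervals (ms) and a conductance reading (microsiemens)
print(round(stress_score([812, 790, 805, 798, 820, 801], 7.4), 2))
```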
Desale adds that presenting a fully functional model of Med Vue played a crucial role in the pitch, allowing the team to demonstrate the system's practicality and effectiveness in real time and making its benefits and applications immediately apparent to the judges.
“Following the successful presentation and the positive reception of Med Vue, we are eager to propel our project into its next phase of development,” Desale said. “We have expressed our interest in collaborating with the Queen's Incubation Centre, a move that promises not only to refine and enhance our existing system but also to explore potential expansions and applications within the surgical training domain.”
Lee’s project, the NeuroNeck, focused on training healthcare professionals to manage spinal injuries while keeping the spine steady and supported.
“It is especially difficult to train for these scenarios because there is no real-time feedback when the neck or head is moved,” Lee says. “The device uses sensors placed on the interior of a neck brace along with algorithms to give real-time, specific feedback on movement that could not otherwise be picked up visually. Three LEDs coloured green, yellow and red are placed on the front of the brace, indicating to the person treating the spinal injury how well they're keeping the neck steady.”
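NeuroNeck's software is not detailed in the article, but the feedback logic Lee describes, sensor readings mapped to a green, yellow, or red LED, could in principle look something like the minimal Python sketch below. The thresholds and names are made-up assumptions for illustration, not the team's calibrated values.

```python
def led_colour(deviation_degrees, steady_limit=2.0, caution_limit=5.0):
    """Map how far the neck has moved from its initial position to one of
    the three LED colours described above. Thresholds are placeholders."""
    if deviation_degrees <= steady_limit:
        return "green"    # neck held steady
    if deviation_degrees <= caution_limit:
        return "yellow"   # minor movement detected
    return "red"          # excessive movement

# Example: a stream of angular deviations (degrees) from brace-mounted sensors
for reading in [0.8, 1.9, 3.4, 6.1]:
    print(reading, "->", led_colour(reading))
```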
Lee explains that the project's uniqueness and the team dynamic are what made the NeuroNeck successful. “We were set on creating a device that was not already on the market and I can confidently say we succeeded at that. Most of the other projects focused more on the VR and AI aspects of medical simulation, making our project stand out. We also provided a live demonstration on top of the video demonstrations with Queen's First Aid which really caught the judges' attention.”
Ingenuity Labs, in collaboration with Connected Minds, partnered with Experience Ventures to offer the Next Generation of Medical Simulation Hackathon.