
Enhanced Learning Augmented Reality Architecture

A student-designed Augmented Reality headset developed to visualise human anatomy and other complex systems, advancing innovation in medicine, education, engineering design, and manufacturing.

  • The current, near-final iteration of the headset design, showing the front-facing components.

  • One member of UCXR talks about his sub-project and experience in relation to the larger project.

  • The first design of the headset front enclosure along with the early stage computer and sensors.

  • The second iteration: reshaped for better wearability, with new primary components.

  • Back enclosure and components in the current design, with component modularity and cooling accounted for.

  • Full current design, featuring a rechargeable, pocket-sized battery connected to the headset's computer.

What it does

The ELARA headset is an AR device for real-time 3D visualization and interaction, designed to make technical education more immersive, accessible, and engaging for students and professionals in medicine, engineering, manufacturing, and beyond.


Your inspiration

We were inspired by how inaccessible and passive most technical learning tools are, especially in fields like medicine and engineering. Our team wanted to make anatomy, medical operations and other complex systems more tangible and interactive. As engineering students, we recognized augmented reality as a powerful way to bring dynamic, real-time content into physical environments. Drawing from our experience with static visuals and a passion for creating practical systems, we integrated custom hardware and live visuals to enable learners and professionals to visualise and interact with complex subjects in an intuitive and engaging way.


How it works

The headset features a custom housing that integrates both custom and commercial components. In the front-facing unit, a depth camera and motion tracker capture the user's movement and surroundings in real time, while the optical module projects content into the user's view. The rear housing holds a powerful compute module, our custom carrier board for component–computer integration, and a custom passive cooling assembly engineered to match commercial active-cooling solutions. Visual SLAM algorithms fuse the camera and tracker data to accurately anchor virtual elements in physical space. A battery provides roughly two hours of use on a single charge, supported by our custom charger. Users interact via hand gestures, voice commands, and, tentatively, brainwave controls. The software applications are developed in-house, with some algorithms and assets adapted from open-source frameworks. The system supports hands-on learning, simulation, and technical design.
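The anchoring step can be illustrated with a short sketch. This is not the team's actual pipeline; the pose values, function names, and the single-axis rotation below are illustrative assumptions. The idea is that a visual SLAM system reports the headset's camera-to-world pose every frame, and each anchored virtual object is re-expressed in the camera frame before rendering, so it appears fixed in physical space as the wearer moves:

```python
import math

def make_pose(yaw, tx, ty, tz):
    """Homogeneous 4x4 camera-to-world pose: a rotation about the
    vertical (y) axis by `yaw` radians plus a translation."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,   0.0, s,   tx],
        [0.0, 1.0, 0.0, ty],
        [-s,  0.0, c,   tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def invert_pose(T):
    """Invert a rigid transform: rotation becomes R^T,
    translation becomes -R^T t."""
    R = [row[:3] for row in T[:3]]
    t = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    mt = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[i] + [mt[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Apply a 4x4 rigid transform to a 3D point."""
    x, y, z = p
    return tuple(T[i][0]*x + T[i][1]*y + T[i][2]*z + T[i][3] for i in range(3))

# A virtual anatomy model anchored 1 m in front of the world origin.
anchor_world = (0.0, 0.0, 1.0)

# SLAM reports the headset pose each frame; here the wearer has
# stepped 0.5 m to the right without turning.
camera_to_world = make_pose(0.0, 0.5, 0.0, 0.0)

# Render position = the anchor expressed in the camera frame.
anchor_in_camera = apply(invert_pose(camera_to_world), anchor_world)
```

Because the anchor is stored in world coordinates and only the camera pose changes, the model appears to stay put: after the sideways step, its camera-frame position shifts the opposite way, exactly as a real object would.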


Design process

The initial concept (image 2): The headset originally used dual FLCoS projectors, later replaced by a commercial high-resolution optical module. A Raspberry Pi Zero W processed data from three sensors—an IMU (motion), a Time-of-Flight sensor (depth), and a Leap Motion 2 (hand tracking)—and ran the AR applications. Components were housed in two custom 3D-printed enclosures (front and back), a layout that remains unchanged.

Design two (image 3): We upgraded from the Pi to the Vim3, then to the LattePanda 3 Delta to meet demands for greater computing power and more suitable ports. The enclosure was reshaped for better component placement, a closer fit to the user's face, and modular comfort features such as nose and head padding.

Design three (images 1, 3, 5): We moved to the LattePanda Mu, a system-on-module (SoM) that required us to design a custom carrier board, giving us full control over layout, ports, and modularity. It is our smallest and most capable solution yet. We added the Intel RealSense D435i for precise 3D mapping and designed a vapor chamber–heat sink assembly for thermal management. A rechargeable battery system is also in development. The ToF sensor and IMU are no longer primary components but serve as backups to the D435i and Leap Motion. This design is now in the manufacturing phase.


How it is different

UCXR is not a general-purpose AR headset. Unlike devices such as the Apple Vision Pro, Microsoft HoloLens, or Meta Quest (designed for entertainment and broad productivity), UCXR is purpose-built by students for hands-on training in medicine, science, and engineering. It prioritizes technical accuracy and educational value over consumer convenience. While commercial headsets focus on media, remote meetings, or gaming, UCXR enables users to interact with detailed anatomical models, chemical processes, and mechanical systems using gesture control and real-world mapping. It also supports experimental features like brainwave-based input. Built with in-house electronics, custom interaction tools, and high-quality optics, UCXR delivers a clear and responsive experience. In Calgary, where industry and student teams often center on oil and gas or aviation, UCXR stands out as a rare student-led initiative in consumer electronics and advanced computing.


Future plans

Our first headset is set to be completed this fall, with an initial showcase planned for the University of Calgary's Clubs Week. This event will allow us to gather feedback from students, faculty, and potential collaborators across disciplines. Following that, we will focus on improving user interaction and ergonomics. Our goal is to develop a commercial-grade headset that performs on par with existing alternatives on the market. As the project grows, we aim to scale the team, build interdisciplinary partnerships, and explore real-world pilot programs in academic and technical training environments.


Awards

In 2024, we won the award for the 'Most Innovative Club' in the engineering faculty. We were chosen by Laval Virtual to attend the world’s largest mixed reality conference in Laval, France. We are also sponsored by the following companies: Altium, SolidWorks, Kenesto, LattePanda, and NeuroPawn.

