Navigation Helper Helmet

Team name: Eye of the future

In this capstone project, we plan to develop a helmet/eyeglass that assists visually impaired people to perceive their surrounding environment and navigate more easily.

Client: Abolfazl Razi
GTA: Han Peng: hp263@nau.edu


Latest feature updates:


⦁ Feature update on April 22, 2021
User interface: We have added a face-sample collection function to the user interface. It lets users capture new face recognition samples and store them by category.
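A minimal sketch of how the per-category sample storage could work. The function name, folder layout, and file naming are hypothetical, not the project's actual implementation; the real UI would pass captured camera frames in as image bytes.

```python
import os

def save_face_sample(base_dir, person_name, image_bytes):
    """Store one captured face sample under a per-person category folder."""
    person_dir = os.path.join(base_dir, person_name)
    os.makedirs(person_dir, exist_ok=True)
    # Number samples sequentially so each person keeps a growing set.
    index = len(os.listdir(person_dir)) + 1
    path = os.path.join(person_dir, f"sample_{index:03d}.jpg")
    with open(path, "wb") as f:
        f.write(image_bytes)
    return path
```

Storing one folder per person keeps the later recognition step simple: the folder name doubles as the identity label for training.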

User interface: We have added new voice packs to the user interface; for example, it can now announce that the motor has been reset.

Helmet: We rebuilt the helmet using acrylic sheets.



⦁ Feature update on March 11, 2021
Rotation system: The stepper motor drives the camera and LIDAR to rotate together, sweeping left and right over a range of -90 to +90 degrees.
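The left-right sweep within the -90 to +90 degree range can be sketched as a generator of target angles. This is an illustrative sketch, not the project's motor driver; the step size is an assumed parameter, and a real driver would translate each angle into stepper pulses.

```python
def sweep_angles(step=15, limit=90):
    """Yield successive target angles for the left-right sweep.

    Swings between -limit and +limit degrees indefinitely,
    reversing direction at each end of the range.
    """
    angle, direction = 0, 1
    while True:
        yield angle
        angle += direction * step
        if angle >= limit or angle <= -limit:
            direction = -direction
            angle = max(-limit, min(limit, angle))
```

A smaller `step` gives a finer scan at the cost of a slower sweep, which matches the motor-speed setting exposed in the user interface.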

LIDAR system: Detects distances from 0 to 400 cm, with an error of no more than 5 cm.
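Given the 0-400 cm rated range and the 5 cm error bound, raw readings can be sanity-checked before use. The function below is a hypothetical sketch of that filtering step, not the project's code: readings within the error bound of the limits are clamped, anything further out is rejected as noise.

```python
def filter_reading(distance_cm, max_range=400, tolerance=5):
    """Clamp a raw LIDAR reading into the sensor's rated 0-400 cm window.

    Values within the stated 5 cm error bound of the limits are clamped;
    anything further out is treated as noise and rejected (None).
    """
    if distance_cm < -tolerance or distance_cm > max_range + tolerance:
        return None
    return max(0, min(max_range, distance_cm))
```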

Voice prompt system: It announces the type of each detected object and its distance and direction relative to the user. When the detected object is a person whose face samples have been collected, the voice prompt states that person's identity; if the person is not in the records, the user is told that the object is an unknown person.
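How such an announcement could be assembled from the detector output is sketched below. All names and thresholds are assumptions for illustration: `label` is the detector's object class, `angle_deg` is the stepper angle (-90 left to +90 right), and `known_name` is the recognized identity, if any, from the collected face samples.

```python
def describe_detection(label, distance_cm, angle_deg, known_name=None):
    """Build the sentence the voice prompt system would speak."""
    # Map the stepper angle to a coarse direction for the user.
    if angle_deg < -15:
        direction = "to your left"
    elif angle_deg > 15:
        direction = "to your right"
    else:
        direction = "ahead of you"
    if label == "person":
        # Collected samples let us name acquaintances; others are unknown.
        who = known_name or "an unknown person"
        return f"{who}, {distance_cm} centimeters {direction}"
    return f"a {label}, {distance_cm} centimeters {direction}"
```

The resulting string would then be handed to the text-to-speech voice pack for playback.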

User interface: Users can now easily adjust the LIDAR detection range and the motor speed, and turn video recording and motor rotation on or off.
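The user-adjustable settings can be grouped into a small container like the one below. Field names and defaults are hypothetical; the only project-stated constraint is that the LIDAR cutoff must stay inside the sensor's 0-400 cm rated window.

```python
from dataclasses import dataclass

@dataclass
class HelmetSettings:
    """Settings exposed by the user interface (hypothetical names)."""
    lidar_range_cm: int = 400   # detection cutoff, 0-400 cm
    motor_speed: int = 50       # stepper speed as a percentage
    record_video: bool = False
    motor_rotation: bool = True

    def set_lidar_range(self, cm):
        # Keep the cutoff inside the sensor's rated window.
        self.lidar_range_cm = max(0, min(400, cm))
```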



⦁ Feature update on January 11, 2021
Face recognition: It can now distinguish strangers from acquaintances. We added recognition of Jingwei Yang, Junlin Hai, and Bo Sun as familiar people.
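One common way to make the stranger/acquaintance decision is nearest-neighbor matching on face embeddings with a distance threshold; the sketch below assumes that approach, since the report does not state the actual method. `known` maps names (e.g. "Jingwei Yang") to reference embeddings, and the threshold value is illustrative.

```python
import math

def identify_face(embedding, known, threshold=0.6):
    """Label a face embedding as a known acquaintance or a stranger.

    Finds the closest reference embedding; if even the best match
    is farther than `threshold`, the face is reported as a stranger.
    """
    best_name, best_dist = None, float("inf")
    for name, ref in known.items():
        dist = math.dist(embedding, ref)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else "stranger"
```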

Object recognition: Many kinds of objects can be recognized, such as tables, chairs, displays, water bottles, trees, cars, and bicycles.

Voice prompt system: The voice prompt system can now announce Junlin Hai's and Jingwei Yang's names.