Show simple item record

dc.contributor.advisor Halawani, Alaa
dc.contributor.author Ebido, Basel
dc.contributor.author Aburish, Omar
dc.contributor.author Abuisneneh, Samir
dc.date.accessioned 2024-08-25T09:26:34Z
dc.date.available 2024-08-25T09:26:34Z
dc.date.issued 2024-04-01
dc.identifier.uri scholar.ppu.edu/handle/123456789/9116
dc.description Number of pages: 54; Information Security 1/2024
dc.description.abstract EmoSense is a technology-driven solution designed to help visually impaired individuals better understand the emotions of those around them. Social interactions can be challenging for people with visual impairments: emotions play a crucial role in how people communicate, and perceiving the emotional states of others is especially difficult without sight. EmoSense therefore leverages artificial intelligence to provide haptic or audio feedback about the emotions of the person the wearer is interacting with. The system consists of a motor-driven, head-mounted camera and an AI pipeline that detects human speech, then tracks, captures, and analyzes the facial expressions of individuals in the wearer's surroundings. Each detected expression is classified into one of five categories: happiness, sadness, anger, surprise, and neutral. This information is delivered as haptic vibration or audio feedback that the wearer can use to navigate social interactions more effectively. EmoSense represents a significant step toward improving the quality of life of visually impaired individuals, enabling them to participate more fully in social events and interactions.
Keywords: real-time emotion recognition for the visually impaired; Raspberry Pi; webcam; vibration motor; servo motor; 2-Axis Pan and Tilt Mount Kit; microphone; Bluetooth earphones; OpenCV; machine learning; CNN; facial detection; haptic/audio feedback. en_US
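The abstract describes a pipeline that classifies a face into one of five emotion categories and then delivers the result to the wearer as either a haptic vibration or an audio cue. A minimal sketch of that final feedback-selection step is shown below; the function name, the vibration patterns, and the audio phrasing are illustrative assumptions, not the project's actual implementation (only the five category names and the two feedback modes come from the abstract).

```python
# Hypothetical sketch of EmoSense's feedback step: mapping a detected
# emotion category to either a haptic vibration pattern or an audio phrase.
# Pattern values (pulse count, pulse duration in ms) are invented for
# illustration; a distinct pattern per emotion lets the wearer tell
# categories apart by touch alone.

VIBRATION_PATTERNS = {
    "happiness": (1, 200),
    "sadness":   (2, 200),
    "anger":     (3, 200),
    "surprise":  (4, 200),
    "neutral":   (1, 500),
}

def feedback_for(emotion: str, mode: str = "haptic"):
    """Return the cue the wearer receives for a detected emotion.

    mode="haptic" -> (pulse_count, pulse_ms) tuple for a vibration motor
    mode="audio"  -> short sentence to play through Bluetooth earphones
    """
    if emotion not in VIBRATION_PATTERNS:
        raise ValueError(f"unknown emotion: {emotion}")
    if mode == "haptic":
        return VIBRATION_PATTERNS[emotion]
    if mode == "audio":
        return f"The person looks {emotion}."
    raise ValueError(f"unknown mode: {mode}")
```

In the real device this output would drive the vibration motor or a text-to-speech call; the sketch isolates only the category-to-cue mapping.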
dc.language.iso en en_US
dc.publisher Palestine Polytechnic University - Information Security en_US
dc.subject Blind People en_US
dc.subject EmoSense en_US
dc.title EmoSense en_US
dc.title.alternative Emotion Recognition For Blind People en_US
dc.type Other en_US

