Abstract:
In the current era of the COVID-19 pandemic, it has become harder than ever to attend
events at distant locations in person. International flights are suspended much of the time,
yet people still have connections and events they want to attend. Current communication
media only allow a participant to see and hear what happens on the other side, offering
minimal means of interacting with the distant location. In this project, we aim to add a
level of involvement for participants at their desired remote location: we would like to
allow them to look around the room freely and to perform everyday gestures such as
nodding, turning their head left and right, and the other natural movements they make
while communicating in daily life.
Our work was to create a system consisting of two main parts, each at a different location,
that communicate with each other over the internet. At the first location is a user who
would like to view and interact with the second location. This user wears a head-mounted
headset containing sensors that measure the orientation of the user's head. At the second
location is a stand with three motors controlled by a NodeMCU microcontroller. The purpose
of this stand is to realistically mimic the user's head movements: each of the three motors
is responsible for rotation about one of the three axes.
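As a minimal sketch of how such a stand could translate head orientation into motor
commands, the helper below maps one orientation angle to a servo pulse width. It assumes
standard hobby servos driven with 1000–2000 µs pulses (1500 µs neutral) and a ±90° range
per axis; the actual motors, ranges, and NodeMCU pin wiring in the project may differ, and
the function name is ours, not from the report.

```cpp
#include <algorithm>
#include <cmath>

// Map a head-orientation angle in degrees (-90..+90) to a hobby-servo
// pulse width in microseconds (1000..2000, 1500 = neutral position).
// Out-of-range sensor readings are clamped so the motor never over-travels.
int angleToPulseUs(double angleDeg) {
    angleDeg = std::clamp(angleDeg, -90.0, 90.0);
    return static_cast<int>(std::lround(1500.0 + angleDeg * (500.0 / 90.0)));
}
```

On the NodeMCU side, the same mapping would be applied independently to each of the
three received angles (one per motor/axis) on every update from the headset.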