In this project we set out to implement an application for video-based guidance. We present the design and implementation of an asymmetric application that runs between two Android phones over WiFi, along with an illustration of how a person with the application (the client) can choose another person with the same application (the guide) and seek guidance. The client connects to a particular guide by choosing from a list of active guides, then streams video from his/her phone camera to the guide and requests guidance. The guide sends signals back to the client; if these signals carry predefined meanings, they serve as an effective guidance mechanism. The application supports two modes of guidance, using the video streamed from the client to the guide as a reference:

  • Text commands sent as a response from the guide
  • Gestures drawn over the video as a feedback mechanism
In the case of text messages, the instructions are typed into a text box and sent to the client. In the case of gestures, the guide draws directions or instructions on a canvas and sends them to the client, who then views the drawn instructions overlaid on the video (see the sketch below). This project helps us explore how such an application can be implemented and understand the technologies used to build it. It also illustrates how the login details of clients and guides are stored in the database.
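
As an illustration of the gesture mechanism, the following is a minimal sketch of a transparent overlay view on the guide's side. The class name (DrawOverlayView), the stroke styling, and the idea of later serializing the touch points for transmission are assumptions for illustration, not taken from the project code; the project's actual drawing and transfer logic may differ.

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.graphics.Path;
    import android.view.MotionEvent;
    import android.view.View;

    // Transparent view placed on top of the video view (e.g. inside a FrameLayout).
    // The guide draws with a finger; the strokes are rendered over the video and
    // the recorded points could be sent to the client to be redrawn there.
    public class DrawOverlayView extends View {

        private final Path path = new Path();    // strokes drawn so far
        private final Paint paint = new Paint(); // stroke style

        public DrawOverlayView(Context context) {
            super(context);
            paint.setColor(Color.RED);
            paint.setStyle(Paint.Style.STROKE);
            paint.setStrokeWidth(8f);
            paint.setAntiAlias(true);
        }

        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            canvas.drawPath(path, paint);        // render the guide's gesture
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                    path.moveTo(event.getX(), event.getY());
                    return true;
                case MotionEvent.ACTION_MOVE:
                    path.lineTo(event.getX(), event.getY());
                    invalidate();                // redraw with the new segment
                    return true;
                default:
                    return false;
            }
        }
    }

Stacking such an overlay above the video surface keeps the drawing logic separate from the streaming logic, which is one straightforward way to realize the "gestures over the video" feedback described above.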

Team Members