Oculus app content. In order to use the Oculus Rift, we needed an environment providing the various tools required to program it. The SDK offered by the manufacturer is available only for Windows, so we had to find a version suitable for use on a UNIX system. Fortunately, a project initiated by Brad Davis lets us work with all of the Oculus Rift's motion sensors; it contains all the libraries and files needed to manipulate and program a virtual-reality headset. As a result, we were able to read in the terminal the accelerometer values needed to continue our project. With the measurements from the headset's various sensors now available, the next step was to use them in our project: in particular, to send these values to the camera system and to compute positions from the current sensor readings.
VR headset. We needed to determine how the Oculus and the rest of the system would communicate. To do this, we modified a program written in C++ that reads the accelerometer values from the Oculus. In addition to extracting the values, the program now writes them to a file. For this we drew inspiration from the network bridge we had developed in C during our cryptography classes.
VR video games. Thanks to this file (val.data) containing the accelerometer values, any program able to read the file can recover them. We therefore wrote a C program that reads the values once at startup, saves them as the "initial values", and then sends over the network the difference between these initial values and each newly read value. This lets the program establish a starting position and treat every movement as an offset from that position. On the Raspberry Pi, a second program receives the values sent over the network, applies a threshold, and converts the result into commands for the servomotors.
To carry our modules on the drone, we had to build a structure capable of holding all of our equipment. First, we designed a mount for attaching the cameras to the drone. This module was made as light as possible while still housing the electronics and supporting the camera system. It also had the advantage of centering the mass well and providing some stability in flight, since it hung below the drone (gondola prototype; see the appendix). We later realized that this solution was not workable, since the two mounts of the previous camera were fixed. We therefore changed the structure to use several compartments: one for the battery and one for the Raspberry Pi.
We finish the project with an overall positive assessment, since we largely answered the initial problem. Beyond finding a solution for each of our modules, the main effort of this project was the integration work. As far as possible, we tried to design the modules to be simple and independent, so that each could be developed on its own. Even though some modules still need refinement, we managed to stream the drone's stereoscopic 3D image to the web server, to translate the headset's movements to our camera system by sending data through a client/server pair, and, finally, to observe the 3D rendering by viewing our web page through the Oculus Rift.