With innovations in the telecom domain and in sensor technology, today's handheld devices are equipped with a variety of sensors that provide sophisticated functionality and make these gadgets powerful and easy to work with.
The accelerometer is one of the sensors commonly available on handheld devices such as mobile phones, tablet PCs and notebooks. Combined with a gesture recognition framework for mobile applications, it enables intuitive interaction with a rich user experience, keeping things simple yet engaging. The accelerometer can drive a wide range of applications and games on mobile devices and can also be integrated with desktop applications.
The goal of this project is to build an accelerometer-based gesture recognition framework that integrates seamlessly with any desktop application. With this framework in place, a user can remotely play games, create drawings, and control key-event-driven applications. Because the framework is generic, it can be integrated with any target application, new or legacy, whether or not that application exposes APIs, and it requires no changes on the application side.
Racing games are one application of integrating the accelerometer with desktop software: the user controls racing objects with gestures from a handheld device, steering the car left, right, front or back simply by making the corresponding gesture on the mobile device.
An NFS racing game with the Gesture Recognition Framework comprises four modules.
- Accelerometer module on mobile device
The accelerometer module running on the user's mobile device initializes the accelerometer hardware and starts listening for its output, which reports the device's acceleration due to gravity in three dimensions. This reveals which way the user is tilting the phone, and the information is fed to the gesture recognition algorithm, which processes raw accelerometer values into useful steering directions. This steering information then controls the racing objects on the user's PC.
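As a minimal sketch of the tilt-to-steering step described above (the axis conventions, function names, and threshold value are assumptions for illustration, not the project's actual code):

```python
# Sketch: classify raw accelerometer output (gravity components in m/s^2)
# into steering directions. The threshold and axis mapping are assumed.

TILT_THRESHOLD = 3.0  # how much gravity must leak into an axis to count as a tilt


def steering_direction(x, y, z):
    """Classify a single (x, y, z) gravity sample into a steering command."""
    if x <= -TILT_THRESHOLD:
        return "LEFT"
    if x >= TILT_THRESHOLD:
        return "RIGHT"
    if y >= TILT_THRESHOLD:
        return "BACK"
    if y <= -TILT_THRESHOLD:
        return "FRONT"
    return "NEUTRAL"
```

For example, a sample like `steering_direction(-5.2, 0.4, 8.1)` would be classified as a left tilt, while a phone lying flat (gravity almost entirely on the z axis) stays neutral.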
- Bluetooth server running on user’s PC
The Bluetooth server running on the user's PC publishes text/file transfer services, which are discovered by the Bluetooth client application running on the user's device. Once the service is discovered, the client establishes a connection with the server and starts pushing accelerometer data to the server socket. The server then receives the client's data through a series of "receive" calls.
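Since each "receive" call can return a partial packet, the server side has to reassemble samples from the byte stream. A sketch of that framing logic (the wire format of three little-endian floats per sample is an assumption):

```python
import struct

SAMPLE_FORMAT = "<fff"  # assumed wire format: x, y, z as little-endian floats
SAMPLE_SIZE = struct.calcsize(SAMPLE_FORMAT)  # 12 bytes per sample


def unpack_samples(buffer, chunk):
    """Append a received chunk to the buffer and extract complete samples.

    Returns (samples, leftover), where leftover holds any partial packet
    to be carried into the next receive call.
    """
    buffer += chunk
    samples = []
    while len(buffer) >= SAMPLE_SIZE:
        samples.append(struct.unpack(SAMPLE_FORMAT, buffer[:SAMPLE_SIZE]))
        buffer = buffer[SAMPLE_SIZE:]
    return samples, buffer
```

The server's receive loop would call `unpack_samples` on every chunk it reads from the socket, feeding each complete (x, y, z) sample to the gesture recognition framework.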
- Bluetooth Client running on mobile device
The Bluetooth client running on the mobile device searches for peer Bluetooth servers publishing various services; in this case it looks for file/text transfer services. After discovering the service, the client connects to the Bluetooth server and starts sending accelerometer data through a series of "send" calls.
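The client's send loop is the mirror image of the server's receive loop. A sketch (the packet layout must match the server side; the socket is only assumed to expose a `sendall`-style method, as most socket APIs do for RFCOMM connections):

```python
import struct

SAMPLE_FORMAT = "<fff"  # assumed wire format, matching the server side


def encode_sample(x, y, z):
    """Pack one accelerometer reading for transmission."""
    return struct.pack(SAMPLE_FORMAT, x, y, z)


def send_samples(sock, samples):
    """Push a sequence of (x, y, z) readings with repeated send calls."""
    for x, y, z in samples:
        sock.sendall(encode_sample(x, y, z))
```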
- Gesture Recognition Framework
The Gesture Recognition Framework running on the user's PC receives gesture information from the Bluetooth server module. The intelligence to interact with and take control of the target application is built into this framework. Once the framework takes control of the target application, the user can operate it remotely, using the handheld device as a joystick.
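The "works with any target application" property rests on synthesizing ordinary key events rather than calling application APIs. A sketch of the gesture-to-key dispatch (the key bindings and injector interface are assumptions; a real injector would wrap a platform facility such as `SendInput` on Windows or the XTest extension on X11, which deliver key events the target application cannot distinguish from a physical keyboard):

```python
# Assumed mapping from steering directions to key names; a real framework
# would likely make these bindings configurable per target application.
KEY_BINDINGS = {
    "LEFT": "ArrowLeft",
    "RIGHT": "ArrowRight",
    "FRONT": "ArrowUp",
    "BACK": "ArrowDown",
}


def dispatch(direction, inject_key):
    """Translate a steering direction into a synthetic key press.

    inject_key is a callable performing the platform-specific key
    injection; NEUTRAL (or any unbound direction) injects nothing.
    Returns the key name that was injected, or None.
    """
    key = KEY_BINDINGS.get(direction)
    if key is not None:
        inject_key(key)
    return key
```

Because the framework only emits standard key events, the target application needs no modification and no exposed API, whether it is new or legacy.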
Future enhancements planned for this project:
- Multiplayer options
- NFC integration
- Enabling tap events alongside gestures for enhanced control
Suggestions and feedback on this project are most welcome and encouraging. Any developers interested in working on these future enhancements can reach me at firstname.lastname@example.org