There are many applications in which a computer-generated object must be overlaid onto a real scene in real time, which requires accurate measurement of the position of the camera or headset. Existing methods require bulky hardware, severely limiting their usability. The objective of this project is to develop and implement a system for determining the position, orientation, and focal length of a camera in real time, by analysing the camera images and exploiting unobtrusive inertial motion sensors. This will enable the system as a whole to determine its location and orientation in a very natural way, mimicking the way humans orient themselves: using the vestibular organ (in the inner ear), which is essentially an inertial measurement unit, and the eyes, which are essentially comparable to a camera. The result of the project will be a marker-free tracking system that runs at a high frame rate on a low-performance computing unit. It will allow the capture of camera motion for TV production and of head motion for mobile augmented reality applications. In particular, the system will work over a large area, in indoor as well as outdoor environments. By providing this unique technology, the project will act as a strong enabling force for the wider deployment of augmented reality in application areas including content production, education, cultural heritage, and industry.
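
The abstract gives no implementation details, but the fusion principle it describes, fast inertial measurements corrected by slower camera-based observations, is commonly realised with a complementary filter. The sketch below illustrates that idea in Python for a single rotation axis; all function names, rates, and numeric values are illustrative assumptions, not part of the project.

```python
def fuse_orientation(gyro_rate, dt, vision_angle, prev_angle, alpha=0.98):
    """Complementary filter for one rotation axis.

    High-rate gyroscope integration tracks fast motion but drifts;
    the (slower, drift-free) vision estimate pulls it back.
    """
    gyro_angle = prev_angle + gyro_rate * dt      # integrate angular rate
    return alpha * gyro_angle + (1 - alpha) * vision_angle

# Simulated example: true rotation of 10 deg/s, gyro with a 0.5 deg/s bias,
# vision providing an unbiased angle estimate at every step.
angle = 0.0        # fused estimate
true_angle = 0.0
dt = 0.01
for step in range(1000):
    true_angle += 10.0 * dt
    gyro_rate = 10.0 + 0.5    # biased gyro reading
    angle = fuse_orientation(gyro_rate, dt, true_angle, angle)

# Pure gyro integration would drift by ~5 degrees over these 10 seconds;
# the fused estimate's error stays bounded (well under 1 degree).
print(abs(angle - true_angle))
```

A full visual-inertial system would of course fuse full 3D pose (and here focal length) with something like an extended Kalman filter, but the division of labour is the same: inertial sensors for high-frequency motion, image analysis to cancel drift.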