A novel real-time algorithm for head and hand
tracking is proposed in this paper. The approach is based on data from a range camera, which is exploited to resolve ambiguities and overlaps. The head position is estimated by depth-based template matching, whose robustness is reinforced with an adaptive search zone. Hands are detected in a bounding box attached to the head estimate, so that the user may move freely in the scene. A simple method to decide whether each hand is open or closed is also included. Experimental results show high robustness against partial occlusions and fast movements. Accurate hand trajectories can be extracted from the estimated hand positions and used for interactive applications as well as for gesture classification.
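The depth-based template matching with a local search zone mentioned above can be sketched as follows. This is a minimal illustration under assumed details, not the authors' implementation: the function name `match_template_depth`, the sum-of-squared-differences cost, and the fixed square search radius are all hypothetical stand-ins (the paper's adaptive search zone would vary this radius over time).

```python
import numpy as np

def match_template_depth(depth, template, center, radius):
    """Slide a depth template over a square search zone around `center`
    and return the top-left (row, col) minimizing the sum of squared
    depth differences (SSD). All quantities are in depth-map units."""
    th, tw = template.shape
    cy, cx = center
    # Clamp the search zone to positions where the template fits entirely.
    y0, y1 = max(0, cy - radius), min(depth.shape[0] - th, cy + radius)
    x0, x1 = max(0, cx - radius), min(depth.shape[1] - tw, cx + radius)
    best_cost, best_pos = np.inf, None
    for y in range(y0, y1 + 1):
        for x in range(x0, x1 + 1):
            cost = np.sum((depth[y:y+th, x:x+tw] - template) ** 2)
            if cost < best_cost:
                best_cost, best_pos = cost, (y, x)
    return best_pos

# Toy example: embed a "head" patch in a flat depth map and recover it.
depth = np.full((40, 40), 2000.0)   # background at 2 m (millimetres)
template = np.full((8, 8), 1200.0)  # head template at 1.2 m
depth[10:18, 14:22] = template      # place the head in the scene
print(match_template_depth(depth, template, center=(12, 16), radius=6))  # → (10, 14)
```

Restricting the search to a zone around the previous estimate is what makes this kind of matching fast enough for real-time use; only the zone's size is adapted, not the full frame scanned.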