While brain signals potentially provide valuable information about a user, it is not straightforward to derive this information and use it to smooth human-machine interaction in a real-life setting. Here we propose to predict head rotation from brain signals in order to improve the images presented in a Head Mounted Display (HMD). Previous studies on arm and leg movements suggest that this is possible, and a pilot study showed promising results. From the perspective of the field of Brain-Computer Interfaces (BCI), this application provides a good case for putting the field's achievements to the test and for developing them further in the context of a real-life application. The main reason is that, within the proposed application, acquiring accurately labeled training data (whether and which head movement took place) and monitoring the quality of the predictive model can happen on the fly. From the perspective of HMD technology and Intelligent User Interfaces, the proposed BCI potentially improves user experience and enables new types of immersive applications.