Algorithmic detection of eye contact in driver-pedestrian interactions


Abstract

Pedestrians are among the most vulnerable road users in urban traffic. Clear communication between drivers and pedestrians is one way to improve their safety. Non-verbal communication in particular plays an important role in road safety, and eye contact is a form of non-verbal communication with the potential to reduce on-road collisions. However, with the advent of automated vehicles, driver-pedestrian eye contact loses its meaning, since there is no longer a driver. It is therefore useful to study and detect eye contact so that the knowledge obtained may be applied to the automated vehicles of the future. To this end, the following research questions were adopted: (a) What is eye contact between a pedestrian and the driver of a car, and how can eye contact be defined and operationalized in an algorithm? (b) How accurate is the algorithm that operationalizes eye contact? (c) How can two eye-trackers with inertial measurement units (IMUs) and pedestrian recognition in a Toyota Prius be used to reconstruct the entire driver-pedestrian interaction as a 3-D animation? An indoor experiment, designed to resemble a driver-pedestrian interaction at a pedestrian crossing, was conducted with 31 participants. Participants' (pedestrians') eyes were tracked using a Tobii Pro Glasses 2 eye-tracker and the researcher's (driver's) eyes were tracked using a Smart Eye Pro dx eye-tracker,
both of which were synchronized. Participants' locations were also tracked using a stereo camera with pedestrian detection capabilities. Pedestrians imagined that they were on a real road and performed six types of trials
in which they stood on, or crossed from, the left or right curb in front of the stationary vehicle, while either making or not making eye contact with the driver. The order of the trials was randomized, and each trial consisted of three repetitions of a driver-pedestrian interaction. Eye contact was defined as the driver and the pedestrian looking at each other at the same time. Significant differences in the percentages of eye contact were found between pedestrians standing on the left (median duration of 0.42 s) and the right (median duration of 0.54 s). No significant differences in the percentages of eye contact were found between pedestrians crossing from the left (median duration of 1.23 s) and the right (median duration of 1.39 s). Eye contact instants within trials were detected algorithmically by computing the angle between the 3-D gaze direction vectors of the driver and the pedestrian and comparing it to an 'eye contact threshold'. Trials were classified as involving or not involving eye contact based on their percentage of eye contact instants. The classification performance of the algorithm was quantified against two ground truths: (1) imposed eye contact (in half of the trials, participants were instructed to make eye contact; in the other half, they were instructed not to), and (2) manually annotated areas of interest (AOIs) from the Tobii Pro Glasses 2 showing pedestrian eye contact seeking. The algorithm's performance was found to be fair to poor, and eye contact could be detected with an accuracy of 15-30°. A 3-D reconstruction of the driver-pedestrian interaction was achieved (in the form of an animation) using the locations, head orientations, and gaze directions of the driver and the pedestrian.
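The detection step described above, thresholding an angle derived from the two 3-D gaze directions, can be sketched as follows. This is a minimal illustration, not the thesis implementation: the function names, the mutual-gaze formulation (comparing each person's gaze vector against the line connecting the two head positions), and the 5% classification cutoff are assumptions made for the example.

```python
import numpy as np

def angle_deg(u, v):
    """Angle in degrees between two 3-D vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def is_eye_contact(driver_pos, driver_gaze, ped_pos, ped_gaze,
                   threshold_deg=15.0):
    """One 'eye contact instant': both gaze vectors point at the other
    person's head within the angular threshold (assumed formulation)."""
    d2p = np.asarray(ped_pos, float) - np.asarray(driver_pos, float)
    return bool(angle_deg(driver_gaze, d2p) <= threshold_deg
                and angle_deg(ped_gaze, -d2p) <= threshold_deg)

def classify_trial(instant_flags, cutoff=0.05):
    """Label a trial as 'eye contact' when the fraction of eye-contact
    instants exceeds a cutoff (cutoff value is a placeholder)."""
    return bool(np.mean(instant_flags) >= cutoff)
```

For example, a driver at the origin gazing along +x and a pedestrian at (5, 0, 0) gazing along -x are looking at each other, so `is_eye_contact` returns `True`; if the pedestrian instead gazes along +y, it returns `False`.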
This thesis provides objective measurements of driver-pedestrian eye contact and demonstrates how eye contact may be detected and reconstructed for use in automated vehicles of the future.