
Driver’s lip detection method under non-restraint conditions

Journal of Chang’an University (Natural Science Edition) [ISSN: 1006-6977 / CN: 61-1281/TN]

Issue:
2016, No. 05
Page:
79-87
Research Field:
Traffic Engineering
Publishing date:

Info

Title:
Driver’s lip detection method under non-restraint conditions
Author(s):
CHENG Wen-dong FU Rui MA Yong ZHANG Ming-fang LIU Tong
1. Key Laboratory of Automotive Transportation Safety Technology of Ministry of Transport, Chang’an University, Xi’an 710064, Shaanxi, China; 2. School of Mechatronic Engineering, Xi’an Technological University, Xi’an 710032, Shaanxi, China
Keywords:
traffic engineering; driver behavior surveillance; machine vision; illumination equalization; skin model; 2D-OTSU
PACS:
U491
DOI:
-
Abstract:
Lip recognition based on machine vision is one of the key techniques of driver behavior surveillance. The lip region can be detected with a desirable recognition rate under ideal conditions, including good illumination and a stationary head and lip. However, non-restraint conditions, including unequalized illumination, head rotation and lip motion, seriously restrict lip recognition efficiency. To adapt to these unconstrained conditions, face detection and lip segmentation methods for real driving situations were put forward. Firstly, an illumination equalization method based on the bilateral filter and single-scale Retinex algorithms (BF-SSR) was proposed to enhance the clustering stability of skin color under unequalized illumination. A dynamic skin model based on the Adaboost and YCb′Cr′ algorithms was then set up for face recognition. Finally, a multi-color accumulation strategy was introduced to enhance the threshold separability between lip and face, and a dimension-reduced 2D-OTSU criterion was proposed for lip segmentation. The results show that the method can detect the driver’s lip region efficiently. The proposed algorithm is robust to the lip shadows and edge blur caused by unequalized illumination and head pose, and is suitable for speaking and yawning detection in real driving environments.
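The BF-SSR illumination equalization step can be sketched as follows: a single-scale Retinex in which the usual Gaussian surround is replaced by a bilateral-filter estimate of the illumination, so edges are preserved while shading is removed. This is a minimal illustration of the general technique, not the paper's implementation; the filter parameters and the brute-force loop are assumptions chosen for clarity.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter on a 2-D float image in [0, 1].

    Each output pixel is a weighted mean of its neighbourhood, where the
    weight combines spatial closeness and intensity similarity."""
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx**2 + yy**2) / (2 * sigma_s**2))  # spatial kernel
    out = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

def bf_ssr(img, eps=1e-6):
    """Single-scale Retinex with a bilateral illumination estimate:
    R = log(I) - log(BF(I)), rescaled to [0, 1]."""
    illum = bilateral_filter(img)
    r = np.log(img + eps) - np.log(illum + eps)
    return (r - r.min()) / (r.max() - r.min() + eps)
```

In practice the output would feed the skin-color clustering stage, since dividing out the estimated illumination makes skin pixels cluster more tightly under uneven lighting.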
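The paper builds a dynamic skin model in a modified YCb′Cr′ space whose thresholds adapt per frame. As a baseline illustration only, the sketch below shows the common static version of the idea: convert RGB to BT.601 chrominance and keep pixels inside a fixed Cb/Cr box. The box limits are widely used textbook values, an assumption here, not the paper's adapted model.

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Static skin classifier: BT.601 RGB -> Cb/Cr, then a fixed
    chrominance box. The paper's dynamic YCb'Cr' model adapts this
    region per frame; the box below is a static baseline."""
    rgb = np.asarray(rgb, dtype=np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)
```

A detector such as Adaboost would first localize the face, and the skin model would then refine the region before lip segmentation.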
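The final segmentation stage rests on the Otsu criterion; the paper reduces the 2-D histogram of (pixel value, neighbourhood mean) to one dimension before thresholding. The sketch below shows only the underlying 1-D criterion, maximizing between-class variance over candidate thresholds; the dimension-reduction and multi-color accumulation steps are omitted.

```python
import numpy as np

def otsu_threshold(gray, bins=256):
    """Classic 1-D Otsu: choose the threshold that maximizes the
    between-class variance of the two resulting classes. Input values
    are assumed to lie in [0, 1]."""
    hist, edges = np.histogram(gray, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    omega = np.cumsum(p)                                   # P(class 0)
    mu = np.cumsum(p * (np.arange(bins) + 0.5) / bins)     # class-0 mean mass
    mu_t = mu[-1]                                          # global mean
    sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega) + 1e-12)
    k = int(np.argmax(sigma_b))
    return edges[k + 1]
```

For bimodal data the returned threshold lands between the two modes, which is exactly the separability the multi-color accumulation strategy is meant to enhance before this criterion is applied.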


Last Update: 2016-10-06