WO2006087812A1 - Image processing method, image processing system, image processing apparatus, and computer program - Google Patents
Image processing method, image processing system, image processing apparatus, and computer program
- Publication number
- WO2006087812A1 (PCT/JP2005/002633)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging
- image processing
- light
- imaging target
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/141—Control of illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
Definitions
- Image processing method, image processing system, image processing apparatus, and computer program
- The present invention relates to an image processing method for detecting a specific imaging target from an image obtained by imaging, an image processing system to which the image processing method is applied, an image processing apparatus used in the image processing system, and a computer program for realizing the image processing apparatus, and more particularly to an image processing method, an image processing system, an image processing apparatus, and a computer program for determining the presence or absence of an imaging target.
- Patent Document 1 JP 2004-234367 A
- A conventional image processing apparatus executes the process of detecting the imaging target on the assumption that the imaging target exists within the range of the image obtained by imaging.
- When no imaging target is present, the detection process is nevertheless executed against the background image, so the background may be erroneously recognized as the target.
- As a result, abnormalities such as an increase in processing load caused by unnecessarily repeated detection processing can occur.
- The present invention has been made in view of such circumstances. It compares an image captured while light is irradiated in the imaging direction with an image captured while light is not irradiated, and determines from the comparison whether an imaging target is present in the image. It is an object of the present invention to provide an image processing method capable of reducing the possibility of abnormalities occurring during image processing, an image processing system using the image processing method, an image processing apparatus used in the image processing system, and a computer program for realizing the image processing apparatus.
- An image processing method according to the present invention is an image processing method using an image processing apparatus that detects a specific imaging target from an image obtained by imaging. It is characterized by determining the presence or absence of an imaging target in the image based on the result of comparing an image captured while light is irradiated in the imaging direction with an image captured while light is not irradiated.
- According to the present invention, when an imaging target is present, a difference arises between the images depending on the light irradiation state, so the presence or absence of the imaging target can be determined. Therefore, even when no imaging target exists in the image, it is possible to prevent abnormalities such as erroneous recognition of the background image as the target and an increase in processing load caused by unnecessarily repeated detection processing.
- An image processing system according to the present invention comprises imaging means, irradiating means that irradiates light in the direction in which the imaging means captures images, and an image processing apparatus that detects a specific imaging target from the image obtained by the imaging.
- The image processing apparatus comprises comparing means for comparing an image captured while light is irradiated with an image captured while light is not irradiated, and determining means for determining the presence or absence of an imaging target in the image based on the comparison result.
- According to the present invention, when an imaging target is present, a difference arises between the images depending on the light irradiation state, so the presence or absence of the imaging target can be determined. Therefore, even when no imaging target exists in the image, it is possible to prevent abnormalities such as erroneous recognition of the background image as the target and an increase in processing load caused by unnecessarily repeated detection processing.
- An image processing system according to the present invention is, in the second invention, characterized in that the image processing apparatus further comprises means for obtaining the luminance of each captured image, the comparing means compares the luminance of the respective images, and the determining means determines that there is no imaging target when the difference in luminance between the images is less than a reference value.
- According to the present invention, the brightness of the image changes greatly depending on the light irradiation state when an imaging target is present, so the presence or absence of the imaging target can be determined, and the occurrence of abnormalities can be prevented even when no imaging target exists in the image.
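The luminance-difference determination described above can be sketched as follows. This is a minimal illustration only; the function names, the grayscale representation, and the threshold value are assumptions for the example, not taken from the patent.

```python
# Sketch: decide presence by comparing average luminance of an
# illuminated frame with a non-illuminated frame.
# Images are lists of pixel rows; luminance values are 0-255 grayscale.

def average_luminance(image):
    """Mean luminance over all pixels of a grayscale image."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def target_present(lit_image, unlit_image, diff_threshold=20.0):
    """A nearby target reflects the irradiated light, so the lit frame
    is noticeably brighter; a small difference suggests only the
    distant background is in view (threshold is illustrative)."""
    diff = average_luminance(lit_image) - average_luminance(unlit_image)
    return diff >= diff_threshold

# Toy frames: a reflective target raises luminance when lit.
lit = [[120, 130], [140, 150]]
unlit = [[60, 70], [80, 90]]
print(target_present(lit, unlit))    # brighter when lit -> True
print(target_present(unlit, unlit))  # no difference -> False
```

The same comparison appears later in the embodiment as step S10 of the flowchart.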
- An image processing system according to the present invention is characterized in that, in the third invention, the luminance is an average luminance of the pixels constituting the image.
- An image processing system according to the present invention is, in any one of the second to fourth inventions, characterized in that the image processing apparatus further comprises means for determining the presence or absence of an imaging target in the image based on an absolute luminance derived from the average luminance of the pixels constituting the image, the irradiation conditions of the irradiating means, and the imaging conditions of the imaging means.
- According to the present invention, by determining the presence or absence of an imaging target based on the absolute luminance, the influence of external light such as sunlight can be suppressed, so the determination accuracy can be improved.
- In the image processing system according to the present invention, the image processing apparatus determines whether or not a specific imaging target can be detected from an image obtained by imaging.
- The comparing means and the determining means are configured not to be executed when the specific imaging target is detected.
- An image processing system according to the present invention is, in any one of the second to sixth inventions, characterized in that the irradiating means is configured to irradiate light having a wavelength in the near-infrared region, and the imaging means has a filter that transmits light having a wavelength in the near-infrared region.
- According to the present invention, by irradiating light in the imaging direction, mainly the distance between the imaging means and the imaging target is reflected in the luminance of the image, and the influence on luminance of the contrast of background images such as a seat pattern can be suppressed, so the determination accuracy can be improved.
- An image processing system according to the present invention is, in any one of the second to seventh inventions, characterized by comprising an imaging device that includes the imaging means and the irradiating means and is connected to the image processing apparatus.
- An image processing apparatus according to the present invention is an image processing apparatus that detects a specific imaging target from an image obtained by imaging. It is characterized by comprising means for comparing an image captured while light is irradiated in the imaging direction with an image captured while light is not irradiated, and means for determining the presence or absence of an imaging target in the image based on the comparison result.
- According to the present invention, when an imaging target is present, a difference arises between the images depending on the light irradiation state, so the presence or absence of the imaging target can be determined. Therefore, even when no imaging target exists in the image, it is possible to prevent abnormalities such as erroneous recognition of the background image as the target and an increase in processing load caused by unnecessarily repeated detection processing.
- A computer program according to the present invention causes a computer connected to imaging means and irradiating means that irradiates light in the direction in which the imaging means captures images to detect a specific imaging target from an image obtained by imaging.
- The computer program causes the computer to execute a procedure for comparing an image captured while light is irradiated with an image captured while light is not irradiated, and a procedure for determining the presence or absence of an imaging target in the image based on the comparison result.
- According to the present invention, the computer operates as an image processing apparatus when the computer program is executed. When an imaging target is present, a difference arises between the images depending on the light irradiation state, so the presence or absence of the imaging target can be determined. Therefore, even when no imaging target exists in the image, it is possible to prevent abnormalities such as erroneous recognition of the background image as the target and an increase in processing load caused by unnecessarily repeated detection processing.
- An image processing method, an image processing system, an image processing apparatus, and a computer program according to the present invention are applied to, for example, a system using an in-vehicle camera mounted on a vehicle and imaging a driver's face.
- In the present invention, the average luminance of the image captured while light was irradiated is compared with the average luminance of the image captured while light was not irradiated, and when the difference is less than a reference value, it is determined that there is no imaging target in the image.
- In addition, the influence of light variation caused by camera control can be suppressed by determining the presence or absence of an imaging target based on the absolute luminance, so the present invention has excellent effects such as improved determination accuracy.
- Further, in the image processing system and the like, by irradiating light in the imaging direction, mainly the distance between the imaging means and the imaging target is reflected in the luminance of the image, and the influence on luminance of the contrast of background images such as a seat pattern can be suppressed, so the determination accuracy can be improved.
- FIG. 1 is a perspective view showing an appearance of an imaging device used in the image processing system of the present invention.
- FIG. 2 is a block diagram illustrating a configuration example of an image processing system according to the present invention.
- FIG. 3 is a flowchart showing processing of the image processing apparatus used in the image processing system of the present invention.
- FIG. 4 is a flowchart showing processing of the image processing apparatus used in the image processing system of the present invention.
- FIG. 1 is a perspective view showing an external appearance of an imaging device used in the image processing system of the present invention.
- reference numeral 1 denotes an imaging device such as an in-vehicle camera mounted on a vehicle.
- In the imaging device 1 illustrated in FIG. 1, an imaging unit 11 using an imaging element such as a CCD (Charge Coupled Device) is disposed at the center of one surface of a rectangular parallelepiped casing 10, and a plurality of irradiation units 12 using light emitting elements such as LEDs (Light Emitting Diodes) are disposed around the imaging unit 11.
- The imaging unit 11 and the irradiation unit 12 are arranged on the same surface of the casing 10 of the imaging device 1 so that the imaging direction and the irradiation direction coincide with each other.
- the irradiation unit 12 is configured to emit light having a wavelength in the near infrared region such as 870 nm using a semiconductor chip such as GaAlAs.
- A filter 11a that selectively transmits light having a wavelength in the near-infrared region, made of a material such as methacrylic resin, is incorporated in the light receiving unit of the imaging unit 11.
- By using near-infrared light, which is invisible, as the light emitted by the irradiation unit 12, the driver who receives the light does not notice it, and driving is not hindered.
- Since the filter 11a selectively transmits light having a wavelength in the near-infrared region and cuts visible light, the imaging unit 11 can perform processing without being affected by external light.
- the imaging device 1 is arranged in front of the driver, such as a steering wheel and a dashboard in the vehicle, in a state in which the driver's face can be imaged as an imaging target.
- the near-infrared light is irradiated and the imaging unit 11 takes an image.
- Various conditions including incorporation of the imaging unit 11 and the irradiation unit 12 into the same device are merely examples, and can be appropriately set according to the system configuration, purpose, and the like.
- the apparatus including the imaging unit 11 and the apparatus including the irradiation unit 12 may be separated and arranged at different positions.
- FIG. 2 is a block diagram showing a configuration example of the image processing system of the present invention.
- The imaging device 1 and an image processing apparatus 2, which uses an in-vehicle computer to perform image processing such as detection of the imaging target from the image obtained by the imaging of the imaging device 1, are connected by a communication line such as a dedicated cable, or by a communication network such as an in-vehicle LAN (Local Area Network) configured by wire or wirelessly.
- The imaging device 1 includes an MPU (Micro Processor Unit) 13 that controls the entire device, a ROM (Read Only Memory) 14 that records various computer programs and data executed under the control of the MPU 13, and a RAM (Random Access Memory) 15 that stores various data temporarily generated when the computer programs are executed.
- Further, the imaging device 1 includes the imaging unit 11 and the irradiation unit 12 described above, an A/D converter 16 that converts the analog image data obtained by the imaging of the imaging unit 11 into digital data, a frame memory 17 that temporarily stores the image data converted into digital form by the A/D conversion, and a communication interface 18 used for communication with the image processing apparatus 2.
- In the imaging device 1, the imaging target irradiated with near-infrared light by the irradiation unit 12 is imaged by the imaging unit 11, and, for example, 30 frames of image data per second are generated based on the imaging and output to the A/D converter 16.
- In the A/D converter 16, the image data is converted into digital image data in which each pixel is represented by a gradation such as 256 levels (1 byte). The image data converted into digital form is stored in the frame memory 17, and the stored image data is output to the image processing apparatus 2 at a predetermined timing.
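As an illustration of the digitization just described, the sketch below maps an analog pixel value onto a 256-level (1-byte) gradation. The function name and the full-scale parameter are assumptions for the example, not part of the patent.

```python
def quantize(analog_value, full_scale):
    """Map an analog pixel level in [0, full_scale] to an 8-bit
    gradation (0-255), as the A/D converter 16 does conceptually."""
    level = int(analog_value / full_scale * 255)
    return max(0, min(255, level))  # clamp to the 1-byte range

print(quantize(0.5, 1.0))  # mid-scale -> 127
print(quantize(1.0, 1.0))  # full scale -> 255
```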
- Imaging condition data indicating imaging conditions such as the shutter speed and gain value of the imaging unit 11 at the time of imaging, and irradiation condition data indicating irradiation conditions such as the irradiation time of the irradiation unit 12, are also output to the image processing apparatus 2. Furthermore, the imaging device 1 receives, from the image processing apparatus 2 at predetermined timings, various data relating to the control of circuits such as the imaging conditions of the imaging unit 11 and the irradiation conditions of the irradiation unit 12, and controls the imaging unit 11, the irradiation unit 12, and other circuits based on the received data.
- The image processing apparatus 2 includes a CPU (Central Processing Unit) 21 that controls the entire apparatus, an auxiliary storage unit 22 such as a CD-ROM drive that reads information from a recording medium 4 such as a CD-ROM on which various information such as the computer program 3 and data of the present invention is recorded, a hard disk (hereinafter HD) 23 that records the various information read by the auxiliary storage unit 22, a RAM 24 that stores various data temporarily generated when the computer program 3 recorded on the HD 23 is executed, a frame memory 25 composed of nonvolatile memory, and a communication interface 26 used for communication with the imaging device 1.
- By executing the computer program 3, the vehicle-mounted computer operates as the image processing apparatus 2 of the present invention.
- In the image processing apparatus 2, the image data output from the imaging device 1 is received by the communication interface 26 and recorded in the frame memory 25; the image data recorded in the frame memory 25 is read out, various image processing is performed on it, and various data for controlling the imaging device 1 based on the results of the image processing is output from the communication interface 26 to the imaging device 1.
- The various image processing performed on the received image data includes processes relating to detection of the detection target, such as detecting the contour of the driver's face, which is the detection target in the image data (the imaging target at the time of imaging), and identifying the positions of the eyes and nose.
- One example of such processing is a process that integrates the luminance of pixels in the vertical direction of the image and compares the integrated value with a predetermined threshold value to detect the horizontal range of the face. The process may further differentiate the horizontal change of the integrated value to identify positions where the change is large, thereby detecting the boundary between the face contour and the background, where the brightness changes greatly. Another example of such processing is detecting the eye positions by pattern matching.
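The vertical-integration and differentiation steps just described can be sketched as below. The function names and the toy frame are illustrative assumptions; the patent only describes the technique in prose.

```python
def column_profile(image):
    """Integrate luminance down each column (the vertical direction)."""
    return [sum(col) for col in zip(*image)]

def contour_range(image, threshold):
    """Columns whose integrated luminance exceeds the threshold are
    treated as belonging to the brighter, illuminated face region;
    return the (leftmost, rightmost) such columns."""
    profile = column_profile(image)
    cols = [i for i, v in enumerate(profile) if v > threshold]
    return (min(cols), max(cols)) if cols else None

def boundary_position(image):
    """Differentiate the profile horizontally; the largest change
    marks a boundary between background and face contour."""
    profile = column_profile(image)
    deltas = [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]
    return max(range(len(deltas)), key=lambda i: abs(deltas[i]))

# Toy frame: dark background columns flank a bright face region.
frame = [
    [10, 10, 200, 210, 10],
    [10, 10, 220, 230, 10],
]
print(column_profile(frame))            # -> [20, 20, 420, 440, 20]
print(contour_range(frame, threshold=100))  # -> (2, 3)
```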
- The details of such processing are described in documents such as Japanese Patent Application Laid-Open Nos. 2000-163564, 2004-234494, and 2004-234367, filed by the applicant of the present application. Note that the image processing is not limited to the processing described in these documents and can be selected appropriately according to the application, the hardware configuration, conditions such as cooperation with other application programs, and the like.
- The image processing apparatus 2 extracts image data (step S1), and the extracted image data is subjected to detection processing relating to the imaging target, such as detecting the contour of the driver's face, which is the imaging target, and identifying the positions of the eyes and nostrils (step S2).
- The detection processing in steps S1 to S2 is performed using the various types of processing described in the documents cited above.
- Next, under the control of the CPU 21, it is determined whether or not the imaging target could be detected by the processing in steps S1 to S2 (step S3). If it is determined that the imaging target could be detected (step S3: YES), it is determined that the imaging target exists (step S4), the process returns to step S1, and the subsequent processing is repeated for the next image data.
- The detection in step S3 is detection of the presence or absence of the imaging target, and the determination criterion can be set appropriately. For example, when detecting the contour, eyes, and nostrils of the driver's face, which is the imaging target, in the processing of steps S1 to S2, it may be determined that the imaging target was detected only when the contour, both eyes, and both nostrils are all detected.
- The detection status of the imaging target used for the determination in step S3 is obtained by reading the results of the detection processing in steps S1 to S2 recorded in the HD 23 or the RAM 24.
- If it is determined in step S3 that the imaging target could not be detected (step S3: NO), the image processing apparatus 2 determines, under the control of the CPU 21, whether or not the state in which the imaging target cannot be detected has continued for n frames or more, where n (a natural number) is preset as a first continuation reference value (step S5).
- The first continuation reference value is recorded in the HD 23 or the RAM 24.
- If it is determined in step S5 that the state in which the imaging target cannot be detected has continued for fewer than n frames, the first continuation reference value (step S5: NO), the image processing apparatus 2 determines, under the control of the CPU 21, whether or not the state in which the imaging target cannot be detected has continued for m frames or more, where m (a natural number smaller than n) is preset as a second continuation reference value smaller than the first continuation reference value (step S6).
- The second continuation reference value is recorded in the HD 23 or the RAM 24.
- If it is determined in step S6 that the state in which the imaging target cannot be detected has continued for fewer than m frames, the second continuation reference value (step S6: NO), the image processing apparatus 2 proceeds, under the control of the CPU 21, to step S4 and determines that the imaging target exists (step S4). This is because the state in which the imaging target cannot be detected has continued only briefly, so determining that there is no imaging target would carry a high risk of erroneous recognition.
- If it is determined in step S6 that the frames in which the imaging target cannot be detected have continued for the second continuation reference value of m frames or more (step S6: YES), the image processing apparatus 2 determines, under the control of the CPU 21, whether or not the absolute luminance value of the image data is less than a preset luminance reference value (step S7).
- The luminance reference value is recorded in the HD 23 or the RAM 24.
- The absolute luminance value is calculated by the image processing apparatus 2 from the image data and from the imaging condition data and irradiation condition data output from the imaging device 1 together with the image data, for example as the product of the average luminance value of the pixels constituting the image, the shutter speed, the gain value, and the irradiation width.
- The imaging conditions and the irradiation conditions may be the values output to the imaging device 1 as control data, rather than values input from the imaging device 1.
- In the imaging device 1, the gain value is adjusted so that the brightness of the face to be imaged is constant; the determination is therefore performed using the absolute luminance value instead of the raw luminance value.
- Absolute luminance value = average luminance value × shutter speed × gain value × irradiation width … Formula 1
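Assuming the product form stated in the text (the exact coefficients of Formula 1 are not recoverable from this copy of the document), the calculation can be sketched as:

```python
def absolute_luminance(avg_luminance, shutter_speed, gain, irradiation_width):
    """Formula 1 as described above: the product of the average pixel
    luminance and the imaging/irradiation conditions, so that the value
    remains comparable across the camera's automatic adjustments."""
    return avg_luminance * shutter_speed * gain * irradiation_width

print(absolute_luminance(128.0, 0.5, 2.0, 1.0))  # -> 128.0
```

Because the gain and shutter speed are folded into the value, halving the exposure while doubling the gain leaves the absolute luminance unchanged, which is the point of using it for the step S7 comparison.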
- When it is determined in step S7 that the absolute luminance value of the image data is equal to or greater than the luminance reference value (step S7: NO), the image processing apparatus 2 proceeds, under the control of the CPU 21, to step S4 and determines that the imaging target exists (step S4).
- Since the imaging target is irradiated with light, the absolute luminance value increases the closer the target is to the imaging unit 11. Therefore, when the absolute luminance value is equal to or greater than the luminance reference value, it is determined that the imaging target exists at a position near the imaging unit 11.
- When it is determined in step S5 that the state in which the imaging target cannot be detected has continued for the first continuation reference value of n frames or more (step S5: YES), or when it is determined in step S7 that the absolute luminance value of the image data is less than the luminance reference value (step S7: YES), the image processing apparatus 2, under the control of the CPU 21, interrupts the light irradiation by the irradiation unit 12 included in the imaging device 1 (step S8) and acquires image data captured in the state in which light is not irradiated (step S9).
- In the normal state in which imaging by the imaging unit 11 is being performed, the irradiation unit 12 irradiates light in the imaging direction; in step S8, control data for interrupting the irradiation by the irradiation unit 12 is output from the image processing apparatus 2 to the imaging device 1, so that the irradiation by the irradiation unit 12 is temporarily interrupted.
- The image processing apparatus 2 then compares, under the control of the CPU 21, the average luminance value of the image captured while light was irradiated with the average luminance value of the image captured while light was not irradiated, and determines whether or not the average luminance value when light was irradiated exceeds the average luminance value when light was not irradiated by a preset luminance difference reference value or more (step S10). The luminance difference reference value is recorded in the HD 23 or the RAM 24.
- When it is determined that the average luminance value when light was irradiated exceeds the average luminance value when light was not irradiated by the luminance difference reference value or more (step S10: YES), the image processing apparatus 2 proceeds, under the control of the CPU 21, to step S4 and determines that the imaging target exists (step S4).
- When it is determined in step S10 that the difference between the average luminance value when light was irradiated and the average luminance value when light was not irradiated is less than the luminance difference reference value (step S10: NO), the image processing apparatus 2 determines, under the control of the CPU 21, that there is no imaging target (step S11), returns to step S1, and repeats the subsequent processing for the next image data. Various processing, such as driving support processing for the case where it is determined that there is no imaging target, is then executed.
- The average luminance value used in the comparison in step S10 may be not the average luminance value of the entire image data but the average luminance value of the central portion of the image, excluding the peripheral portion of the image indicated by the image data.
- This is because, in the central portion of the image, the difference between the case where the imaging target is present and the case where it is absent is large, which greatly improves the determination accuracy. When it is determined in step S3 that the imaging target can be detected, or when there is otherwise a high possibility that the imaging target is present, the processing shown in steps S8 to S11, such as the comparison of average luminance values and the determination that there is no imaging target, is avoided. This prevents errors in the processing result of step S2 and an increase in processing load caused by interrupting the light irradiation.
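The decision flow of steps S3 to S11 described above can be summarized in a single sketch. All parameter names and the example values are illustrative assumptions; only the branch structure follows the flowchart described in the text.

```python
def decide_presence(detected, undetected_run, n, m,
                    abs_lum, lum_ref, lit_avg, unlit_avg, diff_ref):
    """Illustrative sketch of the S3-S11 decision flow:
    detected       -- result of the detection processing (S1-S2)
    undetected_run -- consecutive frames without a detection
    n, m           -- first / second continuation reference values (n > m)
    """
    if detected:                       # S3: YES -> target present (S4)
        return True
    if undetected_run < n:             # S5: NO branch
        if undetected_run < m:         # S6: run too short to trust
            return True                # -> assume present (S4)
        if abs_lum >= lum_ref:         # S7: bright image -> target near camera
            return True                # -> present (S4)
    # S8-S10: interrupt irradiation, capture an unlit frame, and compare
    # average luminance; a small difference means no target (S11).
    return (lit_avg - unlit_avg) >= diff_ref

# Short undetected run: presence is still assumed (S6: NO branch).
print(decide_presence(False, 3, 10, 5, 0.0, 80.0, 0.0, 0.0, 20.0))  # -> True
# Long run and no lit/unlit difference: no target (S11).
print(decide_presence(False, 12, 10, 5, 0.0, 80.0, 65.0, 60.0, 20.0))  # -> False
```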
- In the embodiment described above, a mode in which the driver of a vehicle is the imaging target has been shown.
- However, the present invention is not limited to this; various humans, and other kinds of living or non-living subjects, may be the imaging target.
- For example, the invention can be deployed in various forms, such as applying it to a monitoring system in which security personnel performing surveillance work at a fixed position are the imaging target and warning processing is performed when it is determined that no security guard is present.
- The present invention is not limited to this; the imaging device may perform all or part of the processing of the image processing apparatus described above.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2005/002633 WO2006087812A1 (ja) | 2005-02-18 | 2005-02-18 | 画像処理方法、画像処理システム、画像処理装置及びコンピュータプログラム |
JP2007503550A JP4397415B2 (ja) | 2005-02-18 | 2005-02-18 | 画像処理方法、画像処理システム、画像処理装置及びコンピュータプログラム |
CNA2005800484808A CN101124610A (zh) | 2005-02-18 | 2005-02-18 | 图像处理方法、图像处理系统、图像处理装置及计算机程序 |
US11/892,140 US8077916B2 (en) | 2005-02-18 | 2007-08-20 | Image processing system that detects an object by comparison of a difference between luminance of an illuminated and non-illuminated image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2005/002633 WO2006087812A1 (ja) | 2005-02-18 | 2005-02-18 | 画像処理方法、画像処理システム、画像処理装置及びコンピュータプログラム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/892,140 Continuation US8077916B2 (en) | 2005-02-18 | 2007-08-20 | Image processing system that detects an object by comparison of a difference between luminance of an illuminated and non-illuminated image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006087812A1 true WO2006087812A1 (ja) | 2006-08-24 |
Family
ID=36916228
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/002633 WO2006087812A1 (ja) | 2005-02-18 | 2005-02-18 | 画像処理方法、画像処理システム、画像処理装置及びコンピュータプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US8077916B2 (ja) |
JP (1) | JP4397415B2 (ja) |
CN (1) | CN101124610A (ja) |
WO (1) | WO2006087812A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016196233A (ja) * | 2015-04-03 | 2016-11-24 | クラリオン株式会社 | 車両用道路標識認識装置 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006259900A (ja) * | 2005-03-15 | 2006-09-28 | Omron Corp | 画像処理システム、画像処理装置および方法、記録媒体、並びにプログラム |
JP4888838B2 (ja) * | 2008-05-12 | 2012-02-29 | トヨタ自動車株式会社 | 運転者撮像装置および運転者撮像方法 |
US8391554B2 (en) * | 2008-10-01 | 2013-03-05 | GM Global Technology Operations LLC | Eye detection system using a single camera |
KR20110006112A (ko) * | 2009-07-13 | 2011-01-20 | 삼성전자주식회사 | 카메라 시스템에서 디스플레이 패널의 백라이트를 제어하는 장치 및 그 방법 |
JP6737213B2 (ja) * | 2017-03-14 | 2020-08-05 | オムロン株式会社 | 運転者状態推定装置、及び運転者状態推定方法 |
KR102476757B1 (ko) * | 2017-12-21 | 2022-12-09 | 삼성전자주식회사 | 반사를 검출하는 장치 및 방법 |
EP3941176A4 (en) * | 2019-03-14 | 2022-04-06 | FUJI Corporation | OBJECT DETERMINATION METHOD AND OBJECT DETERMINATION DEVICE |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000172961A (ja) * | 1998-12-08 | 2000-06-23 | Mitsubishi Electric Corp | Monitoring and warning device |
JP2002083287A (ja) * | 2000-06-29 | 2002-03-22 | Trw Inc | Human presence detection optimized by removal of background interference |
JP2002298232A (ja) * | 2001-03-29 | 2002-10-11 | Mitsubishi Electric Corp | Human body detection device, human body detection method, and obstacle detection method |
JP2003296721A (ja) * | 2003-02-10 | 2003-10-17 | Toshiba Corp | Image extraction device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000163564A (ja) | 1998-12-01 | 2000-06-16 | Fujitsu Ltd | Eye tracking device and blink detection device |
JP4204336B2 (ja) | 2003-01-30 | 2009-01-07 | Fujitsu Ltd | Face orientation detection device, face orientation detection method, and computer program |
JP4162503B2 (ja) | 2003-01-31 | 2008-10-08 | Fujitsu Ltd | Eye state determination device, eye state determination method, and computer program |
2005
- 2005-02-18 CN CNA2005800484808A patent/CN101124610A/zh active Pending
- 2005-02-18 JP JP2007503550A patent/JP4397415B2/ja not_active Expired - Fee Related
- 2005-02-18 WO PCT/JP2005/002633 patent/WO2006087812A1/ja not_active Application Discontinuation
2007
- 2007-08-20 US US11/892,140 patent/US8077916B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
JPWO2006087812A1 (ja) | 2008-07-03 |
US20070291989A1 (en) | 2007-12-20 |
JP4397415B2 (ja) | 2010-01-13 |
US8077916B2 (en) | 2011-12-13 |
CN101124610A (zh) | 2008-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006087812A1 (ja) | Image processing method, image processing system, image processing apparatus, and computer program | |
JP4592744B2 (ja) | Image processing method, image processing system, image processing apparatus, and computer program | |
US20190204448A1 (en) | Imaging device and electronic device | |
KR101774692B1 (ko) | Airbag control apparatus and method | |
EP2743143B1 (en) | Vehicle occupant detection device | |
JP4364275B2 (ja) | Image processing method, image processing apparatus, and computer program | |
EP1703480B1 (en) | System and method to determine awareness | |
EP2074550A2 (en) | Eye opening detection system and method of detecting eye opening | |
WO2013157466A1 (ja) | Smoking detection device, method, and program | |
US7370975B2 (en) | Image processing apparatus | |
WO2006087790A1 (ja) | Image processing method, image processing system, imaging apparatus, image processing apparatus, and computer program | |
JP5034623B2 (ja) | Image processing method, image processing apparatus, image processing system, and computer program | |
JP2005157648A (ja) | Driver recognition device | |
EP2378465A1 (en) | Driver assisting system and method for a motor vehicle | |
JP4457077B2 (ja) | Obstacle detection system and obstacle detection method | |
US11983941B2 (en) | Driver monitor | |
JP2008028478A (ja) | Obstacle detection system and obstacle detection method | |
JP6945775B2 (ja) | In-vehicle image processing apparatus and in-vehicle image processing method | |
JP4692447B2 (ja) | Drowsiness detection device and drowsiness detection method | |
JP2007083922A (ja) | Collision avoidance support system, collision avoidance support method, and computer program | |
JP5012690B2 (ja) | Face covering and method for identifying facial parts | |
JP2010108167A (ja) | Face recognition device | |
WO2023032029A1 (ja) | Occlusion determination device, occupant monitoring device, and occlusion determination method | |
JP2019074964A (ja) | Driving-incapacity state prediction device and driving-incapacity state prediction system |
WO2022024411A1 (ja) | Infrared imaging device and infrared imaging system |
Legal Events
Date | Code | Title | Description
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2007503550; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 200580048480.8; Country of ref document: CN. Ref document number: 11892140; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWP | Wipo information: published in national office | Ref document number: 11892140; Country of ref document: US |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 05719302; Country of ref document: EP; Kind code of ref document: A1 |
| WWW | Wipo information: withdrawn in national office | Ref document number: 5719302; Country of ref document: EP |