CN105122784A - Object recognition in low-lux and high-lux conditions - Google Patents

Object recognition in low-lux and high-lux conditions

Info

Publication number
CN105122784A
Authority
CN
China
Prior art keywords
illumination level
sensor
threshold
image data
high-lux
Prior art date
Legal status
Granted
Application number
CN201380067876.1A
Other languages
Chinese (zh)
Other versions
CN105122784B (en)
Inventor
J. T. King
T. El Dokor
P. Vaghefinazari
S. M. Yamamoto
R. W. Huang
Current Assignee
Honda Motor Co Ltd
Edge 3 Technologies LLC
Original Assignee
Honda Motor Co Ltd
Edge 3 Technologies LLC
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd, Edge 3 Technologies LLC
Publication of CN105122784A
Application granted
Publication of CN105122784B
Active
Anticipated expiration

Classifications

    • G06V 20/59 — Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V 10/147 — Details of sensors, e.g. sensor lenses
    • G06V 20/597 — Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V 40/20 — Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • H04N 23/20 — Cameras or camera modules comprising electronic image sensors, for generating image signals from infrared radiation only
    • H04N 23/61 — Control of cameras or camera modules based on recognised objects
    • H04N 23/71 — Circuitry for evaluating the brightness variation
    • H04N 5/33 — Transforming infrared radiation
    • B60R 2300/105 — Viewing arrangements using cameras and displays in a vehicle, using multiple cameras
    • B60R 2300/106 — Viewing arrangements using cameras and displays in a vehicle, using night vision cameras
    • B60R 2300/8006 — Viewing arrangements for monitoring and displaying scenes of the vehicle interior, e.g. for monitoring passengers or cargo

Abstract

The invention relates to object recognition in low-lux and high-lux conditions. A system for capturing image data for gestures from a passenger or a driver in a vehicle with a dynamic illumination level comprises a low-lux sensor equipped to capture image data in an environment with an illumination level below an illumination threshold, a high-lux sensor equipped to capture image data in the environment with the illumination level above the illumination threshold, and an object recognition module for activating the sensors. The object recognition module determines the illumination level of the environment and activates the low-lux sensor if the illumination level is below the illumination threshold. If the illumination level is above the threshold, the object recognition module activates the high-lux sensor.

Description

Object recognition in low-lux and high-lux conditions
Technical field
The present invention relates to object recognition, and more particularly to object recognition in a vehicle under low-lux and high-lux conditions.
Background
Image sensors are used to capture image data of objects in a variety of environments, and the image sensor or its auxiliary equipment is typically tuned to the environment being monitored. For example, in an environment with consistently low illumination, an image sensor may be paired with an illumination source that provides a fixed amount of light. Such tuning works well in static environments, but it cannot cope with the problems that arise in dynamic ones. For example, when the illumination level inside a vehicle drops because the vehicle crosses a dark area or is driven at night, an image sensor tuned for a specific illumination level may fail to monitor objects in the vehicle effectively.
Summary of the invention
Embodiments of the present invention capture image data representing gestures from an occupant or driver in a vehicle with a dynamic illumination level. The disclosed system comprises: a low-lux sensor configured to capture image data in an environment with an illumination level below an illumination threshold; a high-lux sensor configured to capture image data in an environment with an illumination level above the illumination threshold; and an object recognition module for activating and deactivating the sensors. The low-lux and high-lux sensors are located in the overhead console of the vehicle.
The object recognition module determines the illumination level of the environment and determines whether it is below the illumination threshold. If the illumination level is below the threshold, the object recognition module activates the low-lux sensor. In one embodiment, the object recognition module also activates an illumination source along with the low-lux sensor. The illumination source lights the environment of the low-lux sensor, allowing it to capture image data at low illumination levels. If the illumination level is above the threshold, the object recognition module activates the high-lux sensor. In one embodiment, the high-lux sensor includes an infrared filter to reduce the amount of infrared light reaching it.
Other embodiments of the invention include a computer-readable medium storing instructions that implement the functions of the system described above, and a computer-implemented method comprising steps that implement those functions.
Brief description of the drawings
Fig. 1 is a block diagram illustrating, according to one embodiment, a computing environment for capturing image data in an environment with a dynamic illumination level.
Fig. 2 is a block diagram illustrating, according to one embodiment, the object recognition module in a computing environment for capturing image data in an environment with a dynamic illumination level.
Fig. 3 is a flow chart illustrating, according to one embodiment, a method for capturing image data in an environment with a dynamic illumination level.
Detailed description
The computing environment described herein captures image data representing gestures from a driver or occupant in a vehicle under low-lux and high-lux conditions. The accompanying drawings and the description below describe certain embodiments by way of example only. Those skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods described herein may be employed without departing from the principles described here. Reference is made in detail to several embodiments, examples of which are illustrated in the accompanying figures. Where practical, similar or identical reference numerals are used in the figures to indicate similar or identical functionality.
System environment
Referring to Fig. 1, the computing environment 100 for capturing image data in an environment with a dynamic illumination level comprises: an object recognition module 102, a pair of low-lux image sensors 104a-b (for example, infrared sensors), a pair of high-lux image sensors 106a-b (for example, RGB sensors), an illumination level meter 110, and a communication module 112. Although the computing environment 100 shown includes two low-lux image sensors 104a-b and two high-lux image sensors 106a-b, other embodiments of the computing environment 100 may include one or more of each. In one embodiment, the computing environment 100 resides in a vehicle or mobile cabin; it may also be located in other environments with dynamic illumination levels.
The illumination level meter 110 is a device for measuring the amount of light present in the environment 100. In one embodiment, the illumination level meter 110 includes a photovoltaic sensor that converts light into a measurable electrical quantity. The meter measures the electricity generated and determines the amount of light, i.e. the illumination level, of the environment from that measurement. In one embodiment, the illumination level meter applies the principles of a light meter, such as a reflected-light meter or an incident-light meter, to measure the illumination level in the environment 100.
The object recognition module 102 receives the measured illumination level from the illumination level meter 110, determines whether the illumination level exceeds the illumination threshold, and activates either the high-lux image sensors 106a-b or the low-lux image sensors 104a-b based on that determination. In one embodiment, the object recognition module 102 activates the low-lux image sensors 104a-b in response to determining that the illumination level is below the illumination threshold, and activates the high-lux image sensors 106a-b in response to determining that the illumination level is equal to or above the threshold. In another embodiment, the object recognition module 102 ensures that the low-lux sensors 104a-b and the high-lux sensors 106a-b are never active at the same time; accordingly, in response to activating the low-lux sensors 104a-b, it deactivates any active high-lux sensors 106a-b, and vice versa. The object recognition module 102 then receives image data from the activated sensors 104, 106, processes the received image data, and recognizes objects in the images. In one embodiment, the recognized object is a person performing a particular gesture, and the object recognition module 102 performs a function based on the recognized object, such as communicating with the user through the communication module 112.
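As a concrete illustration of this selection logic, the following Python fragment is a minimal, hypothetical sketch; the Sensor class, the function names, and the 325 cd/m^2 default are assumptions made for illustration, since the patent does not prescribe an implementation:

from dataclasses import dataclass

@dataclass
class Sensor:
    # Stand-in for one camera; real hardware would expose similar controls.
    name: str
    active: bool = False

    def activate(self):
        self.active = True

    def deactivate(self):
        self.active = False

ILLUMINATION_THRESHOLD = 325.0  # cd/m^2, inside the overlap of both luminance ranges

def select_sensors(lux, low_sensors, high_sensors, illumination_source):
    # Activate the sensor pair matching the measured illumination level and
    # ensure the other pair (and the illumination source, if unneeded) is inactive.
    if lux < ILLUMINATION_THRESHOLD:
        for s in high_sensors:
            s.deactivate()
        illumination_source.activate()  # light the low-lux field of view
        for s in low_sensors:
            s.activate()
        return low_sensors
    for s in low_sensors:
        s.deactivate()
    illumination_source.deactivate()
    for s in high_sensors:
        s.activate()
    return high_sensors

low = [Sensor("ir-a"), Sensor("ir-b")]
high = [Sensor("rgb-a"), Sensor("rgb-b")]
nir = Sensor("nir-led")
print([s.name for s in select_sensors(40.0, low, high, nir)])   # ['ir-a', 'ir-b']
print([s.name for s in select_sensors(600.0, low, high, nir)])  # ['rgb-a', 'rgb-b']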
Selective activation of the low-lux sensors 104a-b and high-lux sensors 106a-b by the object recognition module 102 advantageously ensures that the module captures data both under low-lux conditions (for example, when the environment 100 contains little light) and under high-lux conditions (for example, when the environment 100 contains adequate or abundant light). The object recognition module 102 can therefore advantageously capture and process image data in environments with dynamic illumination levels.
The communication module 112 provides the interface between the user and the object recognition module 102. It therefore comprises an output device, and optionally an input device, for communicating with the user. Examples of output devices include a touch screen for visual communication and an audio device for voice communication. Examples of input devices include a touch screen, a mouse, and a keypad.
The high-lux sensors 106a-b are image sensors that capture image data at illumination levels equal to or above the illumination threshold. Examples of such high-lux sensors 106a-b include charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensors with suitable light sensitivity, i.e. a suitable luminance range, for capturing image data at illumination levels at or above the threshold. The luminance range of an image sensor 106a is the range of scene luminance over which the sensor 106a-b is equipped to capture image data. For example, the luminance range of the high-lux sensors 106a-b may be 300-1000 candelas per square metre, or 1000-12000 lux.
In one embodiment, the high-lux sensors 106a-b are color image sensors that capture image data from which a color image of the environment 100 can be reconstructed. These color high-lux sensors 106a-b may optionally be used together with an infrared-blocking filter to reduce or minimize any distortion caused by light with infrared wavelengths. In one embodiment, the two high-lux sensors 106a-b are placed an interpupillary distance apart (i.e. roughly the distance between human eyes) and capture three-dimensional images for stereo image processing.
The low-lux sensors 104a-b are image sensors capable of capturing image data at illumination levels below the illumination threshold. Examples of such low-lux sensors 104a-b include CCD or CMOS sensors with a luminance range suitable for capturing image data below the threshold. In one embodiment, the low-lux image sensors have a luminance range of 25-350 candelas per square metre, or 0-80 lux. In one embodiment, the two low-lux sensors 104a-b are placed an interpupillary distance apart and capture three-dimensional images for stereo image processing. In one embodiment, the luminance range of the low-lux sensors 104a-b differs from that of the high-lux sensors 106a-b; the different ranges let the low-lux sensors capture image data better at lower illumination levels and let the high-lux sensors capture image data better at higher ones. In another embodiment, the luminance ranges of the low-lux image sensors 104a-b and high-lux image sensors 106a-b coincide.
In one embodiment, the low-lux image sensors 104a-b have associated illumination sources 108a-b that light the field of view of the low-lux image sensors 104a-b. Examples of image sensors with associated illumination sources include the DEFENDER SPARTAN5 night vision camera and the CCTV EX11DXL dual-sensor color night vision camera. In one embodiment, described below, the illumination sources 108a-b emit light whose spectrum comprises a single wavelength or a narrow band of similar wavelengths. In this embodiment, the low-lux sensors 104a-b are monochrome image sensors that efficiently convert received light of wavelengths close to those emitted by the illumination sources 108a-b. Such monochrome low-lux sensors 104a-b are advantageously good at capturing image data at low illumination levels.
In one embodiment, the illumination sources 108a-b emit infrared light to illuminate the scene, and the low-lux sensors 104a-b in this embodiment do not include an infrared filter that would reduce or block infrared light from reaching them. Because the illumination sources 108a-b illuminate the scene with infrared light, the absence of an infrared filter advantageously lets the low-lux sensors 104a-b capture image data while the environment 100 is lit by infrared light.
In one embodiment, the illumination sources 108a-b are housed in or attached to the low-lux image sensors 104a-b. In other embodiments of the environment 100, the illumination sources 108a-b are physically separate from the low-lux image sensors 104a-b.
In one embodiment, the illumination sources 108a-b are near-infrared (NIR) light-emitting diodes (LEDs) that emit near-infrared light to illuminate the field of view of the sensors 104a-b. Because near-infrared light is invisible to humans, NIR light advantageously illuminates the field of view without disturbing people in the environment 100. When the illuminated environment 100 is inside a vehicle, lighting the environment 100 without distracting the driver is desirable. In another embodiment, the illumination sources 108a-b emit light other than NIR light to illuminate the field of view. Additionally, in one embodiment, the illumination sources 108a-b emit light whose spectrum comprises a single wavelength or a narrow band of similar wavelengths, because doing so advantageously reduces the chromatic aberration in images produced from data captured by the low-lux image sensors 104a-b in a field of view lit by such illumination.
In another embodiment, the illumination sources 108a-b emit light in a band around the peak of the response curve of the low-lux image sensors 104a-b. The response curve of an image sensor represents the sensor's efficiency, i.e. the electric current produced per amount of light received at each wavelength. The illumination sources 108a-b therefore emit light in a band of wavelengths at which the sensors 104a-b produce a large current per amount of light received. For example, the illumination sources 108a-b emit light with wavelengths between 750 nanometers (nm) and 900 nm, because the low-lux sensors 104a-b produce 0.2-0.275 amperes per watt (A/W) for light received at these wavelengths and generally produce less current for light at other wavelengths.
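To make this band selection concrete, the toy Python sketch below picks the emitter band from a responsivity curve; the sampled curve and the 70% cutoff are assumptions, chosen so the result matches the 750-900 nm example above:

RESPONSIVITY = {  # wavelength (nm) -> responsivity (A/W); hypothetical samples
    550: 0.10, 650: 0.16, 750: 0.20, 800: 0.26, 850: 0.275, 900: 0.21, 1000: 0.08,
}

def emitter_band(responsivity, cutoff=0.7):
    # Keep the wavelengths whose responsivity is within `cutoff` of the peak;
    # the illumination source should emit inside this band.
    peak = max(responsivity.values())
    band = [wl for wl, r in responsivity.items() if r >= cutoff * peak]
    return min(band), max(band)

print(emitter_band(RESPONSIVITY))  # (750, 900) for this sample curve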
The illumination sources 108a-b, low-lux sensors 104a-b, and high-lux sensors 106a-b are optionally placed physically higher than the region of the environment 100 being monitored. Sensors 104a-b, 106a-b placed at a higher position advantageously get a largely unobstructed view of the region to be monitored. For example, when the environment 100 is inside a vehicle and the monitored region is the area around the driver or an occupant, the sensors 104a-b, 106a-b and illumination sources 108a-b are located in the overhead console above the center console between the driver's seat and the passenger's seat. Besides providing a largely unobstructed view, this position in the vehicle reduces any harmful effect of direct light entering the vehicle through the windshield and striking the sensors 104a-b, 106a-b.
Additionally, in one embodiment, the illumination sources 108a-b are adjacent to the sensors 104a-b and illuminate the region in front of the sensors 104a-b. Because the illumination sources 108a-b are adjacent to, rather than facing, the sensors, direct incident light from the illumination sources 108a-b onto the sensors 104a-b is advantageously minimized.
Object recognition module
Fig. 2 is a block diagram illustrating, according to one embodiment, the object recognition module in a computing environment for capturing image data in an environment with a dynamic illumination level. The object recognition module 102 comprises: an illumination level module 202, an illumination threshold module 204, a capture source module 206, a radiation source module 208, an image data memory 210, and an image processing module 212.
The illumination level module 202 is communicatively coupled to the illumination level meter 110 and determines the illumination level measured by the meter 110. In one embodiment, the illumination level module 202 repeatedly polls the illumination level meter 110 for the measured illumination level. In another embodiment, the illumination level meter 110 is configured to repeatedly transmit the measured illumination level, and the illumination level module 202 receives the transmitted values. In yet another embodiment, the illumination level module 202 receives a threshold number of illumination level measurements, or receives measurements over a threshold amount of time, and determines the illumination level from the received readings, for example by taking the mean of the received readings as the illumination level. Basing the determination on multiple measurements advantageously lets the object recognition module 102 account for anomalous readings, which may be caused by a temporary malfunction of the meter 110 or a temporary obstruction in front of it.
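A minimal sketch of such a multi-reading determination follows; the median-based outlier rule and the 50% tolerance are assumptions, since the patent only requires that the level be derived from several readings, e.g. by averaging:

from statistics import mean, median

def determine_illumination(readings):
    # Estimate the illumination level from several meter readings, discarding
    # anomalies such as a reading taken while the meter was briefly obstructed.
    med = median(readings)
    plausible = [r for r in readings if abs(r - med) <= 0.5 * med] or readings
    return mean(plausible)

print(determine_illumination([410.0, 395.0, 12.0, 405.0]))  # ~403.3; the 12.0 blip is dropped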
The illumination threshold module 204 receives the determined illumination level from the illumination level module 202 and determines whether it exceeds the illumination threshold. In one embodiment, the illumination threshold is configurable, and a user or a module supplies a suitable threshold to the illumination threshold module 204 through a user interface. In one embodiment, the illumination threshold is based on the luminance ranges of the low-lux image sensors 104a-b or high-lux image sensors 106a-b. For example, the threshold is set to a value in the overlap of the luminance ranges of the low-lux sensors 104a-b and the high-lux sensors 106. So if the luminance range of the low-lux sensors 104a-b is 25-350 candelas per square metre and that of the high-lux sensors 106a-b is 300-1000 candelas per square metre, the illumination threshold may be set to 325 candelas per square metre. Such a threshold advantageously guarantees that the current illumination level falls within the luminance range of whichever sensor 104, 106 is activated.
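For example, the threshold could be derived from the two luminance ranges as follows (a sketch; taking the midpoint of the overlap is one plausible rule, as the patent only requires a value inside the overlap):

def threshold_from_overlap(low_range, high_range):
    # Threshold = midpoint of the overlap between the two luminance ranges (cd/m^2).
    lo = max(low_range[0], high_range[0])
    hi = min(low_range[1], high_range[1])
    if lo > hi:
        raise ValueError("luminance ranges do not overlap")
    return (lo + hi) / 2

print(threshold_from_overlap((25, 350), (300, 1000)))  # 325.0, as in the example above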
The capture source module 206 activates the low-lux sensors 104a-b or the high-lux sensors 106a-b based on the determination made by the illumination threshold module 204. If the illumination threshold module 204 determines that the illumination level is equal to or above the illumination threshold, the capture source module 206 activates the high-lux sensors 106a-b; otherwise, it activates the low-lux sensors 104a-b. In one embodiment, when the capture source module 206 activates the high-lux sensors 106a-b, it deactivates the low-lux sensors 104a-b, and vice versa. Such deactivation advantageously ensures that image data is captured by the sensor suited to the current lux conditions of the environment 100.
The radiation source module 208 controls activation and deactivation of the illumination sources 108a-b. In one embodiment, the radiation source module 208 communicates with the capture source module 206, and activates or deactivates the illumination sources 108a-b in response to the capture source module 206 activating or deactivating the low-lux sensors 104a-b. In another embodiment, the radiation source module 208 activates and deactivates the illumination sources 108a-b based on the determination made by the illumination threshold module 204: if the illumination level is equal to or above the threshold, the radiation source module 208 deactivates all active illumination sources 108a-b; otherwise, it activates the illumination sources 108a-b. The activated illumination sources 108a-b advantageously light the field of view of the low-lux sensors 104a-b.
The image data memory 210 is volatile or non-volatile memory that receives and stores the image data captured by the image sensors 104a-b, 106a-b. In one embodiment, the image data memory 210 stores the image data from the low-lux sensors 104a-b and from the high-lux sensors 106a-b separately. Storing the data separately lets the image processing module 212 process image data from the two different sensor types 104a-b, 106a-b in different ways as needed.
The image processing module 212 processes the images stored in the image data memory 210, recognizes objects from the processing, and initiates a response based on the recognized objects. The image processing module 212 may implement techniques such as stereo image processing to process the data and recognize objects. One example of such a technique is described in the paper titled "Stereo Image Processing" (available at http://dsp-book.narod.ru/DSPMW/57.PDF), which is incorporated herein by reference in its entirety. Based on the recognized object, the image processing module 212 determines and initiates an appropriate response. For example, the image processing module 212 may process the stored image data and determine that it shows a person performing a particular gesture. In response to this determination, the image processing module 212 queries a database (not shown) to determine the user request associated with that gesture, and then directs an appropriate process to take the action that satisfies the request. For instance, if the request indicates that the user wants an audio player to play a specific music file, the image processing module 212 communicates with the music player and instructs it to play that file. Other examples of requests associated with various gestures include requests to turn lights in the car on or off, to start or stop an application in the car such as a global positioning system, and to engage or disengage a driving feature such as cruise control.
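The gesture-to-request lookup can be pictured as a small dispatch table; this is a hypothetical sketch, since the patent describes a database query, and the gesture names and actions here are invented for illustration:

GESTURE_ACTIONS = {  # hypothetical gesture -> in-car action
    "swipe_right": lambda: print("audio player: next track"),
    "palm_up": lambda: print("cabin lights: on"),
    "two_finger_hold": lambda: print("cruise control: engage"),
}

def respond_to_gesture(gesture):
    # Look up and run the action associated with a recognized gesture;
    # unrecognized gestures produce no response.
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action()

respond_to_gesture("swipe_right")  # prints "audio player: next track"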
Object recognition method
Fig. 3 is a flow chart illustrating, according to one embodiment, a method for capturing image data in an environment with a dynamic illumination level. The object recognition module 102 determines the illumination level in the environment 100 (step 302) and determines whether it is below the illumination threshold (step 304). If the illumination level is below the illumination threshold, the object recognition module 102 activates the low-lux sensors 104a-b (step 306) and optionally activates the illumination sources 108a-b. In one embodiment, the object recognition module 102 also deactivates any active high-lux sensors 106a-b along with the activation of the low-lux sensors 104a-b (step 306).
If the determined illumination level is not below the illumination threshold, the object recognition module 102 activates the high-lux sensors 106a-b (step 308). Additionally, in one embodiment, the object recognition module 102 deactivates any active low-lux sensors 104a-b and illumination sources 108a-b along with the activation of the high-lux sensors 106a-b. The object recognition module 102 then captures image data (step 310) and processes it (step 312). In one embodiment, as the illumination level repeatedly crosses above and below the illumination threshold, the object recognition module 102 repeatedly activates the high-lux sensors 106a-b and the low-lux sensors 104a-b in turn, obtaining data from the sensors 104a-b, 106a-b and processing the obtained data. In another embodiment, the high-lux sensors 106a-b and low-lux sensors 104a-b run continuously, and the object recognition module 102 selects which sensors to obtain data from based on the illumination level: if the illumination level is below the threshold, the module obtains and processes data from the low-lux sensors 104a-b; otherwise, it obtains and processes data from the high-lux sensors 106a-b. The recognition module 102 can therefore obtain and process data in situations where the illumination level changes, such as a vehicle passing through a tunnel, where the illumination level falls as the vehicle enters and rises as it exits. From the processing, the object recognition module 102 recognizes objects in the images and initiates appropriate responses (step 314).
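The tunnel scenario in the alternative embodiment, where both sensor pairs run continuously and the module merely selects which stream to process, can be sketched as follows (all names and the illumination trace are hypothetical):

def capture_loop(capture_low, capture_high, lux_readings, threshold=325.0):
    # Both sensor pairs run continuously; per reading, process the stream
    # suited to the current illumination level.
    return [capture_low() if lux < threshold else capture_high()
            for lux in lux_readings]

# Toy illumination trace (cd/m^2) for a car entering and leaving a tunnel.
tunnel = [600, 550, 200, 40, 35, 180, 500, 650]
frames = capture_loop(lambda: "low-lux frame", lambda: "high-lux frame", tunnel)
print(frames)  # switches to low-lux frames mid-tunnel and back afterwards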
The foregoing description of the embodiments of the invention is presented for purposes of illustration; it is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Those skilled in the art can appreciate from the above disclosure that many modifications and variations are possible.
Some portions of this description present embodiments of the invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the field. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, and the like. It has also proved convenient at times, without loss of generality, to refer to these arrangements of operations as modules. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combination thereof. One of ordinary skill in the art will understand that the hardware realizing the described modules includes at least one processor and a memory, the memory containing instructions to execute the described functionality of the modules.
Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a non-transitory computer-readable medium containing computer program code, which can be executed by a computer processor to perform any or all of the steps, operations, or processes described.
Embodiments of the invention also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer-readable storage medium, or in any type of medium suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in this specification may include a single processor or may be architectures employing multiple processors for increased computing capability.
Embodiments of the invention also relate to a product produced by the computing processes described herein. Such a product may comprise information resulting from a computing process, where the information is stored in a non-transitory, tangible computer-readable storage medium, and may include any embodiment of a computer program product or other data combination described herein.
Finally, the language used in this specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

Claims (30)

1. A computer-implemented method for capturing image data for gestures from an occupant or driver in a vehicle with a dynamic illumination level, the method comprising:
determining an illumination level for the vehicle, the vehicle comprising a low-lux sensor and a high-lux sensor located in an overhead console of the vehicle, the low-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels below a threshold illumination level, and the high-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels above the threshold illumination level;
determining, by an illumination threshold module, whether the determined illumination level is below the threshold illumination level;
processing image data captured by the low-lux sensor when the illumination level is below the threshold illumination level; and
processing image data captured by the high-lux sensor when the illumination level is above the threshold illumination level.
2. The computer-implemented method of claim 1, further comprising:
determining that the illumination level is below the threshold illumination level;
processing image data captured by the low-lux sensor;
determining that the illumination level is above the threshold illumination level; and
processing image data captured by the high-lux sensor.
3. The computer-implemented method of claim 1, further comprising:
activating an illumination source in response to a determination that the illumination level is below the threshold illumination level.
4. The computer-implemented method of claim 1, wherein the low-lux sensor and the high-lux sensor have different luminance ranges, and wherein the luminance range of a sensor is the range of scene luminance over which the sensor is equipped to capture image data.
5. The computer-implemented method of claim 4, wherein the luminance range of the low-lux sensor and the luminance range of the high-lux sensor overlap, and the threshold illumination level lies within the overlap of the luminance ranges.
6. The computer-implemented method of claim 1, wherein the high-lux sensor comprises an infrared filter to reduce the amount of infrared light reaching the high-lux sensor.
7. The computer-implemented method of claim 3, wherein the illumination source emits light with a spectrum in a range around the peak of the response curve of the low-lux sensor.
8. The computer-implemented method of claim 1, further comprising:
determining, in response to a determination that the illumination level is below the threshold illumination level, whether the high-lux sensor is active; and
deactivating the active high-lux sensor in response to a determination that the high-lux sensor is active.
9. The computer-implemented method of claim 1, wherein the vehicle comprises four sensors for capturing image data representing gestures from the driver or occupant, the four sensors located in the overhead console of the vehicle, and wherein the four sensors comprise two RGB sensors and two infrared sensors.
10. A computer program product for capturing image data for gestures from an occupant or driver in a vehicle with a dynamic illumination level, the computer program product comprising a non-transitory computer-readable storage medium containing computer program code for:
determining an illumination level for the vehicle, the vehicle comprising a low-lux sensor and a high-lux sensor located in an overhead console of the vehicle, the low-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels below a threshold illumination level, and the high-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels above the threshold illumination level;
determining, by an illumination threshold module, whether the determined illumination level is below the threshold illumination level;
processing image data captured by the low-lux sensor when the illumination level is below the threshold illumination level; and
processing image data captured by the high-lux sensor when the illumination level is above the threshold illumination level.
11. The computer program product of claim 10, further comprising computer program code for:
determining that the illumination level is below the threshold illumination level;
processing image data captured by the low-lux sensor;
determining that the illumination level is above the threshold illumination level; and
processing image data captured by the high-lux sensor.
12. The computer program product of claim 10, further comprising computer program code for:
activating an illumination source in response to a determination that the illumination level is below the threshold illumination level.
13. The computer program product of claim 10, wherein the low-lux sensor and the high-lux sensor have different luminance ranges, and wherein the luminance range of a sensor is the range of scene luminance over which the sensor is equipped to capture image data.
14. The computer program product of claim 13, wherein the luminance range of the low-lux sensor and the luminance range of the high-lux sensor overlap, and the threshold illumination level lies within the overlap of the luminance ranges.
15. The computer program product of claim 10, wherein the high-lux sensor comprises an infrared filter to reduce the amount of infrared light reaching the high-lux sensor.
16. The computer program product of claim 12, wherein the illumination source emits light with a spectrum in a range around the peak of the response curve of the low-lux sensor.
17. The computer program product of claim 10, further comprising computer program code for:
determining, in response to a determination that the illumination level is below the threshold illumination level, whether the high-lux sensor is active; and
deactivating the active high-lux sensor in response to a determination that the high-lux sensor is active.
18. The computer program product of claim 10, wherein the vehicle comprises four sensors for capturing image data representing gestures from the driver or occupant, the four sensors located in the overhead console of the vehicle, and wherein the four sensors comprise two RGB sensors and two infrared sensors.
19. A computer system for capturing image data for gestures from an occupant or driver in a vehicle with a dynamic illumination level, the computer system comprising a processor and a non-transitory computer-readable medium containing computer program code for:
determining an illumination level for the vehicle, the vehicle comprising a low-lux sensor and a high-lux sensor located in an overhead console of the vehicle, the low-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels below a threshold illumination level, and the high-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels above the threshold illumination level;
determining, by an illumination threshold module, whether the determined illumination level is below the threshold illumination level;
processing image data captured by the low-lux sensor when the illumination level is below the threshold illumination level; and
processing image data captured by the high-lux sensor when the illumination level is above the threshold illumination level.
20. The computer system of claim 19, further comprising computer program code for:
determining that the illumination level is below the threshold illumination level;
processing image data captured by the low-lux sensor;
determining that the illumination level is above the threshold illumination level; and
processing image data captured by the high-lux sensor.
21. The computer system of claim 19, further comprising computer program code for:
activating an illumination source in response to a determination that the illumination level is below the threshold illumination level.
22. The computer system of claim 19, wherein the low-lux sensor and the high-lux sensor have different luminance ranges, and wherein the luminance range of a sensor is the range of scene luminance over which the sensor is equipped to capture image data.
23. The computer system of claim 22, wherein the luminance range of the low-lux sensor and the luminance range of the high-lux sensor overlap, and the threshold illumination level lies within the overlap of the luminance ranges.
24. The computer system of claim 19, wherein the high-lux sensor comprises an infrared filter to reduce the amount of infrared light reaching the high-lux sensor.
25. The computer system of claim 21, wherein the illumination source emits light with a spectrum in a range around the peak of the response curve of the low-lux sensor.
26. The computer system of claim 19, wherein the vehicle comprises four sensors for capturing image data representing gestures from the driver or occupant, the four sensors located in the overhead console of the vehicle, and wherein the four sensors comprise two RGB sensors and two infrared sensors.
27. A computer system for capturing image data for gestures from an occupant or driver in a vehicle with a dynamic illumination level, the computer system comprising:
an illumination level module for determining an illumination level for the vehicle, the vehicle comprising a low-lux sensor and a high-lux sensor located in an overhead console of the vehicle, the low-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels below a threshold illumination level, and the high-lux sensor adapted to capture image data representing gestures from the driver or occupant in the vehicle at illumination levels above the threshold illumination level;
an illumination threshold module for determining whether the determined illumination level is below the threshold illumination level; and
a capture source module that activates the low-lux sensor when the illumination level is below the threshold illumination level, and activates the high-lux sensor when the illumination level is above the threshold illumination level.
28. The computer system of claim 27, wherein the low-lux sensor and the high-lux sensor have different luminance ranges, and wherein the luminance range of a sensor is the range of scene luminance over which the sensor is equipped to capture image data.
29. The computer system of claim 28, wherein the luminance range of the low-lux sensor and the luminance range of the high-lux sensor overlap, and the threshold illumination level lies within the overlap of the luminance ranges.
30. The computer system of claim 27, wherein the vehicle comprises four sensors for capturing image data representing gestures from the driver or occupant, the four sensors located in the overhead console of the vehicle, and wherein the four sensors comprise two RGB sensors and two infrared sensors.
CN201380067876.1A 2012-10-24 2013-09-12 Object recognition in low-lux and high-lux conditions Active CN105122784B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/659,826 US8781171B2 (en) 2012-10-24 2012-10-24 Object recognition in low-lux and high-lux conditions
US13/659,826 2012-10-24
PCT/US2013/059408 WO2014065951A1 (en) 2012-10-24 2013-09-12 Object recognition in low-lux and high-lux conditions

Publications (2)

Publication Number Publication Date
CN105122784A true CN105122784A (en) 2015-12-02
CN105122784B CN105122784B (en) 2018-09-25

Family

ID=50485366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380067876.1A Active CN105122784B (en) 2012-10-24 2013-09-12 Object recognition in low-lux and high-lux conditions

Country Status (7)

Country Link
US (4) US8781171B2 (en)
EP (1) EP2912836B1 (en)
JP (1) JP6042555B2 (en)
KR (1) KR101687125B1 (en)
CN (1) CN105122784B (en)
IL (1) IL238251A (en)
WO (1) WO2014065951A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8781171B2 (en) * 2012-10-24 2014-07-15 Honda Motor Co., Ltd. Object recognition in low-lux and high-lux conditions
US10007329B1 (en) 2014-02-11 2018-06-26 Leap Motion, Inc. Drift cancelation for portable object detection and tracking
US9754167B1 (en) 2014-04-17 2017-09-05 Leap Motion, Inc. Safety for wearable virtual reality devices via object detection and tracking
US9868449B1 (en) 2014-05-30 2018-01-16 Leap Motion, Inc. Recognizing in-air gestures of a control object to control a vehicular control system
US9646201B1 (en) 2014-06-05 2017-05-09 Leap Motion, Inc. Three dimensional (3D) modeling of a complex control object
US10007350B1 (en) 2014-06-26 2018-06-26 Leap Motion, Inc. Integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
DE202014103729U1 (en) 2014-08-08 2014-09-09 Leap Motion, Inc. Augmented reality with motion detection
JP3194297U 2014-08-15 2014-11-13 Leap Motion, Inc. Motion sensing control device for automobile and industrial use
CN106304570B * 2016-10-09 2018-11-02 惠州市海尼克电子科技有限公司 Method and sensor placed in a luminaire for detecting ambient light illuminance
CN111428545A * 2019-01-10 2020-07-17 Beijing Didi Infinity Technology and Development Co Ltd Behavior judgment method and device and electronic equipment
KR20220067733A * 2020-11-18 2022-05-25 Korea Electronics Technology Institute Vehicle lightweight deep learning processing device and method applying multiple feature extractors

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6291802A (en) * 1985-10-18 1987-04-27 Canon Inc Observation measuring instrument
US20050226472A1 (en) * 2004-04-13 2005-10-13 Denso Corporation Driver's appearance recognition system
JP2007336449A (en) * 2006-06-19 2007-12-27 Asahi Glass Co Ltd Image pickup device
JP2008042806A (en) * 2006-08-10 2008-02-21 Nikon Corp Camera
US7602947B1 (en) * 1996-05-15 2009-10-13 Lemelson Jerome H Facial-recognition vehicle security system
JP2010253987A (en) * 2009-04-21 2010-11-11 Yazaki Corp In-vehicle photographing unit
JP2010268343A (en) * 2009-05-18 2010-11-25 Olympus Imaging Corp Photographing device and photographing method
JP2011142500A (en) * 2010-01-07 2011-07-21 Toyota Motor Corp Imaging apparatus
US20110249120A1 (en) * 2002-11-14 2011-10-13 Donnelly Corporation Camera module for vehicle

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6553296B2 (en) * 1995-06-07 2003-04-22 Automotive Technologies International, Inc. Vehicular occupant detection arrangements
US6507779B2 (en) * 1995-06-07 2003-01-14 Automotive Technologies International, Inc. Vehicle rear seat monitor
US6442465B2 (en) * 1992-05-05 2002-08-27 Automotive Technologies International, Inc. Vehicular component control systems and methods
JPH06291802A 1993-04-02 1994-10-18 The Furukawa Electric Co Ltd Multiplex transmitter
US7527288B2 (en) * 1995-06-07 2009-05-05 Automotive Technologies International, Inc. Vehicle with crash sensor coupled to data bus
DE69736764T2 (en) * 1996-08-28 2007-01-25 Matsushita Electric Industrial Co., Ltd., Kadoma Local positioning device and method therefor
US8120652B2 (en) * 1997-04-02 2012-02-21 Gentex Corporation System for controlling vehicle equipment
US6545670B1 (en) 1999-05-11 2003-04-08 Timothy R. Pryor Methods and apparatus for man machine interfaces and related activity
US7050606B2 (en) * 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
AU2001259640A1 (en) * 2000-05-08 2001-11-20 Automotive Technologies International, Inc. Vehicular blind spot identification and monitoring system
US6535242B1 (en) 2000-10-24 2003-03-18 Gary Steven Strumolo System and method for acquiring and displaying vehicular information
US20050271280A1 (en) * 2003-07-23 2005-12-08 Farmer Michael E System or method for classifying images
US7177486B2 (en) * 2002-04-08 2007-02-13 Rensselaer Polytechnic Institute Dual bootstrap iterative closest point method and algorithm for image registration
US7123747B2 (en) * 2002-05-28 2006-10-17 Trw Inc. Enhancement of vehicle interior digital images
US7088243B2 (en) 2003-05-26 2006-08-08 S1 Corporation Method of intruder detection and device thereof
WO2005032887A2 (en) * 2003-10-03 2005-04-14 Automotive Systems Laboratory, Inc. Occupant detection system
JP2005173257A (en) * 2003-12-11 2005-06-30 Canon Inc Ranging photometer and ranging remote control receiver
JP2008537190A 2005-01-07 2008-09-11 GestureTek, Inc. Generation of three-dimensional image of object by irradiating with infrared pattern
KR100729280B1 2005-01-08 2007-06-15 Iritech Inc Iris Identification System and Method using Mobile Device with Stereo Camera
JP4438753B2 (en) 2006-01-27 2010-03-24 株式会社日立製作所 In-vehicle state detection system, in-vehicle state detection device and method
TWI302879B (en) * 2006-05-12 2008-11-11 Univ Nat Chiao Tung Real-time nighttime vehicle detection and recognition system based on computer vision
JP5084322B2 (en) * 2007-03-29 2012-11-28 キヤノン株式会社 Imaging apparatus and control method thereof
US20100039500A1 (en) 2008-02-15 2010-02-18 Matthew Bell Self-Contained 3D Vision System Utilizing Stereo Camera and Patterned Illuminator
WO2010083259A2 (en) 2009-01-13 2010-07-22 Meimadtek Ltd. Method and system for operating a self-propelled vehicle according to scene images
EP2515526A3 (en) 2011-04-08 2014-12-24 FotoNation Limited Display device with image capture and analysis module
US9389690B2 (en) 2012-03-01 2016-07-12 Qualcomm Incorporated Gesture detection based on information from multiple types of sensors
US8781171B2 (en) * 2012-10-24 2014-07-15 Honda Motor Co., Ltd. Object recognition in low-lux and high-lux conditions

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111527743A * 2017-12-28 2020-08-11 Waymo LLC Multiple modes of operation with extended dynamic range
CN112001208A * 2019-05-27 2020-11-27 ArcSoft Corp Ltd Target detection method and device for vehicle blind area and electronic equipment
CN112541859A * 2019-09-23 2021-03-23 Wuhan University of Science and Technology Illumination-adaptive face image enhancement method
CN112541859B * 2019-09-23 2022-11-25 Wuhan University of Science and Technology Illumination-adaptive face image enhancement method

Also Published As

Publication number Publication date
KR101687125B1 (en) 2016-12-15
US9852332B2 (en) 2017-12-26
IL238251A0 (en) 2015-06-30
US20160176348A1 (en) 2016-06-23
US8781171B2 (en) 2014-07-15
US9469251B2 (en) 2016-10-18
EP2912836A1 (en) 2015-09-02
EP2912836A4 (en) 2016-08-17
US20160364605A1 (en) 2016-12-15
JP2015535412A (en) 2015-12-10
US9302621B2 (en) 2016-04-05
JP6042555B2 (en) 2016-12-14
CN105122784B (en) 2018-09-25
IL238251A (en) 2016-08-31
WO2014065951A1 (en) 2014-05-01
US20140285664A1 (en) 2014-09-25
US20140112528A1 (en) 2014-04-24
KR20150094607A (en) 2015-08-19
EP2912836B1 (en) 2018-05-09

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant