EP3504871A1 - Method and device for operating an interior camera - Google Patents
Method and device for operating an interior camera
- Publication number
- EP3504871A1 EP3504871A1 EP17757472.0A EP17757472A EP3504871A1 EP 3504871 A1 EP3504871 A1 EP 3504871A1 EP 17757472 A EP17757472 A EP 17757472A EP 3504871 A1 EP3504871 A1 EP 3504871A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- image
- head
- image signal
- parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims abstract description 33
- 238000004590 computer program Methods 0.000 claims description 6
- 238000012545 processing Methods 0.000 claims description 4
- 238000001228 spectrum Methods 0.000 claims description 3
- 230000035945 sensitivity Effects 0.000 claims description 2
- 238000001454 recorded image Methods 0.000 abstract 1
- 210000003128 head Anatomy 0.000 description 77
- 238000005286 illumination Methods 0.000 description 14
- 230000006870 function Effects 0.000 description 13
- 238000013459 approach Methods 0.000 description 7
- 238000004422 calculation algorithm Methods 0.000 description 7
- 210000001747 pupil Anatomy 0.000 description 7
- 238000010586 diagram Methods 0.000 description 6
- 230000007704 transition Effects 0.000 description 6
- 238000001514 detection method Methods 0.000 description 5
- 238000004891 communication Methods 0.000 description 4
- 230000001276 controlling effect Effects 0.000 description 4
- 230000000694 effects Effects 0.000 description 4
- 238000013507 mapping Methods 0.000 description 4
- 210000001525 retina Anatomy 0.000 description 4
- 230000003044 adaptive effect Effects 0.000 description 3
- 230000010354 integration Effects 0.000 description 3
- 230000033001 locomotion Effects 0.000 description 3
- 230000008569 process Effects 0.000 description 3
- 241000593989 Scardinius erythrophthalmus Species 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 2
- 230000008859 change Effects 0.000 description 2
- 230000001815 facial effect Effects 0.000 description 2
- 230000001105 regulatory effect Effects 0.000 description 2
- 206010041349 Somnolence Diseases 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 230000006978 adaptation Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 238000011217 control strategy Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 230000002349 favourable effect Effects 0.000 description 1
- 230000004886 head movement Effects 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 238000012886 linear function Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
- 238000013139 quantization Methods 0.000 description 1
- 238000011084 recovery Methods 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30268—Vehicle interior
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/21—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from near infrared [NIR] radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
Definitions
- the invention is based on a device or a method according to the preamble of the independent claims.
- the subject of the present invention is also a computer program.
- An interior camera captures images in the near-infrared range.
- The interior camera has an infrared illumination device. The closer an object is to the illumination device, the greater the illumination intensity on the object.
- It is expected that the head of a driver of a vehicle, designated here as the object, will with very high probability again be arranged in an expected position after a short time. This increases the detection probability.
- The reference value is matched to the expected position.
- The camera parameter is tracked according to the detected position of the head.
- A method of operating an interior camera for a vehicle, wherein, in a regulating step, at least one camera parameter of the interior camera is regulated using at least one quality parameter of a previously captured image of the interior camera when a head of a target person is detected in the image; the camera parameter is set to a predefined value if no head is detected.
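A minimal sketch of this regulation scheme (all names, gains and values are hypothetical, not from the patent): a camera parameter, here the exposure time, is regulated from an image quality parameter while a head is detected, and falls back to a predefined default value otherwise.

```python
DEFAULT_EXPOSURE_US = 1000.0  # predefined operating-point value (assumed)
TARGET_BRIGHTNESS = 128.0     # target mean brightness for an 8-bit image

def control_step(head_detected: bool, quality_param: float,
                 exposure_us: float, gain: float = 0.5) -> float:
    """Return the exposure time for the next frame."""
    if not head_detected:
        return DEFAULT_EXPOSURE_US       # jump back to the operating point
    error = TARGET_BRIGHTNESS - quality_param
    return exposure_us + gain * error    # simple proportional regulation
```

Calling this once per frame makes the claimed behavior concrete: a dark image (quality parameter below target) raises the exposure while the head is tracked, and losing the head resets the parameter to its default.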
- An interior camera can be understood as a camera directed into an interior of a vehicle.
- The interior camera can in particular be aimed at a driver of the vehicle.
- The interior camera can provide an image sequence of individual images of the interior.
- The interior camera can also provide a video signal.
- A camera parameter can be an adjustable variable of the interior camera.
- The driver can be the target person.
- A predefined value can be a default value for the camera parameter.
- As a quality parameter, a contrast, a brightness and/or a brightness distribution of the image can be used.
- As camera parameters, an exposure time and/or a sensitivity of the interior camera can be regulated.
- As a camera parameter, a light intensity of an illumination device of the interior camera can likewise be regulated. By adjusting these camera parameters, a high image quality can be achieved.
- The method may include a step of detecting the head in a subsequently captured image of the interior camera.
- The quality parameter can be related to a head region of the image.
- The head region is the region of interest of the image. By referring the quality parameters to the head region, the head region can be imaged particularly well.
- The method may include a step of adjusting, in which a color depth of a raw image signal of the interior camera is adjusted to obtain a working image signal.
- The head can be detected in an image of the working image signal. A reduced color depth requires less computation to process the image.
- A color depth spectrum can be extracted from the raw image signal to obtain the working image signal.
- Alternatively, raw color levels of the raw image signal can be converted using a processing rule to obtain the working image signal. The processing rule can be an algorithm for converting the color levels. By converting, a large information content of the image can be preserved. By extracting, the color depth can be reduced quickly and easily.
- This method can be implemented, for example, in software or hardware or in a mixed form of software and hardware, for example in a control unit.
- The approach presented here also provides a device which is designed to carry out the steps of a variant of a method presented here in corresponding units.
- The device may comprise at least one computing unit for processing signals or data, at least one memory unit for storing signals or data, at least one interface to a sensor or an actuator for reading in sensor signals from the sensor or for outputting data or control signals to the actuator, and/or at least one communication interface for reading in or outputting data.
- The computing unit may be, for example, a signal processor, a microcontroller or the like, and the memory unit may be a flash memory, an EEPROM or a magnetic memory unit.
- The communication interface can be designed to read in or output data wirelessly and/or by wire; a communication interface that can read in or output wire-bound data can, for example, read in this data electrically or optically from a corresponding data transmission line or output it into a corresponding data transmission line.
- a device can be understood as meaning an electrical device which processes sensor signals and outputs control and / or data signals in dependence thereon.
- the device may have an interface, which may be formed in hardware and / or software.
- the interfaces can be part of a so-called system ASIC, for example, which contains a wide variety of functions of the device.
- the interfaces are their own integrated circuits or at least partially consist of discrete components.
- the interfaces may be software modules that are present, for example, on a microcontroller in addition to other software modules.
- Also advantageous is a computer program product or computer program with program code which can be stored on a machine-readable carrier or storage medium, such as a semiconductor memory, a hard-disk memory or an optical memory, and which is used for carrying out, implementing and/or controlling the steps of the method according to one of the embodiments described above.
- FIG. 1 is a block diagram of a vehicle having an apparatus for operating an interior camera according to an embodiment
- FIG. 2 is an illustration of a procedure of operating an indoor camera according to an embodiment
- FIG. 3 is a flowchart of a method for operating an interior camera according to an embodiment
- FIG. 4 is a state diagram of a controller for a method of operating an indoor camera according to an embodiment
- FIG. 5 shows representations of a detection of an object in an image of an interior camera according to an exemplary embodiment
- FIG. 6 is a flowchart of an algorithm for operating an interior camera according to an embodiment
- FIG. 7 is a block diagram of a controlled system for operating an interior camera according to an embodiment
- FIG. 12 is a flowchart of a method for operating an interior camera according to an embodiment
- FIG. 1 shows a block diagram of a vehicle 100 with a device 102 for operating an interior camera 104 according to one exemplary embodiment.
- the interior camera 104 is aligned with an expected head area 106 of a driver 108 of the vehicle 100.
- In this case, the head 110 is arranged in the head region 106.
- the interior camera 104 has a lighting device 112.
- the illumination device 112 comprises at least one infrared light source 114, 116, which is aligned in the head region 106.
- The first infrared light source 114 is arranged close to the interior camera 104. Light from the first light source 114 is reflected at the retina of the driver's eyes back to the interior camera 104 and, similar to the red-eye effect, produces bright pupils in images of the camera 104.
- The second infrared light source 116 is arranged away from the interior camera 104. Light from the second light source 116 is not reflected via the retina to the interior camera 104 and produces dark pupils in the images.
- When the head 110 is within the head region 106, it can be detected in images of the interior camera 104. Depending on the position of the head 110 in the head region 106, the image has quality parameters 118. When the head 110 is detected, at least one camera parameter 122 of the interior camera 104 is readjusted in a controller 120 of the device 102 for operation, using the quality parameters 118.
- FIG. 1 shows a device 102 for controlling a motor-vehicle interior camera 104 with active IR illumination 112.
- Driver monitoring camera systems consist of at least one camera module 104, an active near-infrared illumination or IR module 112, and a computing unit 102.
- A driver monitoring camera system can typically be designed as a one-camera, two-camera or multi-camera system.
- The IR modules 112 can be distinguished into bright-pupil (BP) 114 and dark-pupil (DP) 116 illumination.
- Bright-pupil illumination (BP) 114 produces bright pupils similar to the "red-eye effect" when the illumination 114 is located very close to the camera 104, whereby the light that has hit the retina through the pupil is reflected back into the camera image.
- Dark-pupil illumination (DP) 116 occurs when the illumination 116 is located away from the camera 104, so that no emitted light falls directly onto the retina through the pupil, and the pupil therefore remains dark in the camera image.
- The active IR illumination 112 provides good illumination of the scene or of the driver's face. For this purpose, the integration times of the imager and of the IR illumination can be changed.
- The IR illumination 112 can be adjusted in terms of illumination and irradiance for changing distances. Thus, many use cases are covered in which the driver or user is at different distances from the head- or eye-tracking system.
- A camera control, for example by means of the imager's built-in auto exposure control (AEC) or a dedicated camera software module, leads to a brightness control over the full image or over a set image region.
- The image region cannot distinguish between the face and objects, such as the sun behind the face or an occlusion in front of the face, for example by a hand. As a result, the face can no longer be found in the controlled image by means of image-processing methods, because the contrast and brightness are not sufficient.
- The camera 104 and the IR illumination 112 provide an optimal image for the head tracking.
- The image has a quantization of 10 or 12 bits; in a first step, a 10/12-to-8-bit mapping is performed, for example by means of a function such as a logarithmic characteristic, or by simple bit selection, i.e. cutting 8 bits out of the 10/12 bits.
- The raw image is thus reduced to 8-bit resolution and fed to the head-tracking algorithm.
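The 10/12-to-8-bit reduction described above can be sketched as follows. The patent names both a logarithmic characteristic and simple bit cutting; the exact characteristic and function names here are assumptions for illustration:

```python
import math

def reduce_to_8bit(pixel: int, bits: int = 12, mode: str = "shift") -> int:
    """Reduce a 10- or 12-bit raw pixel to 8 bits, either by cutting bits
    (keeping the most significant 8) or by a logarithmic characteristic."""
    if mode == "shift":
        return pixel >> (bits - 8)  # drop the (bits - 8) least significant bits
    # logarithmic tone mapping onto [0, 255] (one possible characteristic)
    max_in = (1 << bits) - 1
    return round(255 * math.log1p(pixel) / math.log1p(max_in))
```

In a real pipeline this mapping would be applied per pixel (typically via a lookup table) before the 8-bit frame is handed to the head tracker.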
- If a face is detected, the image control is performed based on the image quality parameters. If, however, no face is detected, the system jumps back to the defined operating point.
- Instead of a hard transition to the aforementioned operating point, the transition can be made gradually over a configurable, defined period of time. In the use cases mentioned, such as the sun behind the head or occlusions, an unfavorable regulation on objects is thereby avoided. Even when the head, with a previously detected face, approaches the camera 104, illumination and integration times adapted to the situation ensure an image that is optimal for head tracking.
- FIG. 2 shows an illustration of an effect chain of a camera control of an interior camera 104 according to one exemplary embodiment.
- the chain of effects comprises a frame buffer 200, a preprocessing means 202, a head tracking means 204, an eye tracking means 206, and higher level functions 208.
- the indoor camera 104 provides raw image data 210 at 12 or 10 bits per pixel for the image buffer 200.
- In the preprocessor 202, the raw image data 210 is reduced to 8-bit image data 212 by a mapping or by a non-linear function, such as a logarithm, or simply by dropping bits, for example the twelfth and first bits, or the two least significant (first and second) bits of the 12-bit data.
- The 8-bit image data 212 is used in the head tracker 204 to generate head-tracking data 214.
- The head-tracking data 214 is used in the controller 120, in accordance with the approach presented here, to adapt the camera and IR exposure control to the head-tracking data 214. The image quality can thus be improved for head tracking.
- The head-tracking data 214 is evaluated to determine parameters 122 for controlling the camera/IR.
- A 2D head bounding box is used as the region of interest (ROI) for calculating the image quality parameters 118.
- The head-tracking data 214 and the 2D head bounding box are used only if an acceptance level or acceptance threshold is exceeded.
- a confidence may be used to set a region of interest for calculating image quality parameters 118.
- the size of the region of interest is limited in minimum and maximum size to avoid too small or too large a region of interest.
- The camera/IR parameters 122 can be frozen if the face is not frontal.
- A head-tracking status can be "init", "tracked" or "refind"; the camera/IR control is only adapted when the head tracking is in the "tracked" mode. In the other cases, "init" or "refind", the camera/IR control is not changed.
- Possible default settings are a bit shift of essentially seven, a predefined gain of one and a predefined analog gain of one.
- Further default settings are control parameters, for example for a PID controller, and timing thresholds, in particular for the transition between "face detected" and "face not detected".
- a log function for mapping the 12-bit or 10-bit image to an 8-bit image may be applied.
- Starting from the operating point 216, the image acquisition and exposure are adjusted until a good image quality is achieved that is suitable for tracking the head and eyes.
- FIG. 3 shows a flowchart of a method for operating an interior camera according to an embodiment.
- the method can be carried out on a device for operating, as shown for example in FIG. 1.
- the flowchart includes a memory block 300, a first functional block 302, a first decision block 304, a second functional block 306, a second decision block 308, and a third functional block 310.
- the third functional block 310 is a third
- a start takes place with a default setting or in an operating point from the memory block 300, as shown in Fig. 2.
- A setting control starts when the face is detected, that is, with an input signal of the head tracking.
- The tracking quality/confidence level, the tracking refresh rate and the estimated distance from the face bounding box in the 2D image plane can be used to set the region of interest of the image. This region of interest is used to calculate the image quality parameters and to check them against the configured thresholds.
- The region of interest can be a face area within the image, defined by the detected face bounding box.
- Facial features such as visibility, symmetry or occlusions are evaluated by the head tracking.
- The control is only performed with a (near-)frontal view of the face.
- Brightness symmetry and/or face rotation or alignment are checked. Control parameters are kept on head profiles. Control occurs only when the head tracking is in tracking mode. In this mode, only facial features or landmarks such as the eye corners (canthi) and nostrils are tracked, for example by applying a Kalman filter. In the initialization mode, the search for head/face candidates is performed over the full frame. In the refind mode, the head tracking tries to find or capture a head within a larger image region than in the tracking mode.
- the control is carried out in two stages.
- For the exposure, a new exposure time is determined as exposure_time ± exposure_time_step.
- The step size is used to obtain a smooth, dynamic brightness change and to avoid a jump in the brightness level between frames.
- Control speed is adjusted to head movement toward the camera.
- As an optional stage, a bit shift of six, seven or eight bits can be used. If the threshold for a good image parameter is not reached with the adapted image, a further bit-shift operation of one to two bits to the left or right can be performed.
- The main control parameter is the exposure time. Its level and range can be tested iteratively.
- The parameters are limited to a minimum and maximum range; the exposure time is limited, for example, to between 40 microseconds and three milliseconds. The values are based on heuristics to allow adjustment close to the given operating point. The bit shift takes place with a maximum of 2 bits to the left or right.
- The image quality parameters img_qpar, such as brightness and contrast, are compared as actual values with target image quality parameters qpar_thr; ideally the actual value equals a mean value, for example an image brightness of about 128 LSB for an 8-bit image.
- Based on an analysis of the current frame, an exposure time for the next frame can be set, for example via I2C, to obtain a result closer to a good image parameter. If the image is too dark, the exposure time is increased; if it is too bright, the exposure time is reduced.
- The range and the step size can be determined heuristically.
- A standard control, for example via a PID controller, can be used.
- The reference variable w is the image quality parameter.
- The feedback x is the control parameter.
- The control deviation e = w − x is the input for the controller.
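The controller variables w, x and e named above can be illustrated with a generic discrete PID controller. This is a textbook sketch, not the patent's specific tuning; the gains are placeholders:

```python
class PID:
    """Discrete PID controller: w is the reference variable (target image
    quality parameter), x is the feedback, e = w - x the control deviation."""

    def __init__(self, kp: float, ki: float = 0.0, kd: float = 0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_e = 0.0

    def update(self, w: float, x: float, dt: float = 1.0) -> float:
        e = w - x                          # control deviation
        self.integral += e * dt            # accumulate for the I term
        derivative = (e - self.prev_e) / dt
        self.prev_e = e
        return self.kp * e + self.ki * self.integral + self.kd * derivative
```

The controller output would then be mapped onto the actuating variable, for instance an exposure-time correction clipped to the permitted range.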
- The head-tracking quality and the estimated distance corresponding to the detected face bounding box in the 2D image plane can be used to set a region of interest for the control.
- The region of interest (ROI) is relevant for calculating the image quality parameter and checking it against the configured threshold.
- a frame n is recorded with a given exposure time exp_time and a bit shift bitshift.
- The camera capture parameters cam_capture_par are used.
- a query is made as to whether the head is detected in frame n-1.
- Quality parameters qpar for frame n are calculated; qpar are image quality parameters such as image brightness or contrast in the full image or in the face bounding box. This is followed by another query as to whether qpar is greater or less than a threshold qpar_thr. Here, qpar is greater than qpar_thr if qpar_k is greater than qpar_thr_k and qpar_k+1 is greater than qpar_thr_k+1 and lies within the operating-point-related parameters and thresholds.
- The exposure time exp_time and the bit shift for frame n+1 are set using a controller.
- The exposure is adjusted: if g_mean is less than g_mean_thr, exp_time(n+1) is set to clip(exp_time(n) + exp_time_step); if g_mean is greater than g_mean_thr, exp_time(n+1) is set to clip(exp_time(n) − exp_time_step).
- The exp_time_range is 0.5 to 3 ms.
- The exp_time_step is 0.5 ms.
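The exposure-update rule with clipping can be sketched as follows, using the range (0.5 to 3 ms) and step (0.5 ms) stated above; the function names are hypothetical:

```python
EXP_TIME_MIN_MS = 0.5   # exp_time_range lower bound from the text
EXP_TIME_MAX_MS = 3.0   # exp_time_range upper bound from the text
EXP_TIME_STEP_MS = 0.5  # exp_time_step from the text

def clip(t_ms: float) -> float:
    """Limit the exposure time to the permitted range."""
    return max(EXP_TIME_MIN_MS, min(EXP_TIME_MAX_MS, t_ms))

def next_exposure(exp_time_ms: float, g_mean: float, g_mean_thr: float) -> float:
    """exp_time(n+1): step up if the image is too dark (g_mean < g_mean_thr),
    step down if it is too bright."""
    if g_mean < g_mean_thr:
        return clip(exp_time_ms + EXP_TIME_STEP_MS)
    return clip(exp_time_ms - EXP_TIME_STEP_MS)
```

Applying the step with clipping gives the smooth, bounded brightness change described above, instead of a jump between frames.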
- The bit shift is set: if qpar(n) − qpar(n−1) is greater than qpar_delta_thr and last_adj_act is greater than last_adj_act_thr, bitshift(n+1) is set to bitshift(n) << bitshift_step.
- If qpar(n) − qpar(n−1) is less than qpar_delta_thr and last_adj_act is less than last_adj_act_thr, bitshift(n+1) is set to bitshift(n) >> bitshift_step.
- The bitshift_range is 6 to 8 and the bitshift_step is 1.
- In a first stage, the exposure is set based on the histogram: if hist_pix_cnt is greater than pix_cnt_thr and the dark-pixel count hist_pix_dark_cnt is greater than pix_dark_cnt_thr and g_mean is less than g_mean_thr, exp_time(n+1) is set to clip(exp_time(n) + exp_time_step).
- FIG. 4 shows a state diagram of a controller 400 for a method for operating an interior camera according to an exemplary embodiment.
- The controller is a finite automaton or state machine.
- the controller essentially corresponds to the controller in FIG. 3.
- The controller 400 has a first function block 402, a second function block 404 and a third function block.
- a global ROI in a global state is used as a default value as long as no head is detected.
- When a head is detected, the controller 400 changes to a head-tracking state with a head-tracking ROI.
- If the head is lost, the controller 400 transitions to a transient state, in which the head-tracking ROI is kept for a transition time. If the head is not recognized again within the transition time, the controller 400 returns to the global state with the global ROI.
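The three states described above (global ROI, head-tracking ROI, transient) can be sketched as a small state machine. The transition time in frames and all names are assumptions:

```python
class CameraROIController:
    """Sketch of the three-state ROI controller: GLOBAL (default ROI),
    TRACKING (head ROI), TRANSIENT (head lost, wait before falling back)."""

    def __init__(self, transition_time: int = 30):  # frames (assumed unit)
        self.state = "GLOBAL"
        self.transition_time = transition_time
        self.lost_frames = 0

    def step(self, head_detected: bool) -> str:
        if head_detected:
            self.state = "TRACKING"        # head found: use head-tracking ROI
            self.lost_frames = 0
        elif self.state == "TRACKING":
            self.state = "TRANSIENT"       # head just lost: keep ROI for now
            self.lost_frames = 1
        elif self.state == "TRANSIENT":
            self.lost_frames += 1
            if self.lost_frames > self.transition_time:
                self.state = "GLOBAL"      # fall back to the global ROI
        return self.state
```

The transient state realizes the soft transition described earlier: the head-tracking ROI is kept for a while before the controller reverts to the global operating point.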
- FIG. 5 shows illustrations of a detection of an object 500 in an image of an interior camera according to an exemplary embodiment.
- A preset setting of the region of interest 502 is used at the start.
- The interior camera is operated with a fixed exposure time or with an adaptive exposure time.
- the ROI 502, where the probability for the head 500 is highest, is used.
- The region of interest 502 can be referred to as head region 502.
- The ROI size in head tracking depends on the two-dimensional size of the detected head 500, the quality of the head tracking and the frame rate of the head tracking.
- The center of the region of interest 502 is the center of the detected two-dimensional head 500, with a limitation to keep the region of interest 502 within the image.
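Centering the ROI on the detected head and keeping it inside the image, with the size limited to a minimum and maximum as described, can be sketched as follows (the size limits are assumed example values):

```python
def clamp_roi(cx: int, cy: int, w: int, h: int,
              img_w: int, img_h: int,
              min_size: int = 32, max_size: int = 512):
    """Center a w-by-h ROI on the head center (cx, cy), limit its size,
    and shift it so that it stays entirely within the image."""
    w = max(min_size, min(max_size, w))       # limit ROI size
    h = max(min_size, min(max_size, h))
    x = max(0, min(img_w - w, cx - w // 2))   # shift ROI back into the image
    y = max(0, min(img_h - h, cy - h // 2))
    return x, y, w, h
```

For example, a head near the image border yields an ROI that is pushed inward rather than one that extends outside the frame.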
- The setpoint is converted linearly over time. If the exposure time is fixed, a low-pass filter with a longer rise time is used. If the exposure time is adaptive, the corners of the region of interest 502 from the head tracking are converted linearly to the ROI corners.
- FIG. 6 shows a flowchart of an algorithm for operating an interior camera according to an embodiment.
- the controller implements a model-based algorithm.
- the optimum actuation value or the exposure time u is calculated using an inverse model 600.
- An input signal u is the exposure time.
- An output signal y is an average of the image pixel values. Both values are processed in the inverse model 600, giving an estimated optimal input signal u, which is filtered via a low-pass filter (LPF) 602 with a rise time of 0.14 seconds before being used again as the input signal u.
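The low-pass filter with a 0.14-second rise time can be sketched as a first-order exponential smoother. The conversion from the 10-90% rise time to the time constant (t_r = τ·ln 9) and the assumed frame interval are illustration choices, not from the patent:

```python
import math

class LowPassFilter:
    """First-order low-pass with a given 10-90% rise time (0.14 s in the
    text), discretized as exponential smoothing at the frame rate."""

    def __init__(self, rise_time_s: float = 0.14, dt_s: float = 1 / 30):
        tau = rise_time_s / math.log(9)   # t_rise(10-90%) = tau * ln(9)
        self.alpha = dt_s / (tau + dt_s)  # smoothing factor per frame
        self.y = None

    def filter(self, u: float) -> float:
        # Initialize on the first sample, then smooth toward u.
        self.y = u if self.y is None else self.y + self.alpha * (u - self.y)
        return self.y
```

Filtering the estimated optimal input signal u in this way prevents abrupt exposure changes from one frame to the next.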
- FIG. 7 shows a block diagram of a controlled system for operating an interior camera according to an embodiment.
- the controlled system implements the algorithm shown in FIG. 6.
- The exposure time is computed in a calculation unit 700 as the estimated optimal input signal u of the system.
- The mean target value is a predefined mean value which leads to good image quality.
- The image average is the calculated average of the image pixel values of a downsampled image.
- The input signal u is filtered in the low-pass filter 602 with the rise time of 0.14 seconds.
- In the mapping device 702, the input signal of the system is mapped to camera control variables.
- Camera control variables are exposure time, gain and bit shift.
- FIG. 8 shows an illustration of an application case for an interior camera 104 according to an exemplary embodiment.
- The head 110 is arranged close to the camera 104 or the IR illumination 112, at a near border of the head region 106 or a beginning of the head motion box (HMB) 106.
- FIG. 9 shows an illustration of an application case for an interior camera 104 according to an embodiment.
- The application case corresponds essentially to the application in FIG. 1.
- In contrast, the head 110 is arranged at the far border of the head region 106, away from the camera 104.
- FIG. 10 shows an illustration of an application case for an interior camera 104 according to an embodiment.
- the application case essentially corresponds to the application in FIG. 1.
- an object 1000 is arranged between the camera and the head 110.
- the object 1000 partially covers the head 110.
- The obscuring object 1000, for example a hand or the steering wheel, shadows parts of the face.
- FIG. 11 shows an illustration of an application case for an interior camera 104 according to an embodiment.
- the application corresponds essentially to the application in Fig. 1.
- The head 110 is illuminated by extraneous light sources 1100.
- The ambient light comes, for example, from outside the vehicle and falls into the head motion box (HMB) 106.
- FIG. 12 shows a flowchart of a method for operating an interior camera according to an embodiment.
- The method has a step 1200 of regulating.
- In the step of regulating, at least one camera parameter of the interior camera is regulated using at least one quality parameter of a previously captured image of the interior camera when a head of a target person is detected in the image.
- the camera parameter is set to a predefined value if no head is detected.
- If an exemplary embodiment comprises an "and/or" link between a first feature and a second feature, this is to be read as meaning that the embodiment has, according to one specification, both the first feature and the second feature and, according to another specification, either only the first feature or only the second feature.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016215766.3A DE102016215766A1 (en) | 2016-08-23 | 2016-08-23 | Method and device for operating an interior camera |
PCT/EP2017/069684 WO2018036784A1 (en) | 2016-08-23 | 2017-08-03 | Method and device for operating an interior camera |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3504871A1 true EP3504871A1 (en) | 2019-07-03 |
Family
ID=59699653
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17757472.0A Withdrawn EP3504871A1 (en) | 2016-08-23 | 2017-08-03 | Method and device for operating an interior camera |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3504871A1 (en) |
CN (1) | CN109565549B (en) |
DE (1) | DE102016215766A1 (en) |
WO (1) | WO2018036784A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7210965B2 (en) | 2018-09-26 | 2023-01-24 | 株式会社アイシン | indoor monitoring device |
JP6894880B2 (en) * | 2018-11-07 | 2021-06-30 | 矢崎総業株式会社 | Monitoring system |
DE102019202302B3 (en) | 2019-02-20 | 2020-01-02 | Zf Friedrichshafen Ag | Method, control device and computer program product for determining a head orientation and / or position of a vehicle occupant |
DE102019114754A1 (en) * | 2019-06-03 | 2020-12-03 | Bayerische Motoren Werke Aktiengesellschaft | Method for operating an interior camera while a vehicle is in motion, computer-readable medium, system and vehicle |
FR3101568B1 (en) * | 2019-10-03 | 2022-08-05 | Aleph Sas | METHOD FOR MANUFACTURING A FILM COMPRISING CAVITIES WITH DETERMINATION OF DRAWING PROFILES, DENSITY, THICKNESS AND/OR POROSITY OF THE FILM |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7616233B2 (en) * | 2003-06-26 | 2009-11-10 | Fotonation Vision Limited | Perfecting of digital image capture parameters within acquisition devices using face detection |
US8896725B2 (en) * | 2007-06-21 | 2014-11-25 | Fotonation Limited | Image capture device with contemporaneous reference image capture mechanism |
JP2008094221A (en) * | 2006-10-11 | 2008-04-24 | Denso Corp | Eye state detector, and eye state detector mounting method |
US8520979B2 (en) * | 2008-08-19 | 2013-08-27 | Digimarc Corporation | Methods and systems for content processing |
US8233789B2 (en) * | 2010-04-07 | 2012-07-31 | Apple Inc. | Dynamic exposure metering based on face detection |
CN101866215B (en) * | 2010-04-20 | 2013-10-16 | 复旦大学 | Human-computer interaction device and method adopting eye tracking in video monitoring |
DE102011006564A1 (en) * | 2011-03-31 | 2012-10-04 | Robert Bosch Gmbh | Method for evaluating an image captured by a camera of a vehicle and image processing device |
CN203327138U (en) * | 2013-07-26 | 2013-12-04 | 朱耀辉 | Ball-type camera |
FR3013875B1 (en) * | 2013-11-25 | 2017-03-31 | Renault Sas | SYSTEM AND METHOD FOR FORMING NIGHT IMAGES FOR A MOTOR VEHICLE. |
CN104036238B (en) * | 2014-05-28 | 2017-07-07 | 南京大学 | The method of the human eye positioning based on active light |
CN105302135B (en) * | 2015-09-18 | 2017-10-20 | 天津鑫隆机场设备有限公司 | The navigation of navigational lighting aid light-intensity test car and alignment system based on binocular vision |
- 2016
- 2016-08-23 DE DE102016215766.3A patent/DE102016215766A1/en active Pending
- 2017
- 2017-08-03 WO PCT/EP2017/069684 patent/WO2018036784A1/en unknown
- 2017-08-03 CN CN201780051672.7A patent/CN109565549B/en active Active
- 2017-08-03 EP EP17757472.0A patent/EP3504871A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
CN109565549A (en) | 2019-04-02 |
DE102016215766A1 (en) | 2018-03-01 |
WO2018036784A1 (en) | 2018-03-01 |
CN109565549B (en) | 2021-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3504871A1 (en) | Method and device for operating an interior camera | |
EP3157426B1 (en) | Device, method, and computer program for detecting micro- sleep | |
DE112008002645B4 (en) | Face picture taking device, face picture taking procedure and program for this | |
DE69627138T2 (en) | METHOD FOR ESTIMATING THE SITUATION OF A PICTURE TARGET REGION FROM SEVERAL REGIONS OF TRACKED LANDMARKS | |
DE60209989T2 (en) | Robust visual models for visual motion analysis and tracking | |
DE102007034657B4 (en) | Image processing device | |
DE102010001520A1 (en) | Iris detection system and method supported by an aircraft sensor | |
EP1504960B1 (en) | Device and method to improve vision in a vehicle | |
DE102018201054A1 (en) | System and method for image representation by a driver assistance module of a vehicle | |
DE112011105439B4 (en) | Red-eye detection device | |
DE112018005191T5 (en) | System and method for improving the signal-to-noise ratio when tracking objects under poor lighting conditions | |
EP1703462B1 (en) | System or method for enhancing an image | |
DE102014100352A1 (en) | Method for detecting condition of viewing direction of rider of vehicle, involves estimating driver's line of sight on basis of detected location for each of eye characteristic of eyeball of rider and estimated position of head | |
DE102014002134A1 (en) | Device for detecting a lighting environment of a vehicle and control method thereof | |
DE102013114996A1 (en) | Method for applying super-resolution to images detected by camera device of vehicle e.g. motor car, involves applying spatial super-resolution to area-of-interest within frame to increase the image sharpness within area-of-interest | |
DE112018006886T5 (en) | Occupant condition detection apparatus, occupant condition detection system, and occupant condition detection method | |
EP4078941A2 (en) | Converting input image data from a plurality of vehicle cameras of a surround-view system into optimised output image data | |
CN111833367A (en) | Image processing method and device, vehicle and storage medium | |
DE102014100364A1 (en) | Method for determining eye-off-the-road condition for determining whether view of driver deviates from road, involves determining whether eye-offside-the-road condition exists by using road classifier based on location of driver face | |
DE102012214637A1 (en) | Method for adjusting light emitting characteristic of headlamp to illuminate environment of vehicle, involves determining luminance characteristic of headlamp based on light distribution and default value for light distribution | |
DE102014209863A1 (en) | Method and device for operating a stereo camera for a vehicle and stereo camera for a vehicle | |
WO2004068850A1 (en) | Method for adjusting an image sensor | |
DE102018201909A1 (en) | Method and device for object recognition | |
DE112021007535T5 (en) | Setting the shutter value of a surveillance camera via AI-based object detection | |
DE102004047476B4 (en) | Device and method for adjusting a camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20190325 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| 17Q | First examination report despatched | Effective date: 20200224 |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: ROBERT BOSCH GMBH |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20200908 |