WO2023243162A1 - Camera control device and vehicle control device - Google Patents
- Publication number
- WO2023243162A1 (PCT/JP2023/008918)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control device
- camera
- vehicle
- imaging
- camera control
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
Definitions
- the present invention improves nighttime recognition accuracy by utilizing control of the headlights by the vehicle's own camera at night, in a vehicle that has a camera, headlights, and advanced driver assistance system (ADAS) functions.
- ADAS advanced driver assistance system
- the present invention relates to a camera control device and a vehicle control device that can appropriately provide driving support using this recognition.
- a method has been proposed for a vehicle system that is equipped with a camera and uses information from the camera to control the illumination of headlights to assist the driver in driving at night.
- Light distribution control is a technology that has been proposed for a long time. For example, as in Patent Document 1, the distance between the vehicle and the preceding vehicle is measured, and the high beam and low beam are switched depending on the distance between the vehicles at night.
- Various proposals have been made as conditions for switching headlight control, including not only the presence of a preceding vehicle but also the presence or absence of an oncoming vehicle.
- Patent Document 2 proposes a technique that removes reflective objects and facilitates the detection of oncoming vehicle lights.
- the headlights and camera are controlled in synchronization at the same time, and the camera captures images only at the same time as the moment the headlights are turned off, making it possible to remove reflective objects.
- the image captured by the camera is only of the light-emitting object, making it easier to detect the lights of an oncoming vehicle.
- Patent Document 3 discloses that the headlights and the camera are controlled in synchronization and that images are always taken alternately at the headlight off and on timings, which realizes the detection of both reflective objects and luminescent objects. Furthermore, by dividing the irradiation area of the headlights and performing light control independently for each area, it is possible, for example, to lengthen the lights-out period only in a specific area to prevent dazzling. On the other hand, since the imaging timing corresponds to the timing of turning the entire area off and on, the longer the off time of a specific area becomes, the longer the interval between images becomes.
- it is desirable that the interval between the imaging timings (imaging time interval) when the headlights are on and when they are off be as short as possible.
- (1) variable timing control means for making the camera imaging timing variable, and (2) illumination control means that controls the headlights' on/off using a duty ratio;
- (3) phase control means for shifting the phase of the imaging timing so as to align the asynchronous imaging timing with the headlight illumination control timing and shorten the imaging interval across lights-on and lights-off; and (4) means for providing driving support, such as automatic emergency braking, using the detection information of reflective objects.
- This reflective object 103 may be reflected by a reflector of a person or a bicycle, but it may also be a pseudo object that is visible on the road due to the reflection of headlights or the like.
- FIG. 2B shows an image when the headlights were turned off at the same time t1, in which the reflective object is not displayed on the screen, and only the light-emitting object, the headlight 100 of the oncoming vehicle, is displayed on the screen.
- FIG. 2C shows an image when the headlights are turned off at time t2, which is after time t1.
- in FIG. 2 an object can be determined to be a reflective object because it is not detected in the image taken with the lights off, while FIG. 3 shows how moving objects, stationary objects, and pseudo objects are distinguished among the reflective objects.
- at time t1 the headlights are on, and a reflective object 111 and a reflective pseudo object 113 inside the roadway and a reflective object 112 outside the roadway are detected (FIG. 3(a)); the headlights remain on at time t2 after time t1, and a reflective object 114 and a reflective pseudo object 116 inside the roadway and a reflective object 115 outside the roadway are detected (FIG. 3(b)).
- the vector in the moving direction can be calculated from the position of the reflecting object.
- the movement vector of the reflective object inside the roadway is 117, and the movement vector of the reflective object outside the roadway is 118. Since the reflective pseudo object 116 is always at the same location on the screen, the movement vector 119 is zero (FIG. 3(b)).
- pseudo objects can be identified due to the absence of movement vectors. Further, stationary objects and moving objects can be distinguished from the direction and magnitude of the movement vector.
- FIG. 4 shows the principle of imaging at variable timing when the headlights are turned on and off in the first embodiment of the present invention.
- FIG. 4(a) shows a timing chart before phase adjustment of headlight on/off and imaging timing
- FIG. 4(b) shows a timing chart after phase adjustment.
- the camera issues an instruction to the headlights regarding the duty ratio of turning them off and on.
- the headlights receive the duty ratio instruction, and the headlights repeatedly turn off and on according to the duty ratio.
- the duty ratio between off and on is 1:9, so the headlights repeat one unit of time off followed by nine units of time on.
- the camera has a shutter control that allows variable imaging timing, and performs imaging at short time intervals and imaging at long time intervals.
- the camera shutter control time interval repeats periodically, independently and asynchronously of the headlights.
- in FIG. 4A there are short time intervals between times t1 and t2 and between t2 and t3, and long time intervals between t3 and t4 and between t4 and t5; capturing an image while the headlights are off produces a dark image, while capturing an image while they are on produces a bright image.
- time t4, time t9, and time t14 are images when the headlights are turned off, and images at other times are images when the headlights are turned on.
- FIG. 4(b) shows a timing chart after phase adjustment of headlight on/off and imaging timing.
- Time t1, time t2, time t6, time t7, time t11, and time t12 are images when the headlights are turned off, and images at other times are images when the headlights are turned on.
- in FIG. 4(a) imaging is performed once while the lights are off, whereas in FIG. 4(b) imaging can be performed twice while the lights are off; furthermore, images can be captured in the vicinity of the change between headlight off and on.
- the on/off duty of the headlights, the corresponding variable imaging timing of the camera, and the number of images to be acquired when the lights are off and when the lights are on are determined in advance, for example, 2 images when the lights are off and 6 images when the lights are on.
- in Fig. 4(a) there is 1 image while the lights are off and 7 images while the lights are on; since these counts differ from the predetermined numbers, the phase of the imaging timing is adjusted.
- in Fig. 4(b) there are 2 images while the lights are off and 6 images while the lights are on; since these match the predetermined numbers of lights-off and lights-on images, this is the state after phase adjustment. Self-adjustment (self-test) of the phase adjustment can be performed in this manner.
- FIG. 1 shows an example of the configuration of a vehicle system according to the first embodiment of the present invention, which detects reflective objects with a camera and is used for object detection for driving support.
- the camera is not limited to a monocular camera or a stereo camera in particular, but in the first embodiment it is a stereo camera that has two lenses, left and right, and can measure the distance to an object by triangulation.
- the camera control unit 3 is connected to the camera imaging unit 2 via an LVDS cable and comprises an image processing unit 20 that processes the transferred RAW image, a recognition unit 21 that recognizes objects and lanes in the image, a light control section 23 that instructs the timing of turning the headlights on and off and instructs light distribution control to switch between high beam and low beam depending on the presence or absence of a vehicle, and a CAN-IF section 22 that is connected to a CAN bus and sends and receives data.
- the image processing unit 20 has a function of instructing the camera imaging unit 2 to take a shutter, which is the timing of image capture, and a function of generating an image necessary for recognition from the transferred RAW image.
- the former is performed by an imaging timing calculation unit 30 that calculates the imaging timing of the camera imaging unit 2, and by an imaging phase adjustment section 31 that shifts the phase of the imaging timing to match the turning on and off of the lights and issues the shutter instruction, i.e. the imaging timing, to the camera imaging unit 2.
- the recognition unit 21 includes a lights-off duty calculation unit 40 that determines the duty of the headlight off and on times, a pseudo object removal unit 41 that removes pseudo objects, i.e. objects that do not actually exist such as the reflection of the own vehicle's lights, a moving object determination unit 42 that determines whether an object is a stationary object or a moving object, a lane/road boundary detection unit 43 that detects lanes from images and detects road boundaries using reflective objects such as lane markings, curbs, road shoulders, or guardrails, and a light distribution light detection section 44 for light distribution control.
- the light distribution light detection section 44 detects the headlights and taillights of other vehicles.
- the light control unit 23 includes a duty instruction unit 70 for determining the timing of turning on and off the headlights using a duty ratio, and a light distribution instruction unit 71 for instructing light distribution control to switch between high beam and low beam depending on the presence or absence of a vehicle.
- the vehicle control unit 4 comprises a lane departure control unit 24 that determines whether the vehicle is deviating from the lane and issues a warning or a steering control instruction to return the vehicle to the lane when it deviates, an automatic emergency brake control section 25 that determines whether the vehicle will collide with an obstacle and automatically issues an emergency braking instruction when the possibility of a collision is high, and a CAN-IF section 26 that is connected to the CAN bus and sends and receives data.
- the automatic emergency brake control unit 25 includes an obstacle trajectory calculation unit 60 that calculates the trajectory of an obstacle, a risk calculation unit 61 that calculates the trajectory of the own vehicle and the risk of collision with the obstacle, and a control unit 62 that generates a warning or brake instruction based on that information.
- turning the lights on and off and taking images with the camera operate independently and asynchronously, but both are cyclic processes, and by adjusting the phase a specific number of images can be taken while the lights are on and while they are off.
- although automatic lighting is taken as an example here, the present invention is not limited to this and can also be applied to manual lighting.
- FIG. 5 shows a flowchart for phase adjustment of light on/off and imaging timing.
- the imaging timing calculation unit 30 causes the camera imaging unit 2 to perform imaging at a regular interval and at an interval shorter than that interval, based on a lighting/extinguishing pattern of a predetermined frequency and duty for the headlight unit 10, which will be described later.
- the camera imaging unit 2 sets the shutter timing in the shutter setting unit 27 based on the shutter timing (imaging timing pattern) transferred from the image processing unit 20, and performs imaging at the set timing (S15).
- the light control section 23 receives the lights-off duty and instructs the headlight section 10 to set the lights-off duty using the duty instruction section 70 (S16).
- the duty instruction section 70 sets a predetermined frequency (corresponding to the time interval between turning off and turning on) and a turning-on/off pattern of duty in the headlight section 10. Further, the light distribution instruction section 71 uses the light on/off pattern to control (the light distribution of) the headlight section 10 to turn on/off.
- the headlight section 10 sets a light-off duty (light-on/off pattern) in the light-off duty setting section 28, and periodically turns on and off at the light-off duty determined by the light distribution control section 29 (S17). For example, if the cycle is 1 sec and the light-off duty is 1:9, the light will be turned on and off repeatedly with a light-off time of 100 ms and a lighting time of 900 ms (see FIG. 4(a)).
- Figure 6 shows a flowchart of object recognition.
- the camera is assumed to be a stereo camera, but is not particularly limited.
- the image comparison unit 34 determines whether the object is a reflective object or a luminescent object, and adds reflective/luminous object attributes.
- the attributes are updated as needed until they are finalized; an object detected in a lit image may be either a luminescent object or a reflective object, so it is initially given an undetermined attribute, while an object detected in an unlit image is determined to be a luminescent object. Since an on/off attribute is attached to each image, when one of the image taken at a certain time (t1) and the image taken at the previous time (t2) has the on attribute and the other has the off attribute, the two images are compared.
- the brightness images when the lights are on and off are compared, and objects that are detected both when the lights are off and when the lights are on are determined (extracted) as luminescent objects, and objects that are detected only when the lights are on are determined (extracted) as reflective objects.
- the types of these luminous objects and reflective objects are added as reflective/luminous object attributes (S36).
- the image comparison unit 34 compares a plurality of brightness images taken at consecutive times, calculates a movement vector from the difference in relative coordinates of the object, and adds it (S37).
- Objects include both reflective objects and luminescent objects.
- for a reflective object, the movement vector is calculated by joining the brightness images taken while the lights are on and interpolating between them.
- Each object holds as an attribute a movement vector between a certain time (t1) at which the image was taken and a time (t2) immediately before the image was taken.
- t1 time at which the image was taken
- t2 time immediately before the image was taken.
- when the image at time (t2) is a lights-off image, the lights are on at the time (t3) immediately before time (t2), so the relative position at time (t3) is used in place of that at time (t2) to predict the movement up to time (t1).
- the pseudo object removal unit 41 removes pseudo objects from among the objects that have reflective/luminous object attributes.
- An object whose movement vector is zero can be determined to be a pseudo object and removed (S38).
- pseudo objects, i.e. objects that do not actually exist such as the reflection of the own vehicle's lights, can be eliminated, making it possible to exclude them from the obstacle targets of light distribution control.
- the reflective object information is also used to detect road boundaries from reflectors on roadside guardrails or delineators, which improves the detection rate.
- FIG. 7 shows a flowchart of the lane/road boundary detection unit 43.
- a brightness image is generated in the brightness image generation unit 32 of the image processing unit 20 (S41).
- the edge image generation unit 35 generates an edge image from the gradient of the luminance image (S42).
- the lane/road boundary detection unit 43 detects a lane from the edge image, and calculates a coordinate point according to a certain relative distance from the own vehicle (S43).
- the lane/road boundary detection unit 43 extracts, as a road boundary group, multiple objects that carry the reflective attribute among the reflective/luminous object attributes and the stationary attribute among the moving object attributes and that are detected at equal intervals in relative position from the own vehicle (S44).
- in other words, when reflective objects determined to be stationary objects are arranged at equal intervals (at a fixed distance) relative to the host vehicle, the lane/road boundary detection unit 43 determines them to be a road boundary (group).
- whether to use lane detection information or road boundary groups is selected as lane information.
- when a lane is detected on the screen (S45), the lane/road boundary detection unit 43 outputs the coordinate points of the detected lane as lane information (S46); when no lane is detected on the screen (S45), it outputs coordinate points a certain distance closer to the own vehicle than the coordinates of the road boundary group as lane information (S47). A minimal sketch of this grouping and substitution is given after this definitions list.
- Light distribution control is a technology that switches between high beam and low beam depending on the presence or absence of an object, such as high beam when there is no object and low beam when there is an object.
- a feature of the first embodiment of the present invention is that reflective objects including pseudo objects are excluded from light distribution control targets by using the reflection/emission object attributes of the recognition target (pseudo objects removal section 41).
- if the light distribution control of the headlights switches finely between high beam and low beam, the driver feels uncomfortable; therefore, in order to improve the accuracy of obstacle detection, objects are accumulated from multiple images and stored (S51).
- the presence or absence of the same object in multiple images is checked (S52). If there are identical objects, the reflective/luminous object attributes of the objects are used, and if they are reflective objects, they are treated as if they do not exist (S53). Further, if the object is not a reflective object (S53), it is determined whether the same object is a headlight or taillight of another vehicle detected by the light distribution light detection unit 44 (S54).
- when the three conditions (S52, S53, S54) are satisfied, it is determined that there is a valid object for which low beam should be set, and the light distribution instruction unit 71 generates a light distribution instruction to set low beam and sends it to the headlight unit 10 (S55).
- the light distribution control section 29 of the headlight section 10 controls the headlights to be set to low beam (S56).
- the light distribution instruction unit 71 generates a light distribution instruction to set the high beam, and sends the light distribution instruction to the headlight unit 10 (S57).
- the light distribution control section 29 of the headlight section 10 controls the headlights to be set to high beam (S58).
- the reflective object determination conditions (S53) and target object determination conditions (S54) for light distribution control can be changed and expanded.
- here the light-emitting object is the headlight or taillight of another vehicle, but if a pedestrian or bicycle attribute is to be added, a condition for the case where a reflective object such as a pedestrian or bicycle is detected can also be added.
- Figure 9 shows a flowchart of automatic emergency brake control.
- the camera control unit 3 sends reflective/luminescent object attributes, relative position attributes, and moving object attributes as object information for recognition (S61).
- the automatic emergency brake control section 25 uses the accumulated object information to predict the trajectory of the obstacle in the obstacle trajectory calculation section 60 (S62).
- the obstacle trajectory calculation unit 60 predicts the trajectory of a light-emitting object, identified from the reflective/light-emitting object attribute, using the relative position information in the object information; a reflective object, identified from the reflective/light-emitting object attribute, cannot be detected during the lights-out period and its relative position is then unknown, so its trajectory is predicted by estimating the missing positions with linear interpolation or polynomial interpolation from the relative positions at multiple earlier times (S63).
- the reflecting object corresponds to both stationary objects and moving objects, so the moving object attribute is not used.
- for an object whose moving object attribute indicates a stationary object, trajectory prediction may be skipped and the object treated as stationary, and its relative position with respect to the own vehicle's trajectory is calculated.
- for a moving object, the obstacle trajectory calculation unit 60 determines whether it is a pedestrian or a bicycle from its speed relative to the own vehicle, and predicts the future trajectory from the past trajectory.
- the risk calculation unit 61 calculates whether the trajectory of the light-emitting object and the reflective object is on the path of the own vehicle, and calculates the risk of collision (S64).
- the control unit 62 issues a warning or an emergency brake control instruction based on the collision risk calculated by the risk calculation unit 61 (when it is determined that the risk of collision with the host vehicle is high) (S65). Thereafter, the brake unit 13 of the actuator receives the emergency brake control instruction via the CAN 6, and performs brake control of the own vehicle based on the control instruction.
- FIG. 10 shows a flowchart of automatic lane departure control.
- the camera control unit 3 sends out reflective/luminous object attributes, relative position attributes, moving object attributes, and recognition lane information as recognition object information (S71).
- the own vehicle trajectory calculation unit 52 predicts the trajectory of the own vehicle from the steering information of the own vehicle (S72).
- the deviation determination unit 53 inputs the trajectory predicted from the steering information and the recognized lane information, and calculates the risk of deviating from the lane (S73).
- the control unit 51 issues a warning or a steering control instruction to avoid lane departure (to prevent entry into the road boundary) (S74).
- the steering unit 12 of the actuator receives the steering control instruction via the CAN 6, and performs steering control of the own vehicle based on the control instruction.
- the principle of the second embodiment is shown in FIG. It is assumed that there are two cycles, and the ratio of lights off and lights on in the first cycle is 1:9, and the ratio of lights off and lights on in the second cycle is 1:4. That is, the headlights are turned on and off periodically at different duties (1:9 in the first cycle, 1:4 in the second cycle).
- the imaging timing of the camera is also variable according to the two-cycle on/off duty. By performing phase adjustment, in the first cycle, images are taken twice when the lights are off, and six times when the lights are on, and in the second cycle, images are taken three times when the lights are off, and five times when the lights are on. Repeat this cycle. Two cycles is one example; by using a multi-duty system with three or four cycles and multiple cycles, the possibility of synchronization with other vehicles can be significantly reduced.
- the first embodiment and second embodiment of the present invention can also be expanded as follows.
- the headlight turn-off duty calculation of the recognition unit 21 can be set variably instead of fixedly in the light-off duty calculation unit 40.
- the light-off duty can be set, for example, based on a random number when the ignition is turned on. Further, in order to reduce the frequency of turning off the headlights, the headlights may be turned off only during a specific period or at a specific location based on the map information of the map section 11.
- the camera control device (camera control section 3) of this embodiment is a camera control device that controls the headlight section 10 that illuminates the front of the own vehicle and the camera imaging section 2 that images the front of the own vehicle.
- the camera control device sets a lighting/extinguishing pattern of a predetermined frequency and duty in the headlight section 10 (duty instruction section 70), and controls the headlight section 10 (its light distribution) to turn on and off using the lighting/extinguishing pattern (light distribution instruction section 71).
- for the camera imaging unit 2, based on the predetermined frequency and duty for the headlight unit 10, an imaging timing pattern different from the lighting/extinguishing pattern is set, consisting of a periodic first imaging time interval and a second imaging time interval shorter than the first (imaging timing calculation unit 30).
- by adjusting the time phase (imaging phase) of the imaging timing pattern of the camera imaging unit 2, the imaging time interval at the switch between turning the headlight section 10 off and on is reduced (shortened) (imaging phase adjustment section 31).
- the camera control device adjusts the time phase of the imaging timing pattern of the camera imaging unit 2 based on the number of lights-on images and the number of lights-off images among the images captured by the camera imaging unit 2 while the time phase is shifted (imaging phase adjustment section 31).
- the camera control device measures the brightness of a specific pixel of each image captured by the camera imaging unit 2 while shifting the time phase, determines from that luminance whether each image was taken with the lights on or off, and adjusts the time phase of the imaging timing pattern of the camera imaging unit 2 by checking, per period at the shutter timings of that time phase, the number of lights-on images and the number of lights-off images (imaging phase adjustment unit 31).
- the camera control device compares a first brightness image and a second brightness image captured by the camera imaging unit 2 with the headlight unit 10 turned off and turned on, extracts reflective objects, and calculates the movement vector of each reflective object by comparing a plurality of brightness images; when the movement vector of a reflective object is opposite to the movement vector of the own vehicle, the reflective object is determined to be a stationary object, when the movement vector of the reflective object is zero, the reflective object is determined to be a pseudo object, and in all other cases the reflective object is determined to be a moving object (luminance image generation section 32, image comparison section 34, pseudo object removal section 41, moving object determination section 42).
- the camera control device variably sets the duty of the headlight section 10 (light-off duty calculation section 40).
- the camera control device sets the duty of the headlight unit 10 based on a random number (light-off duty calculation unit 40).
- the camera control device periodically turns on and off the headlight section 10 at different duties a plurality of times (light-off duty calculation section 40, duty instruction section 70, light distribution instruction section 71).
- the camera control device causes the headlight section 10 to be turned off only during a specific period or at a specific location based on the map information (lights-off duty calculation section 40).
- the camera control device excludes pseudo objects among the reflective objects from the obstacle targets of the light distribution control (pseudo object removal section 41, light distribution instruction section 71).
- the vehicle control device (vehicle control unit 4) of this embodiment is a vehicle control device that controls the own vehicle based on information output from the camera control device; when the reflective object is a moving object, it determines from the speed relative to the own vehicle that the moving object is a pedestrian or a bicycle, predicts the future trajectory from the past trajectory, calculates the risk of a collision with the own vehicle, and performs a warning or automatic emergency brake control when it determines that the risk of a collision with the own vehicle is high (automatic emergency brake control unit 25).
- control lines and information lines are shown that are considered necessary for explanation, and not all control lines and information lines are necessarily shown in the product. In reality, almost all components may be considered to be interconnected.
- 41... Pseudo object removal unit; 42... Moving object determination unit; 43... Lane/road boundary detection unit; 44... Light distribution light detection unit; 51, 62... Control unit; 52... Own-vehicle trajectory calculation unit; 53... Deviation determination unit; 60... Obstacle trajectory calculation unit; 61... Risk calculation unit; 70... Duty instruction unit; 71... Light distribution instruction unit; 100... Headlight of the oncoming vehicle at time t1; 101... Reflective object on the curb at time t1; 102... Reflective object on the guardrail at time t1; 103... Pseudo object due to reflection of the own vehicle's headlights at time t1; 104... Headlight of the oncoming vehicle at time t2; 111... Reflective object of the bicycle inside the roadway at time t1; 112... Reflective object of a stationary object outside the roadway at time t1; 113... Pseudo object due to reflection of the headlights inside the roadway at time t1; 114... Reflective object of the bicycle inside the roadway at time t2; 115... Reflective object of a stationary object outside the roadway at time t2; 116... Pseudo object due to reflection of the headlights inside the roadway at time t2; 117... Movement vector of the reflective object of the bicycle inside the roadway from time t1 to t2; 118... Movement vector of the reflective object of the stationary object outside the roadway from time t1 to t2; 119... Movement vector of the pseudo object due to reflection of the headlights inside the roadway from time t1 to t2
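The road-boundary grouping and lane substitution described in the lane/road boundary items above (S44-S47) can be outlined with a small sketch. This is a minimal illustration only: the object representation, the spacing tolerance, and the inward offset are hypothetical values chosen for the example and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class TrackedObject:
    longitudinal_m: float   # relative distance ahead of the own vehicle
    lateral_m: float        # relative lateral offset (left positive)
    is_reflective: bool     # from the reflective/luminous object attribute
    is_stationary: bool     # from the moving object attribute

def extract_road_boundary_group(objects: List[TrackedObject],
                                spacing_tol_m: float = 1.0) -> List[TrackedObject]:
    """Collect stationary reflective objects that line up at roughly equal
    longitudinal intervals, in the spirit of step S44."""
    candidates = sorted((o for o in objects if o.is_reflective and o.is_stationary),
                        key=lambda o: o.longitudinal_m)
    if len(candidates) < 3:
        return []
    gaps = [b.longitudinal_m - a.longitudinal_m
            for a, b in zip(candidates, candidates[1:])]
    mean_gap = sum(gaps) / len(gaps)
    # Equal spacing (within a tolerance) marks the set as one boundary group.
    return candidates if all(abs(g - mean_gap) <= spacing_tol_m for g in gaps) else []

def substitute_lane_info(boundary: List[TrackedObject],
                         inward_offset_m: float = 0.5) -> Optional[List[Tuple[float, float]]]:
    """When no lane is detected (S45: no), output points shifted a fixed
    distance toward the own vehicle from the boundary group (S47)."""
    if not boundary:
        return None
    return [(o.longitudinal_m,
             o.lateral_m - inward_offset_m if o.lateral_m > 0 else o.lateral_m + inward_offset_m)
            for o in boundary]
```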
Abstract
Provided are a camera control device and a vehicle control device that detect a reflective object with a camera by turning the headlights off for a short time and comparing images taken with the lights off and on, that use the reflective object for object detection in driving assistance, and that, even though the camera and the headlights are physically separate, make it possible to control the imaging timing of the camera and the turning on/off of the headlights asynchronously and to capture images near the timing of the switch between on and off. The time phase of the imaging timing pattern (imaging phase) of a camera imaging unit 2 is adjusted, as a result of which the imaging time interval at the switch between turn-off and turn-on of a headlight unit 10 is reduced (shortened).
Description
The present invention relates to a camera control device and a vehicle control device that, in a vehicle having a camera, headlights, and advanced driver assistance system (ADAS) functions, improve nighttime recognition accuracy by utilizing control of the headlights by the vehicle's own camera at night and can appropriately provide driving support using that recognition.
When driving at night, the surroundings are dark, so visibility is low. For example, drivers may be slower to spot danger from pedestrians or unlit bicycles than during the day. When a reflective object is attached to a pedestrian or bicycle, the headlights of the own vehicle are reflected on the reflective object, which serves as an auxiliary means for drivers to prevent delays in detecting danger.
A method has been proposed for a vehicle system that is equipped with a camera and uses information from the camera to control the illumination of headlights to assist the driver in driving at night.
Light distribution control is a technology that has been proposed for a long time. For example, as in Patent Document 1, the distance between the vehicle and the preceding vehicle is measured, and the high beam and low beam are switched depending on the distance between the vehicles at night. Various proposals have been made as conditions for switching headlight control, including not only the presence of a preceding vehicle but also the presence or absence of an oncoming vehicle.
To determine whether an object is a reflective object or a luminescent object, the own vehicle's headlights are turned off for a short time: a reflective object disappears when the headlights are turned off, whereas a luminescent object does not. By turning off the headlights, reflective objects and light-emitting objects can therefore be distinguished.
Patent Document 2 proposes a technique that removes reflective objects and facilitates the detection of oncoming vehicle lights. The headlights and camera are controlled in synchronization at the same time, and the camera captures images only at the same time as the moment the headlights are turned off, making it possible to remove reflective objects. As a result, the image captured by the camera is only of the light-emitting object, making it easier to detect the lights of an oncoming vehicle.
Next, in order to detect reflective objects and light-emitting objects separately, Patent Document 3 discloses that the headlights and the camera are controlled in synchronization and that images are always taken alternately at the headlight off and on timings, which realizes the detection of both reflective objects and luminescent objects. Furthermore, by dividing the irradiation area of the headlights and performing light control independently for each area, it is possible, for example, to lengthen the lights-out period only in a specific area to prevent dazzling. On the other hand, since the imaging timing corresponds to the timing of turning the entire area off and on, the longer the off time of a specific area becomes, the longer the interval between images becomes.
As described above, it has been proposed to perform headlight control in combination with the removal of reflective objects or the detection of reflective objects.
When detecting a reflective object by controlling the illumination of the headlights, it is desirable that the interval between the imaging timings (imaging time interval) when the headlights are on and when they are off be as short as possible. When comparing an image taken with the lights on and an image taken with the lights off, if the time interval between image acquisitions is long, an object in the images may move a large amount and disappear from the screen, and it then becomes necessary to determine whether the object disappeared because it is a reflective object or because it moved out of the frame.
If the time interval between the imaging timings of turning on and off is shortened, it becomes difficult to synchronize and control the camera imaging timing and the headlight illumination control at the same time. If the camera and headlight are independent devices and are physically separated and the information between the devices is communicated via CAN, there will be processing delays in each device's microcontroller and communication delays between the devices via CAN. Therefore, a delay of several tens of ms or more is required, and synchronous control is considered difficult. For this reason, it is necessary to control the camera's imaging timing and the headlight illumination control asynchronously. In both Patent Documents 2 and 3, the imaging timing of the camera and the illumination control of the headlights are synchronized, so they are not suitable for imaging at short time intervals between turning on and turning off the lights.
Furthermore, in order to use the camera not only for light distribution control but also for object detection for driving support such as automatic emergency braking, it is necessary to repeatedly capture images within a specific cycle, for example within a 50 ms cycle. Capturing images only at the lights-off timing as in Patent Document 2, or extending the lights-out period and lengthening the imaging interval to prevent dazzling as in Patent Document 3, is not suitable for imaging for driving support.
Therefore, in order to detect reflective objects with a camera and use them for object detection for driving support, the problem is to provide means for shortening the imaging timing across lights-on and lights-off and for controlling the camera's imaging and the headlights' on/off asynchronously.
Therefore, an object of the present invention is to provide a camera control device and a vehicle control device having means for solving this problem. In order to shorten the imaging timing across lights-on and lights-off and to control the camera's imaging and the headlights' on/off asynchronously, it is an object of the present invention to provide a camera control device having three means: (1) variable timing control means for making the camera's imaging timing variable, (2) illumination control means using a duty ratio as means for controlling the headlights' on/off, and (3) phase control means for aligning the timing phases, since the imaging timing and the headlight illumination control timing are asynchronous, so as to shorten the imaging timing across lights-on and lights-off.
Furthermore, since the above three means make it possible to detect reflective objects such as pedestrians and bicycles, it is a further object of the present invention to provide a vehicle control device having (4) means for driving support, such as automatic emergency braking, that uses the detection information of the reflective objects.
In order to achieve the above object, a camera control device according to the present invention is a camera control device that controls a headlight section that illuminates the front of the own vehicle and a camera imaging section that images the front of the own vehicle, wherein the camera control device sets a lighting/extinguishing pattern of a predetermined frequency and duty in the headlight section, controls the headlight section to turn on and off using the lighting/extinguishing pattern, sets in the camera imaging section, based on the predetermined frequency and duty for the headlight section, an imaging timing pattern that differs from the lighting/extinguishing pattern and consists of a periodic first imaging time interval and a second imaging time interval shorter than the first imaging time interval, and reduces the imaging time interval at the switch between turning the headlight section off and on by adjusting the time phase of the imaging timing pattern of the camera imaging section.
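As a rough illustration of the arrangement just described, the sketch below generates a 1:9 lighting/extinguishing pattern and an asynchronous imaging pattern with a short and a long interval, then searches for a time phase at which the predetermined number of lights-off images per period (two in this example) is obtained. The concrete spacing of the shutter events, the placement of the off window, the search step, and the counting rule are assumptions made for this sketch, not the claimed implementation.

```python
def headlight_is_on(t_ms: float, period_ms: float = 1000.0) -> bool:
    """Lighting/extinguishing pattern with duty 1:9: within each 1000 ms period
    the lights are off for 100 ms.  The off window is placed at an arbitrary
    offset (300 ms) because the headlight runs asynchronously to the camera."""
    return not (300.0 <= t_ms % period_ms < 400.0)

def imaging_pattern(period_ms: float = 1000.0):
    """Hypothetical imaging timing pattern: eight shutter events per period,
    two separated by a short interval (50 ms) and the rest by a longer one."""
    short_gap, n_long = 50.0, 6
    long_gap = (period_ms - 2 * short_gap) / n_long   # keep the sum equal to one period
    offsets, t = [], 0.0
    for gap in [short_gap, short_gap] + [long_gap] * n_long:
        offsets.append(t)
        t += gap
    return offsets                                    # shutter offsets within one period

def choose_phase(period_ms: float = 1000.0, want_off_images: int = 2,
                 step_ms: float = 10.0) -> float:
    """Shift the time phase of the imaging pattern until the number of
    lights-off images per period matches the predetermined count."""
    offsets = imaging_pattern(period_ms)
    for i in range(int(period_ms / step_ms)):
        phase = i * step_ms
        off_count = sum(not headlight_is_on(phase + o, period_ms) for o in offsets)
        if off_count == want_off_images:
            return phase
    return 0.0   # fall back: keep the current phase

print("chosen phase [ms]:", choose_phase())
```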
Furthermore, when the reflective objects determined to be stationary objects are arranged at a fixed distance, the vehicle control device according to the present invention determines that they form a road boundary and controls a warning or the steering of the own vehicle so as to prevent intrusion into the road boundary. In addition, when the reflective object is a moving object, the vehicle control device according to the present invention determines from the speed relative to the own vehicle that the moving object is a pedestrian or a bicycle, predicts the future trajectory from the past trajectory, calculates the risk of a collision with the own vehicle, and performs a warning or automatic emergency brake control when it determines that the risk of a collision with the own vehicle is high.
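A minimal sketch of the moving-object handling described in the preceding paragraph follows. The speed thresholds used to call something a pedestrian or bicycle, the linear extrapolation of the past trajectory, and the corridor used as a stand-in for the own-vehicle path are all illustrative assumptions, not values from the patent.

```python
from typing import List, Tuple

Position = Tuple[float, float]   # (longitudinal_m, lateral_m) relative to the own vehicle

def classify_moving_object(relative_speed_mps: float, own_speed_mps: float) -> str:
    """Rough class from the object's absolute speed recovered from the
    relative speed (the thresholds are illustrative assumptions)."""
    object_speed = abs(own_speed_mps + relative_speed_mps)
    if object_speed < 3.0:
        return "pedestrian"
    if object_speed < 10.0:
        return "bicycle"
    return "vehicle"

def predict_trajectory(history: List[Position], steps: int,
                       dt_s: float = 0.05) -> List[Position]:
    """Predict future relative positions by linearly extrapolating the last
    two observed positions (past trajectory -> future trajectory)."""
    (x0, y0), (x1, y1) = history[-2], history[-1]
    vx, vy = (x1 - x0) / dt_s, (y1 - y0) / dt_s
    return [(x1 + vx * k * dt_s, y1 + vy * k * dt_s) for k in range(1, steps + 1)]

def high_collision_risk(predicted: List[Position],
                        corridor_halfwidth_m: float = 1.0,
                        horizon_m: float = 30.0) -> bool:
    """Flag a high risk if any predicted position falls inside a straight
    corridor ahead of the own vehicle (a simplification of the trajectory
    comparison performed by the risk calculation)."""
    return any(0.0 < x < horizon_m and abs(y) < corridor_halfwidth_m
               for x, y in predicted)
```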
As described above, according to the present invention, reflective objects and light-emitting objects can be detected by periodically turning off the headlights for a short time and capturing images at short intervals across the boundary between lights-off and lights-on. Among the reflective objects, pseudo objects that appear as objects on the road due to headlight reflections and the like are correctly removed, which avoids erroneous headlight light distribution control and erroneous automatic emergency braking caused by pseudo objects. Furthermore, reflective objects on curbs and guardrails can be used as road boundaries and serve as a substitute when lane detection is not possible, improving the performance of an automatic departure avoidance system. Various advanced driver assistance systems (ADAS) can thus be realized.
Problems, configurations, and effects other than those described above will be made clear by the description of the embodiments below.
Hereinafter, embodiments of the present invention will be described based on the drawings.
[First embodiment]
In the first embodiment of the present invention, in order to detect reflective objects with a camera and use them for object detection for driving support, the headlights are turned off for a short time, the imaging timing across lights-on and lights-off is shortened, and the camera's imaging and the headlights' on/off are controlled asynchronously. To address this problem, the first embodiment provides (1) variable timing control means for making the camera's imaging timing variable, (2) illumination control means using a duty ratio as means for controlling the headlights' on/off, (3) phase control means for shifting the phase of the imaging timing in order to align the imaging timing with the headlight illumination control timing and shorten the imaging timing across lights-on and lights-off, and (4) means for driving support, such as automatic emergency braking, that uses the detection information of reflective objects.
Prior to explaining the configuration of the vehicle system of the first embodiment, the principle of the first embodiment will be explained using FIGS. 2 to 4.
FIG. 2 shows an example of the on-screen appearance of reflective objects and a light-emitting object on a curve. The own vehicle is heading toward the curve. FIG. 2(a) shows the image while the headlights are illuminated at time t1. The light-emitting object is the headlight 100 of an oncoming vehicle. The reflective objects are a reflective object 101 on a curb and a reflective object 102 on a guardrail, and these are lined up at regular intervals. When the visibility of the road boundary is low, the approximate location of the boundary can be grasped by using these reflective objects lined up at regular intervals. There is also an object 103 that reflects in response to the own vehicle's headlights being on. This reflective object 103 may be the reflector of a person or a bicycle, but it may also be a pseudo object that appears as an object on the road due to the reflection of the headlights or the like. FIG. 2(b) shows the image if the headlights had been off at the same time t1; the reflective objects do not appear on the screen, and only the light-emitting object, the headlight 100 of the oncoming vehicle, is displayed. By comparing the lit image (FIG. 2(a)) and the unlit image (FIG. 2(b)), not only the curb and guardrail but also the other reflective object 103 can be separated. FIG. 2(c) shows the image when the headlights are off at time t2, which is after time t1. By this time the own vehicle has approached the curve by the interval t2-t1 and the oncoming vehicle has come toward the own vehicle out of the curve, so the headlight 104 of the oncoming vehicle appears lower on the screen than in FIG. 2(b). As further time passes, the oncoming vehicle moves out of the frame, and the headlight of the oncoming vehicle, a light-emitting object, disappears from the screen.
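The separation just described for FIG. 2(a) and FIG. 2(b) can be sketched as a simple set comparison between the detections of a lit frame and an unlit frame; the detection representation and the matching tolerance below are assumptions made for the example.

```python
from typing import Dict, List, Tuple

Detection = Tuple[float, float]   # (x, y) image or ground position of a bright blob

def same_object(a: Detection, b: Detection, tol: float = 0.5) -> bool:
    """Very coarse association between detections in the two frames."""
    return abs(a[0] - b[0]) < tol and abs(a[1] - b[1]) < tol

def classify_by_lights(lit: List[Detection],
                       unlit: List[Detection]) -> Dict[str, List[Detection]]:
    """Objects visible in both frames emit their own light; objects visible
    only while the headlights are on merely reflect them."""
    luminous = [d for d in lit if any(same_object(d, u) for u in unlit)]
    reflective = [d for d in lit if not any(same_object(d, u) for u in unlit)]
    return {"luminous": luminous, "reflective": reflective}

# Toy numbers loosely following FIG. 2: the oncoming headlight 100 stays
# visible in the unlit frame, while the curb/guardrail reflectors 101, 102
# and the candidate object 103 disappear.
print(classify_by_lights(lit=[(10.0, 2.0), (8.0, -1.5), (12.0, -1.8), (6.0, 0.2)],
                         unlit=[(10.0, 2.0)]))
```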
Since there is a time difference between turning the headlights on and off, FIGS. 2(a) and 2(b) never occur at the same time. If the lit image is taken at time t1 and the unlit image at a different, later time t2, the unlit image is as shown in FIG. 2(c). If the interval t2-t1 is long, the environmental conditions change greatly and the possibility that the light-emitting object disappears from the screen increases; therefore, when comparing a lit image with an unlit image, it can be said that this time interval is preferably short.
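To put a number on why a short interval matters, the displacement of an object between the compared frames grows linearly with the interval; the closing speed below is only an illustrative figure.

```python
def displacement_m(relative_speed_mps: float, interval_s: float) -> float:
    """Distance an object shifts between the lit and unlit images."""
    return relative_speed_mps * interval_s

# Illustrative closing speed of 30 m/s (roughly two vehicles at ~55 km/h each):
for interval_s in (0.05, 0.1, 0.5):
    print(f"{interval_s * 1000:.0f} ms -> {displacement_m(30.0, interval_s):.1f} m")
```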
In FIG. 2, an object can be judged to be a reflective object because it is not detected in the lights-off image; FIG. 3 shows how reflective objects are further classified into moving objects, stationary objects, and pseudo objects.
FIG. 3 shows an example of detecting a moving object, a stationary object, and a pseudo object among the reflective objects. The moving reflective object is assumed to be a bicycle with a reflector attached somewhere on it, such as on the rear wheel. The stationary reflective object is assumed to be a reflector attached to a utility pole or the like. A pseudo object is something that appears to be an object although no object actually exists, such as a reflection of the host vehicle or a reflection of the host vehicle's headlights on the road surface.
Assume that at time t1 the headlights are on, and that a reflective object 111 and a reflective pseudo object 113 inside the roadway and a reflective object 112 outside the roadway are detected (FIG. 3(a)). Assume that the headlights remain on at time t2, later than t1, and that a reflective object 114 and a reflective pseudo object 116 inside the roadway and a reflective object 115 outside the roadway are detected (FIG. 3(b)). By comparing the images at times t1 and t2, a movement-direction vector can be calculated from the positions of each reflective object. The movement vector of the reflective object inside the roadway is 117, and that of the reflective object outside the roadway is 118. Because the reflective pseudo object 116 is always at the same location on the screen, its movement vector 119 is zero (FIG. 3(b)).
Accordingly, a pseudo object can be identified by the absence of a movement vector, and stationary and moving objects can be distinguished from the direction and magnitude of the movement vector.
FIG. 4 shows the principle of imaging at variable timing while the headlights are turned on and off in the first embodiment of the present invention.
FIG. 4(a) shows a timing chart of the headlight on/off switching and the imaging timing before phase adjustment, and FIG. 4(b) shows the timing chart after phase adjustment.
First, the camera issues an off/on duty-ratio instruction to the headlights. On receiving the duty-ratio instruction, the headlights repeatedly turn off and on according to that duty ratio. In FIG. 4 the off-to-on duty ratio is 1:9, so the headlights repeat an off period of 1 time unit followed by an on period of 9 time units.
The camera has shutter control with variable imaging timing and performs imaging at both short and long time intervals. The camera's shutter schedule repeats periodically, independently of and asynchronously with the headlights. In FIG. 4(a), the intervals between times t1 and t2 and between t2 and t3 are short, while the intervals between t3 and t4 and between t4 and t5 are long. An image captured while the headlights are off is dark, and an image captured while they are on is bright. In FIG. 4(a), the images at times t4, t9, and t14 are lights-off images, and the images at the other times are lights-on images.
To shorten the interval between a lights-off image and a lights-on image for the image comparison used to identify reflective objects, and to shorten the lights-off imaging intervals so that multiple images can be acquired while the lights are off, the portions of the schedule with short imaging intervals must be aligned with the headlight off timing. Since both the headlight on/off switching and the imaging are periodic processes, shifting the phase of the imaging timing makes it possible to align the short imaging intervals with the headlight off timing.
FIG. 4(b) shows the timing chart after the phase of the imaging timing has been adjusted relative to the headlight on/off switching. The images at times t1, t2, t6, t7, t11, and t12 are lights-off images, and the images at the other times are lights-on images. Whereas in FIG. 4(a) only one image is captured per lights-off period, in FIG. 4(b) two images can be captured per lights-off period, and images can also be captured close to the transitions between off and on. In this way, even though the headlight on/off switching and the imaging timing are asynchronous but both periodic, adjusting the phase makes it possible to capture many images while the lights are off and near the off/on transitions, and to capture images at regular intervals while the lights are on.
That is, the headlight on/off duty, the corresponding variable imaging timing of the camera, and the numbers of images to be acquired while the lights are off and while they are on are determined in advance, for example two lights-off images and six lights-on images. In FIG. 4(a) there are one lights-off image and seven lights-on images; because these counts differ from the targets, the phase of the imaging timing is adjusted until, as in FIG. 4(b), there are two lights-off images and six lights-on images, matching the predetermined counts, at which point the phase-adjusted state has been reached. Self-adjustment (self-testing) of the phase can be performed in this manner.
FIG. 1 shows a configuration example of the vehicle system of the first embodiment of the present invention, which detects reflective objects with a camera and uses them for object detection for driving support. The camera is not limited to a monocular or stereo camera, but in the first embodiment it is a stereo camera that has two lenses, left and right, and can detect the distance to an object by triangulation.
The vehicle 1 comprises a camera imaging unit 2 that incorporates the camera lenses and image sensors and captures images ahead of the host vehicle, a camera control unit 3 that performs image processing on the captured images and detects and recognizes objects, a vehicle control unit 4 that uses the recognition information to provide driving support such as brake control and steering control, an illuminance meter 5 that indicates the illuminance outside the vehicle, a CAN 6 for in-vehicle communication, a headlight unit 10 that controls the headlights illuminating the area ahead of the host vehicle, a map unit 11 that holds the vehicle position and information on the surrounding environment, and a steering unit 12 and a brake unit 13, which are the vehicle's actuators.
The camera imaging unit 2 has the function of capturing images; it captures an image and outputs a RAW image over an LVDS cable. Although omitted from the figure, it includes the left and right lenses, the image sensors, a serializer that converts the digital image data into a serial signal, and a shutter setting unit 27. The shutter setting unit 27 stores the timing information of the imaging shutter, for example an imaging timing pattern with a period of 350 ms and a total of eight shutter operations, two at 25 ms intervals and six at 50 ms intervals (see FIG. 4).
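As an illustration of the kind of timing information the shutter setting unit 27 could hold, the following Python sketch defines a simple imaging-timing-pattern record and checks that its intervals fill the period; the class and helper names are assumptions made for this example and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class ImagingTimingPattern:
    period_ms: int        # one full imaging cycle, e.g. 350 ms
    intervals_ms: list    # shutter-to-shutter intervals within one cycle

    def validate(self):
        # The intervals of one cycle must fill the period exactly.
        assert sum(self.intervals_ms) == self.period_ms, "intervals do not cover the period"

    def shot_offsets(self):
        """Shutter times measured from the start of the cycle."""
        t, offsets = 0, []
        for dt in self.intervals_ms:
            offsets.append(t)
            t += dt
        return offsets

# Example from the text: period 350 ms, eight shutters, two at 25 ms and six at 50 ms.
pattern = ImagingTimingPattern(period_ms=350, intervals_ms=[25, 25] + [50] * 6)
pattern.validate()
print(pattern.shot_offsets())   # [0, 25, 50, 100, 150, 200, 250, 300]
```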
The camera control unit 3 is connected to the camera imaging unit 2 via the LVDS cable and comprises an image processing unit 20 that processes the transferred RAW images, a recognition unit 21 that recognizes objects and lanes in the images, a light control unit 23 that instructs the headlight on/off timing and the light distribution control that switches between high beam and low beam depending on the presence or absence of other vehicles, and a CAN-IF unit 22 that is connected to the CAN bus and sends and receives data.
The image processing unit 20 has a function of issuing shutter instructions, i.e., the imaging timing, to the camera imaging unit 2, and a function of generating the images required for recognition from the transferred RAW images. The former is performed by an imaging timing calculation unit 30 that calculates the imaging timing of the camera imaging unit 2, and an imaging phase adjustment unit 31 that shifts the phase of the imaging timing to match the on/off switching of the lights and issues the shutter instructions to the camera imaging unit 2. The latter, although omitted from the figure, comprises a deserializer that reconstructs the RAW image data from the serial signal, a parallax image generation unit 33 that generates a parallax image containing per-pixel distance information from the left and right RAW images, a luminance image generation unit 32 that generates a luminance image from one of the left and right RAW images, an image comparison unit 34 that compares two or more images captured at different times, and an edge image generation unit 35 that generates an edge image from the gradient of the luminance image.
The recognition unit 21 comprises a lights-off duty calculation unit 40 that determines the duty of the headlight off/on times, a pseudo-object removal unit 41 that removes pseudo objects, i.e., objects that do not actually exist, such as reflections of the host vehicle's own lights, a moving object determination unit 42 that determines whether an object is stationary or moving, a lane/road boundary detection unit 43 that detects lanes from the images and detects road boundaries using the lanes and the reflective objects of curbs, road shoulders, and guardrails, and a light distribution light detection unit 44 that detects the headlights and taillights of other vehicles for light distribution control.
The light control unit 23 has a duty instruction unit 70 that specifies the headlight on/off timing as a duty ratio, and a light distribution instruction unit 71 that instructs the light distribution control that switches between high beam and low beam depending on the presence or absence of other vehicles.
The vehicle control unit 4 comprises a lane departure control unit 24 that determines whether the vehicle will depart from its lane and, when it does, issues a warning or a steering control instruction to return it to the lane, an automatic emergency brake control unit 25 that determines whether the vehicle will collide with an obstacle and automatically issues an emergency braking instruction when the probability of collision is high, and a CAN-IF unit 26 that is connected to the CAN bus and sends and receives data.
The lane departure control unit 24 comprises an own-vehicle trajectory calculation unit 52 that predicts the host vehicle's trajectory, a departure determination unit 53 that determines from the predicted trajectory and the road boundary information whether the vehicle will cross the road boundary, and a control unit 51 that issues a warning or a steering control instruction when the vehicle is about to cross the road boundary.
The automatic emergency brake control unit 25 comprises an obstacle trajectory calculation unit 60 that calculates the trajectories of obstacles, a risk calculation unit 61 that calculates the host vehicle's trajectory and the risk of collision with an obstacle, and a control unit 62 that generates a warning or a braking instruction based on the collision risk.
In this embodiment, the on/off switching of the lights and the imaging by the camera operate independently and asynchronously, but both are periodic processes, and by adjusting the phase, a specified number of images is captured while the lights are on and while they are off. Automatic lighting is used as the example, but the technique is not limited to this and can also be applied to manual lighting.
FIG. 5 shows a flowchart of the phase adjustment between the light on/off switching and the imaging timing.
First, control of the light on/off switching is started when the illuminance meter 5 indicates that the brightness has dropped. The illuminance meter 5 measures the illuminance and transmits it to the camera control unit 3 (S11). When the illuminance is not below the threshold (S12), the on/off control is not started; when the illuminance is below the threshold (S12), the on/off control is started. The lights-off duty calculation unit 40 calculates the lights-off duty using data held in memory (S13). The lights-off duty specifies the period (the time interval of one off/on cycle) and the time ratio of off to on. Using the lights-off duty, the image processing unit 20 and the light control unit 23 each perform the following. In the image processing unit 20, the imaging timing calculation unit 30 uses the lights-off duty to calculate the shutter timing, i.e., how many images are to be captured while the lights are off and while they are on, and instructs the camera imaging unit 2 with this shutter timing (S14). The shutter timing is calculated so that many images are captured within a short time while the lights are off, images are captured at regular intervals (periodically) while the lights are on, and images are also captured near the off/on transitions. That is, based on the on/off pattern of predetermined frequency and duty intended for the headlight unit 10 described below, the imaging timing calculation unit 30 sets in the camera imaging unit 2 an imaging timing pattern, different from the on/off pattern, consisting of a regular interval and an interval shorter than that regular interval. The camera imaging unit 2 sets the shutter timing in the shutter setting unit 27 based on the shutter timing (imaging timing pattern) transferred from the image processing unit 20 and captures images at the set timing (S15). In parallel with the image processing unit 20, the light control unit 23 receives the lights-off duty, and the duty instruction unit 70 instructs the headlight unit 10 to set the lights-off duty (S16). That is, the duty instruction unit 70 sets in the headlight unit 10 an on/off pattern of predetermined frequency (corresponding to the off/on period) and duty, and the light distribution instruction unit 71 controls (the light distribution of) the headlight unit 10 so that it turns on and off according to this on/off pattern. The headlight unit 10 sets the lights-off duty (on/off pattern) in its lights-off duty setting unit 28 and turns on and off periodically at the lights-off duty determined by the light distribution control unit 29 (S17). For example, with a period of 1 sec and a lights-off duty of 1:9, the headlights repeat an off time of 100 ms and an on time of 900 ms (see FIG. 4(a)).
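The following Python sketch roughly mirrors steps S13 to S17 under the example values given above (1 s period, 1:9 off/on duty): the lights-off duty is expanded into off/on durations for the headlights, and a matching variable shutter schedule packs dense shots into the off window and evenly spaced shots into the on window. The function names, the number of shots, and the way the intervals are chosen are assumptions for illustration only.

```python
def duty_to_headlight_pattern(period_ms, off_ratio, on_ratio):
    """S13/S16/S17: turn a lights-off duty (e.g. 1:9) into off/on durations for one cycle."""
    unit = period_ms / (off_ratio + on_ratio)
    return off_ratio * unit, on_ratio * unit          # (off_ms, on_ms)

def duty_to_shutter_schedule(period_ms, off_ms, shots_off, shots_on):
    """S14: build shutter offsets with short intervals inside the off window and
    regular intervals over the rest of the cycle (the first on-shot lands at the transition)."""
    short = off_ms / shots_off                        # dense shots while the lights are off
    offsets = [round(i * short) for i in range(shots_off)]
    regular = (period_ms - off_ms) / shots_on         # evenly spaced shots while the lights are on
    offsets += [round(off_ms + i * regular) for i in range(shots_on)]
    return offsets

off_ms, on_ms = duty_to_headlight_pattern(1000, 1, 9)    # 100.0 ms off, 900.0 ms on
print(off_ms, on_ms)
print(duty_to_shutter_schedule(1000, off_ms, shots_off=2, shots_on=6))
# [0, 50, 100, 250, 400, 550, 700, 850]
```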
Next, the imaging phase adjustment unit 31 performs phase adjustment to align the imaging timing with the headlight on/off switching. The imaging phase adjustment unit 31 shifts the phase of the shutter timing (imaging timing pattern) and sends the phase information to the camera imaging unit 2 (S18). The camera imaging unit 2 sets the phase information in the shutter setting unit 27 and captures images with the timing shifted by that phase (S19). The imaging phase adjustment unit 31 measures the luminance of a specific pixel in each frame (for example, a pixel at a location that is not directly illuminated by the headlights but always changes when the headlights switch on or off), uses that luminance to judge whether the frame is a lights-on image or a lights-off image, and checks whether, at the shutter timing with the current phase, the numbers of periodic lights-on and lights-off images are as expected, that is, equal to the predetermined numbers of lights-off and lights-on images (S20). If the numbers of lights-on and lights-off images differ from the expected values (S21), the imaging phase adjustment unit 31 performs the phase adjustment again (S18). In this way, a self-adjustment (self-test) function for the phase adjustment is provided.
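A minimal sketch of the self-adjustment loop of S18 to S21: candidate phase offsets are tried until the counts of lights-off and lights-on images match the predefined targets. For brevity, the lights-on/lights-off state of each frame is taken directly from the light schedule rather than from the luminance of a reference pixel, and the helper names are assumptions for this example.

```python
def is_light_off(t_ms, period_ms=1000, off_ms=100):
    """Headlight state at time t_ms: off during the first off_ms of every cycle."""
    return (t_ms % period_ms) < off_ms

def count_images(shots_ms, phase_ms, period_ms=1000, off_ms=100):
    """Classify the shots of one imaging cycle as lights-off or lights-on and count them."""
    off = sum(is_light_off(s + phase_ms, period_ms, off_ms) for s in shots_ms)
    return off, len(shots_ms) - off

def find_phase(shots_ms, target_off, target_on, period_ms=1000, off_ms=100, step_ms=5):
    """S18-S21: shift the imaging phase until the off/on image counts match the targets."""
    for phase in range(0, period_ms, step_ms):
        if count_images(shots_ms, phase, period_ms, off_ms) == (target_off, target_on):
            return phase      # phase-adjusted state reached (S21: counts are as expected)
    return None               # no phase matches: duty and imaging pattern are inconsistent

# A misaligned version of the 8-shot schedule built above (shifted by 300 ms).
shots = [(s + 300) % 1000 for s in [0, 50, 100, 250, 400, 550, 700, 850]]
print(find_phase(shots, target_off=2, target_on=6))
# 650: the first candidate shift that restores 2 lights-off and 6 lights-on images
```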
After the automatic phase adjustment performed when the lights start to be turned on (see FIG. 4(b)), recognition processing is performed on the obstacles (objects) in the image.
FIG. 6 shows a flowchart of object recognition. A stereo camera is assumed, but the camera type is not particularly limited.
First, the image processing unit 20 of the camera control unit 3 receives the left and right RAW images from the camera imaging unit 2 (S31). To extract the luminance of objects, the luminance image generation unit 32 generates a luminance image from the right RAW image (S32); either the left or the right RAW image may be used. The imaging phase adjustment unit 31 measures the luminance of a specific pixel in each frame, for example a pixel at a location that is not directly illuminated by the headlights but always changes when the headlights switch on or off. Whether the headlights are on or off is determined against a predetermined luminance threshold, and an on/off attribute indicating the headlight state is attached to the luminance image (S33). The parallax image generation unit 33 also generates, by triangulation from the left and right RAW images, a parallax image that contains a distance for each pixel (S34). For each detected object, the parallax image generation unit 33 uses the parallax image to assign an ID as an identification number and to attach a relative-position attribute, including the distance from the host vehicle (S35).
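A minimal sketch of how the on/off attribute of S33 might be derived, assuming the frame is available as a 2-D luminance array and that the reference pixel location and the threshold are calibration values chosen in advance; the names below are illustrative only.

```python
import numpy as np

def headlight_state_attribute(luminance, ref_pixel=(400, 20), threshold=60):
    """S33: classify a frame as lights-on or lights-off from one reference pixel that is
    not directly lit by the headlights but always brightens when they are switched on."""
    y, x = ref_pixel
    return "on" if luminance[y, x] >= threshold else "off"

# Toy frames: bright reference pixel vs. dark reference pixel.
frame_on = np.full((600, 800), 80, dtype=np.uint8)
frame_off = np.full((600, 800), 20, dtype=np.uint8)
print(headlight_state_attribute(frame_on), headlight_state_attribute(frame_off))  # on off
```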
Next, the image comparison unit 34 determines whether each object is a reflective object or a light-emitting object and attaches a reflective/light-emitting object attribute. The attribute is updated as needed until it is finalized. An object detected in a lights-on image may be either a light-emitting object or a reflective object, so its attribute is left undetermined. An object detected in a lights-off image is finalized as a light-emitting object. Because each image carries the on/off attribute, when one of the image captured at a certain time (t1) and the image captured at the immediately preceding time (t2) has the on attribute and the other has the off attribute, the two images are compared. Comparing the lights-on and lights-off luminance images, an object detected in both is finalized (extracted) as a light-emitting object, and an object detected only in the lights-on image is finalized (extracted) as a reflective object. These classifications are attached as the reflective/light-emitting object attribute (S36).
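The comparison of S36 can be sketched as a set operation over the object IDs detected in an adjacent lights-on/lights-off frame pair; the function name and the use of ID sets are assumptions made for this example.

```python
def classify_reflective_vs_emitting(ids_lights_on, ids_lights_off):
    """S36: anything detected with the lights off emits light; anything detected only
    with the lights on is a reflective object."""
    emitting = set(ids_lights_off)
    reflective = set(ids_lights_on) - emitting
    return {**{i: "light-emitting" for i in emitting},
            **{i: "reflective" for i in reflective}}

# FIG. 2 example: the oncoming headlight (100) stays visible with the lights off;
# the curb/guardrail reflectors (101, 102) and object 103 appear only with the lights on.
print(classify_reflective_vs_emitting({100, 101, 102, 103}, {100}))
```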
Furthermore, the image comparison unit 34 compares multiple luminance images captured at consecutive times, calculates a movement vector from the difference in the relative coordinates of each object, and attaches it (S37). The objects include both reflective and light-emitting objects. When comparing the luminance images of two frames there are three cases: lights-on with lights-on, lights-off with lights-off, and lights-on with lights-off. In the on/on comparison, movement vectors can be calculated for both light-emitting and reflective objects, whereas in the off/off and on/off comparisons, movement vectors can be calculated only for light-emitting objects. Because a reflective object cannot be detected in the lights-off frames of a sequence of consecutive images, its movement vector is calculated by interpolating and connecting the lights-on luminance images. Each object holds as an attribute the movement vector between the time of the current image (t1) and the time of the previous image (t2). For a reflective object for which the lights are off at time (t1) and on at time (t2), the lights are also on at the time (t3) immediately preceding (t2), so the relative position at time (t1) is predicted from the relative positions at times (t2) and (t3). Since the location at time (t1) is unknown, uniform straight-line motion from (t3) to (t2) is assumed, a relative coordinate at time (t1) is estimated, and the movement vector is calculated from the relative coordinates at times (t1) and (t2). Although the example here uses time (t3), the coordinates of several earlier imaging times may also be used, which improves the prediction accuracy.
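A sketch of the extrapolation used in S37 for a reflective object that is invisible in a lights-off frame, assuming 2-D relative positions (lateral, longitudinal) and, for simplicity, equally spaced frames; the function names and the numeric values are assumptions for this example.

```python
def extrapolate_position(p_t2, p_t3):
    """Assume uniform straight-line motion from t3 to t2 and predict the position at the
    more recent time t1 (t3 < t2 < t1, equal spacing): p(t1) = p(t2) + (p(t2) - p(t3))."""
    return (2 * p_t2[0] - p_t3[0], 2 * p_t2[1] - p_t3[1])

def movement_vector(p_t1, p_t2):
    """S37: movement vector between the current frame (t1) and the previous frame (t2)."""
    return (p_t1[0] - p_t2[0], p_t1[1] - p_t2[1])

# Reflective object visible at t3 and t2 (lights on) but not at t1 (lights off).
p_t3, p_t2 = (1.8, 30.0), (1.8, 28.5)       # (lateral, longitudinal) in metres
p_t1 = extrapolate_position(p_t2, p_t3)     # (1.8, 27.0)
print(movement_vector(p_t1, p_t2))          # (0.0, -1.5): closing at 1.5 m per frame
```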
Among the reflective objects, those that do not actually exist but merely appear to be objects because of the host vehicle's headlights are handled by the pseudo-object removal unit 41: among the objects with the reflective/light-emitting attribute, those whose movement vector is zero are judged to be pseudo objects and are removed (S38). As a result, in the light distribution control of the headlight unit 10 described later (the light distribution control that switches from high beam to low beam when an obstacle is detected), pseudo objects that do not actually exist, such as the reflection of the host vehicle's own lights, can be excluded from the obstacle targets of the light distribution control. The moving object determination unit 42 then judges an object whose movement vector has the same magnitude as, and the opposite direction to, the host vehicle's movement to be a stationary object, and any other object to be a moving object, and attaches the movement vector to the object information as a moving-object attribute (S39).
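Steps S38 and S39 can be sketched as a simple classification of the movement vector against the host vehicle's own motion, with a small tolerance; the vector representation and the tolerance value are assumptions made for this example.

```python
def classify_reflective_object(obj_vec, ego_vec, tol=0.2):
    """S38/S39: a reflective object with a (near-)zero movement vector is a pseudo object;
    one whose vector matches the host vehicle's motion with opposite sign is stationary;
    anything else is a moving object. Vectors are (dx, dy) per frame in the ego frame."""
    def close(a, b):
        return abs(a[0] - b[0]) <= tol and abs(a[1] - b[1]) <= tol
    if close(obj_vec, (0.0, 0.0)):
        return "pseudo object"         # e.g. the host vehicle's own headlight glare (S38)
    if close(obj_vec, (-ego_vec[0], -ego_vec[1])):
        return "stationary object"     # apparent motion is just the ego motion reversed
    return "moving object"

ego = (0.0, 1.5)                       # host vehicle advances 1.5 m per frame
print(classify_reflective_object((0.0, 0.0), ego))    # pseudo object
print(classify_reflective_object((0.0, -1.5), ego))   # stationary object (e.g. pole reflector)
print(classify_reflective_object((0.3, -0.9), ego))   # moving object (e.g. bicycle)
```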
In this way, the camera control unit 3 detects objects and, for each detected object, determines whether it is a reflective object or a light-emitting object by comparing the luminance images captured with the headlights on and off and attaches the reflective/light-emitting object attribute, attaches a relative-position attribute including the distance to the host vehicle using the parallax image, and further attaches a moving-object attribute by comparing multiple luminance images.
Next, lane recognition is described. In addition to normal lane detection, when a lane cannot be detected because, for example, the lane markings are faded, information on reflective objects such as reflectors on roadside guardrails and delineators is used to improve the lane detection rate.
FIG. 7 shows a flowchart of the lane/road boundary detection unit 43.
First, the luminance image generation unit 32 of the image processing unit 20 generates a luminance image (S41). The edge image generation unit 35 generates an edge image from the gradient of the luminance image (S42). The lane/road boundary detection unit 43 detects lanes from the edge image and calculates them as coordinate points at fixed relative distances from the host vehicle (S43).
In addition to the lanes, reflective objects placed at regular intervals along the road shoulder are recognized as objects, so this object information is used. The lane/road boundary detection unit 43 extracts, as a road boundary group, the objects that are reflective according to the reflective/light-emitting object attribute, stationary according to the moving-object attribute, and detected at equal intervals of relative position from the host vehicle (S44). That is, when reflective objects judged to be stationary are arranged at equal intervals (at a constant spacing) relative to the host vehicle, the lane/road boundary detection unit 43 judges them to be a road boundary (group). Whether the lane detection result or the road boundary group is used as the lane information is selected depending on whether a lane is detected on the screen. When a lane is detected (S45), the lane/road boundary detection unit 43 outputs the coordinate points of the lane detection as the lane information (S46). When no lane is detected (S45), the lane/road boundary detection unit 43 outputs, as the lane information, coordinate points shifted a fixed distance toward the host vehicle from the coordinates of the road boundary group (S47).
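A rough sketch of S44 to S47, assuming each object carries its attributes and a (lateral, longitudinal) position in metres, and that "equal intervals" is judged with a simple spacing tolerance; the data layout, the tolerance, and the inward offset are assumptions, not values from the patent.

```python
def extract_road_boundary_group(objects, spacing_tol=0.5, min_count=3):
    """S44: stationary reflective objects that are roughly equally spaced along the road
    are accepted as a road boundary group. Positions are (lateral, longitudinal) in metres."""
    pts = sorted((o["pos"] for o in objects
                  if o["kind"] == "reflective" and o["motion"] == "stationary"),
                 key=lambda p: p[1])
    if len(pts) < min_count:
        return None
    gaps = [b[1] - a[1] for a, b in zip(pts, pts[1:])]
    return pts if max(gaps) - min(gaps) <= spacing_tol else None

def lane_info(lane_points, boundary_group, inward_offset=1.0):
    """S45-S47: prefer detected lane points; otherwise shift the boundary group a fixed
    lateral distance toward the host vehicle and output that as the lane information."""
    if lane_points:
        return lane_points
    if boundary_group:
        return [(lat - inward_offset if lat > 0 else lat + inward_offset, lon)
                for lat, lon in boundary_group]
    return []

roadside = [{"kind": "reflective", "motion": "stationary", "pos": (3.5, d)}
            for d in (10.2, 20.1, 30.0, 39.8)]
print(lane_info([], extract_road_boundary_group(roadside)))  # boundary used as fallback
```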
Next, the light distribution control method that controls the headlight high beam and low beam using the recognized object information is described. Light distribution control is a technique that switches between high beam and low beam depending on the presence or absence of objects: high beam when there is no object and low beam when there is. A feature of the first embodiment of the present invention is that, using the reflective/light-emitting object attribute of the recognized objects, reflective objects including pseudo objects are excluded from the targets of light distribution control (pseudo-object removal unit 41).
FIG. 8 shows a flowchart of the light distribution control.
If the headlight light distribution control switches between high beam and low beam too finely, the driver is annoyed; therefore, to improve the reliability of obstacle detection, objects are accumulated and stored over multiple images (S51). The presence of the same object in the multiple images is checked (S52). If the same object is present, its reflective/light-emitting object attribute is used, and if it is a reflective object it is treated as if no object were present (S53). Furthermore, if it is not a reflective object (S53), it is determined whether the same object is a headlight or taillight of another vehicle detected by the light distribution light detection unit 44 (S54).
When the three conditions (S52, S53, S54) are satisfied, it is judged that there is a valid object for which low beam should be used; the light distribution instruction unit 71 generates a low-beam light distribution instruction and sends it to the headlight unit 10 (S55). The light distribution control unit 29 of the headlight unit 10 then switches the headlights to low beam (S56).
When any of the three conditions (S52, S53, S54) is not satisfied, it is judged that there is no valid object for which low beam should be used, and control for high beam is performed. The light distribution instruction unit 71 generates a high-beam light distribution instruction and sends it to the headlight unit 10 (S57). The light distribution control unit 29 of the headlight unit 10 then switches the headlights to high beam (S58).
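The three-condition decision of S52 to S54 reduces to a small predicate; the boolean inputs are assumed to have been prepared by the accumulation and detection steps described above.

```python
def decide_beam(same_object_in_all_frames, is_reflective, is_vehicle_light):
    """S52-S54: low beam only when the same object persists over multiple frames,
    is not a reflective object, and is a headlight or taillight of another vehicle."""
    if same_object_in_all_frames and not is_reflective and is_vehicle_light:
        return "low beam"      # S55/S56: valid object ahead
    return "high beam"         # S57/S58: no valid object

print(decide_beam(True, False, True))   # low beam: oncoming headlight tracked over frames
print(decide_beam(True, True, False))   # high beam: only a reflector, treated as no object
```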
The reflective-object condition (S53) and the object condition (S54) of the light distribution control can be modified and extended. For example, in this embodiment the light-emitting objects are the headlights and taillights of other vehicles, but if pedestrian and bicycle attributes can be assigned, detection of a reflective pedestrian or bicycle can also be added to the conditions.
Next, the vehicle control unit 4 computes the automatic emergency brake control and the lane departure control using the object information and lane information recognized by the camera control unit 3.
FIG. 9 shows a flowchart of the automatic emergency brake control.
The camera control unit 3 sends the reflective/light-emitting object attribute, the relative-position attribute, and the moving-object attribute as the recognized object information (S61). After receiving the recognized object information, the automatic emergency brake control unit 25 predicts the trajectories of the obstacles in the obstacle trajectory calculation unit 60 using the accumulated object information (S62). In the obstacle trajectory calculation unit 60, the trajectory of a light-emitting object, as identified by the reflective/light-emitting object attribute, is predicted using the relative-position information in the object information; a reflective object, which cannot be detected during the lights-off periods and whose relative position is then unknown, has its trajectory predicted by estimating its position from the relative positions at several earlier times using linear or polynomial interpolation (S63). Because reflective objects may be either stationary or moving, the moving-object attribute is not used here. However, the moving-object attribute may be used: an object classified as stationary by the moving-object attribute may skip trajectory prediction, be treated as stationary, and have its position relative to the host vehicle's trajectory obtained directly. For example, when a reflective object is a moving object, the obstacle trajectory calculation unit 60 judges it to be a pedestrian or a bicycle from its speed relative to the host vehicle and predicts its future trajectory from its past trajectory. The risk calculation unit 61 then calculates the collision risk, i.e., whether the trajectories of the light-emitting and reflective objects lie on the path along which the host vehicle is traveling (S64). Based on the collision risk from the risk calculation unit 61 (when the risk of collision with the host vehicle is judged to be high), the control unit 62 issues a warning or an emergency brake control instruction (S65). The brake unit 13 of the actuators then receives the emergency brake control instruction via the CAN 6 and performs brake control of the host vehicle based on that instruction.
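A sketch of S62 to S65 under simplifying assumptions: the past relative positions are fitted linearly per axis (one possible form of the linear interpolation mentioned above) and extrapolated, and the collision risk is approximated by checking whether the predicted position lies close ahead on the host vehicle's path. The fitting method, the risk criterion, and all parameter values are assumptions for illustration, not the patent's formulas.

```python
def predict_position(times, positions, t_future):
    """S62/S63: fit each coordinate linearly over the observed (possibly gappy) samples
    and extrapolate to t_future, bridging the lights-off periods in which a reflective
    object is invisible."""
    n = len(times)
    mean_t = sum(times) / n
    out = []
    for axis in range(2):
        vals = [p[axis] for p in positions]
        mean_v = sum(vals) / n
        denom = sum((t - mean_t) ** 2 for t in times) or 1e-9
        slope = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, vals)) / denom
        out.append(mean_v + slope * (t_future - mean_t))
    return tuple(out)

def collision_risk(pred_pos, lane_half_width=1.5, warn_distance=10.0):
    """S64: high risk when the predicted position lies on the host vehicle's path, close ahead."""
    lat, lon = pred_pos
    return abs(lat) < lane_half_width and 0.0 < lon < warn_distance

# Reflective object seen at 0.00 s, 0.35 s and 0.70 s (lights-off frames skipped).
times = [0.00, 0.35, 0.70]
positions = [(0.6, 22.0), (0.5, 17.5), (0.4, 13.0)]     # (lateral, longitudinal) in metres
pred = predict_position(times, positions, t_future=1.40)
print(pred, collision_risk(pred))   # (0.2, 4.0) True: warn or brake (S65)
```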
FIG. 10 shows a flowchart of the automatic lane departure control.
The camera control unit 3 sends the reflective/light-emitting object attribute, the relative-position attribute, and the moving-object attribute as the recognized object information, together with the recognized lane information (S71). The own-vehicle trajectory calculation unit 52 predicts the host vehicle's trajectory from its steering information (S72). The departure determination unit 53 takes the trajectory predicted from the steering information and the recognized lane information as inputs and calculates the risk of departing from the lane (S73). Based on the lane departure risk from the departure determination unit 53, the control unit 51 issues a warning or a steering control instruction so as to avoid lane departure (to prevent the vehicle from crossing the road boundary) (S74). The steering unit 12 of the actuators then receives the steering control instruction via the CAN 6 and performs steering control of the host vehicle based on that instruction.
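The departure judgment of S73 can be sketched as a check of the predicted lateral offsets against the lane boundary; the representation of the predicted trajectory and the margin value are assumptions made for this example.

```python
def lane_departure_risk(predicted_lateral_offsets, lane_half_width=1.6, margin=0.2):
    """S73: flag a departure when the predicted trajectory (lateral offset from the lane
    centre at a few look-ahead points) comes within a margin of the lane boundary."""
    return any(abs(y) > lane_half_width - margin for y in predicted_lateral_offsets)

# Predicted lateral offsets (m) at increasing look-ahead distances, drifting to one side.
print(lane_departure_risk([0.3, 0.7, 1.1, 1.5]))   # True -> S74: warn or steer back
```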
[Second Embodiment]
If the headlight on/off duty of the host vehicle happened to be synchronized with that of another vehicle at the same duty, the other vehicle's headlights would always appear to be off as seen from the host vehicle. This situation can be avoided by making the headlight on/off cycle a multi-duty scheme that uses multiple duty ratios.
FIG. 11 shows the principle of the second embodiment. There are two cycles: the off-to-on ratio is 1:9 in the first cycle and 1:4 in the second cycle. That is, the headlights are turned on and off periodically with multiple different duties (1:9 in the first cycle, 1:4 in the second cycle). The imaging timing of the camera is also made variable in accordance with the two-cycle on/off duty. By performing phase adjustment, two images are captured while the lights are off and six while they are on in the first cycle, and three images are captured while the lights are off and five while they are on in the second cycle; this pattern then repeats. Two cycles is only an example; using a multi-duty scheme with three, four, or more cycles significantly reduces the possibility of synchronization with another vehicle.
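A small sketch of how a multi-duty pattern could be expanded into per-cycle off/on durations, assuming, for illustration only, that every cycle has the same period (which the figure does not specify); the function name is also an assumption.

```python
def multi_duty_schedule(period_ms, duty_cycles):
    """Expand a multi-duty pattern (e.g. 1:9 then 1:4) into off/on durations per cycle."""
    schedule = []
    for off_ratio, on_ratio in duty_cycles:
        unit = period_ms / (off_ratio + on_ratio)
        schedule.append((off_ratio * unit, on_ratio * unit))
    return schedule            # repeated cyclically by the headlight unit

# Second embodiment: first cycle 1:9, second cycle 1:4, then repeat.
print(multi_duty_schedule(1000, [(1, 9), (1, 4)]))   # [(100.0, 900.0), (200.0, 800.0)]
```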
The system configuration of the second embodiment may be the same as in FIG. 1. The lights-off duty calculation unit 40 becomes multi-duty, and accordingly the duty instruction unit 70 of the light control unit 23 and the lights-off duty setting unit 28 of the headlight unit 10 perform light control that supports multi-duty operation. The camera's imaging timing is likewise made variable to match the multi-duty operation of the headlights. Specifically, the imaging timing calculation unit 30 and the imaging phase adjustment unit 31 of the image processing unit 20 and the shutter setting unit 27 of the camera imaging unit 2 perform variable shutter control corresponding to the multi-duty on/off pattern.
The first and second embodiments of the present invention can also be extended as follows.
The headlight lights-off duty calculated by the recognition unit 21 need not be fixed in the lights-off duty calculation unit 40; it can be set variably. To set the lights-off duty variably, it can, for example, be set based on a random number when the ignition is turned on. Furthermore, to reduce the frequency with which the headlights are turned off, the headlights can be turned off only during specific periods or at specific locations based on the map information of the map unit 11.
[Summary of the First and Second Embodiments]
As described above, the camera control device (camera control unit 3) of these embodiments is a camera control device that controls the headlight unit 10, which illuminates the area ahead of the host vehicle, and the camera imaging unit 2, which captures images ahead of the host vehicle. The camera control device sets an on/off pattern of predetermined frequency and duty in the headlight unit 10 (duty instruction unit 70), controls (the light distribution of) the headlight unit 10 so that it turns on and off according to that on/off pattern (light distribution instruction unit 71), sets in the camera imaging unit 2, based on the predetermined frequency and duty intended for the headlight unit 10, an imaging timing pattern different from the on/off pattern and consisting of a regular first imaging interval and a second imaging interval shorter than the first (imaging timing calculation unit 30), and reduces (shortens) the imaging interval at the transitions between the off and on states of the headlight unit 10 by adjusting the time phase (imaging phase) of the imaging timing pattern of the camera imaging unit 2 (imaging phase adjustment unit 31).
The camera control device adjusts the time phase of the imaging timing pattern of the camera imaging unit 2 based on the numbers of lights-on images and lights-off images among the images captured by the camera imaging unit 2 with the time phase shifted (imaging phase adjustment unit 31).
The camera control device measures the luminance of a specific pixel in each image captured by the camera imaging unit 2 with the time phase shifted, uses that luminance to judge whether the image is a lights-on image or a lights-off image, and adjusts the time phase of the imaging timing pattern of the camera imaging unit 2 by checking the numbers of periodic lights-on and lights-off images at the shutter timing of that time phase (imaging phase adjustment unit 31).
The camera control device compares a first luminance image and a second luminance image captured by the camera imaging unit 2 with the headlight unit 10 off and on to extract reflective objects, compares multiple luminance images to calculate the movement vector of each reflective object in the images, and judges a reflective object to be a stationary object when its movement vector is opposite to the movement vector of the host vehicle, a pseudo object when its movement vector is zero, and a moving object otherwise (luminance image generation unit 32, image comparison unit 34, pseudo-object removal unit 41, moving object determination unit 42).
The camera control device sets the duty of the headlight unit 10 variably (lights-off duty calculation unit 40).
The camera control device sets the duty of the headlight unit 10 based on a random number (lights-off duty calculation unit 40).
The camera control device turns the headlight unit 10 on and off periodically with multiple different duties (lights-off duty calculation unit 40, duty instruction unit 70, light distribution instruction unit 71).
The camera control device causes the headlight unit 10 to be turned off only during specific periods or at specific locations based on map information (lights-off duty calculation unit 40).
In the light distribution control of the headlight unit 10 that switches from high beam to low beam when an obstacle is detected, the camera control device excludes the pseudo objects among the reflective objects from the obstacle targets of the light distribution control (pseudo-object removal unit 41, light distribution instruction unit 71).
The vehicle control device (vehicle control unit 4) of these embodiments is a vehicle control device that controls the host vehicle based on the information output from the camera control device. When the reflective objects judged to be stationary objects are arranged at a constant spacing, the vehicle control device judges them to be a road boundary and issues a warning or controls the steering of the host vehicle so as to prevent it from crossing the road boundary (lane departure control unit 24).
The vehicle control device (vehicle control unit 4) of these embodiments is also a vehicle control device that controls the host vehicle based on the information output from the camera control device. When a reflective object is a moving object, the vehicle control device judges the moving object to be a pedestrian or a bicycle from its speed relative to the host vehicle, predicts its future trajectory from its past trajectory, calculates the risk of collision with the host vehicle, and issues a warning or performs automatic emergency brake control when the risk of collision with the host vehicle is judged to be high (automatic emergency brake control unit 25).
As described above, according to these embodiments, reflective objects and light-emitting objects can be detected by periodically turning the headlights off for short periods and capturing images at short intervals around the off/on transitions. Among the reflective objects, pseudo objects that appear as objects on the road because of headlight reflections and the like are correctly removed, so that incorrect headlight light distribution control or automatic emergency braking triggered by pseudo objects is avoided. In addition, the reflectors of curbs and guardrails are used as road boundaries and serve as a substitute when lane detection is not possible, which improves the performance of the automatic departure avoidance system. In this way, various Advanced Driver Assistance Systems (ADAS) functions can be realized.
Note that the present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments described above have been explained in detail to make the present invention easy to understand, and the invention is not necessarily limited to configurations having all of the described elements.
Each of the above configurations, functions, processing units, processing means, and the like may be partly or entirely realized in hardware, for example by designing them as integrated circuits. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that implements each function. Information such as the programs, tables, and files that implement each function can be stored in a memory, a storage device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
Only the control lines and information lines considered necessary for the explanation are shown; not all control lines and information lines of a product are necessarily shown. In practice, almost all components may be regarded as interconnected.
1 ... Vehicle
2 ... Camera imaging unit
3 ... Camera control unit (camera control device)
4 ... Vehicle control unit (vehicle control device)
5 ... Illuminance meter
6 ... CAN
10 ... Headlight unit
11 ... Map unit
12 ... Steering unit
13 ... Brake unit
20 ... Image processing unit
21 ... Recognition unit
22, 26 ... CAN-IF units
23 ... Light control unit
24 ... Lane departure control unit
25 ... Automatic emergency brake control unit
27 ... Shutter setting unit
28 ... Light-off duty setting unit
29 ... Light distribution control unit
30 ... Imaging timing calculation unit
31 ... Imaging phase adjustment unit
32 ... Luminance image generation unit
33 ... Parallax image generation unit
34 ... Image comparison unit
35 ... Edge image generation unit
40 ... Light-off duty calculation unit
41 ... Pseudo-object removal unit
42 ... Moving object determination unit
43 ... Lane/road boundary detection unit
44 ... Light distribution light detection unit
51, 62 ... Control units
52 ... Own-vehicle trajectory calculation unit
53 ... Departure determination unit
60 ... Obstacle trajectory calculation unit
61 ... Risk calculation unit
70 ... Duty instruction unit
71 ... Light distribution instruction unit
100 ... Headlight of an oncoming vehicle at time t1
101 ... Reflection from the reflective object of the curb at time t1
102 ... Reflection from the reflective object of the guardrail at time t1
103 ... Reflection of the own vehicle's headlights at time t1 (pseudo object)
104 ... Headlight of an oncoming vehicle at time t2
111 ... Reflective object of a bicycle in the roadway at time t1
112 ... Reflective object of a stationary object outside the roadway at time t1
113 ... Pseudo object caused by headlight reflection in the roadway at time t1
114 ... Reflective object of a bicycle in the roadway at time t2
115 ... Reflective object of a stationary object outside the roadway at time t2
116 ... Pseudo object caused by headlight reflection in the roadway at time t2
117 ... Movement vector of the bicycle's reflective object in the roadway from time t1 to t2
118 ... Movement vector of the stationary object's reflective object outside the roadway from time t1 to t2
119 ... Movement vector of the pseudo object caused by headlight reflection in the roadway from time t1 to t2
Claims (11)
- A camera control device that controls a headlight unit that illuminates the area ahead of a host vehicle and a camera imaging unit that captures images of the area ahead of the host vehicle, wherein the camera control device:
sets a turn-on/turn-off pattern of a predetermined frequency and duty for the headlight unit and controls the headlight unit so that it turns on and off according to that pattern; and
sets, for the camera imaging unit, an imaging timing pattern that differs from the turn-on/turn-off pattern and consists of a periodic first imaging time interval and a second imaging time interval shorter than the first imaging time interval, based on the predetermined frequency and duty for the headlight unit, and adjusts the time phase of the imaging timing pattern of the camera imaging unit so as to reduce the imaging time interval at the switch between the headlight unit being off and being on.
- The camera control device according to claim 1, wherein the camera control device adjusts the time phase of the imaging timing pattern of the camera imaging unit based on the number of images captured with the headlights on and the number captured with the headlights off among the images captured by the camera imaging unit while the time phase is shifted.
- The camera control device according to claim 1, wherein the camera control device measures the luminance of specific pixels in each image captured by the camera imaging unit while the time phase is shifted, uses that luminance to judge whether each image was captured with the headlights on or off, and adjusts the time phase of the imaging timing pattern of the camera imaging unit by checking the number of on-state images and off-state images obtained periodically at the shutter timing of that time phase.
- The camera control device according to claim 1, wherein the camera control device compares a first luminance image and a second luminance image captured by the camera imaging unit with the headlight unit off and on, respectively, to extract a reflective object, compares a plurality of luminance images to calculate the movement vector of the reflective object in the image, and determines the reflective object to be a stationary object when its movement vector is opposite to the movement vector of the host vehicle, a pseudo object when its movement vector is zero, and a moving object otherwise.
- The camera control device according to claim 1, wherein the camera control device variably sets the duty of the headlight unit.
- The camera control device according to claim 5, wherein the camera control device sets the duty of the headlight unit based on a random number.
- The camera control device according to claim 1, wherein the camera control device periodically turns the headlight unit on and off a plurality of times with different duties.
- The camera control device according to claim 1, wherein the camera control device turns off the headlight unit only during a specific period or at a specific location, based on map information.
- The camera control device according to claim 4, wherein, in light distribution control of the headlight unit that switches from high beam to low beam when an obstacle is detected, the camera control device excludes reflective objects determined to be pseudo objects from the obstacle targets of the light distribution control.
- A vehicle control device that controls the host vehicle based on information output from the camera control device according to claim 4, wherein, when reflective objects determined to be stationary objects are arranged at a constant distance, the vehicle control device judges them to form a road boundary and issues a warning or controls the steering of the host vehicle so as to prevent it from crossing the road boundary.
- A vehicle control device that controls the host vehicle based on information output from the camera control device according to claim 4, wherein, when the reflective object is the moving object, the vehicle control device judges from its speed relative to the host vehicle whether the moving object is a pedestrian or a bicycle, predicts its future trajectory from its past trajectory, calculates the risk of a collision with the host vehicle, and issues a warning or performs automatic emergency brake control when the risk of a collision with the host vehicle is judged to be high.
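Purely as an illustration of the imaging-timing scheme recited in claims 1 to 3, and not as part of the claims, the phase adjustment could be sketched as below. The period, duty, intervals, luminance threshold, frame representation, and the capture hook are assumptions made for the sketch.

```python
# Illustrative sketch of the imaging-timing phase adjustment (claims 1-3).
# All concrete numbers and names here are assumptions, not values from the claims.

LIGHT_PERIOD = 0.100      # headlight on/off pattern period [s]
LIGHT_OFF_DUTY = 0.05     # fraction of each period the headlights are off
FIRST_INTERVAL = 0.033    # regular (first) imaging interval [s]
SECOND_INTERVAL = 0.002   # short (second) interval used around the off/on boundary [s]
ON_THRESHOLD = 100        # assumed pixel luminance separating lit and unlit frames

def imaging_times(phase, n_periods=10):
    """Build the capture schedule fed to the camera: each regular frame is paired
    with a companion frame offset by the short second interval, the whole
    schedule shifted by a trial time phase."""
    times, t = [], phase
    while t < n_periods * LIGHT_PERIOD:
        times.append(t)
        times.append(t + SECOND_INTERVAL)
        t += FIRST_INTERVAL
    return times

def count_on_off(frames, threshold=ON_THRESHOLD):
    """Count lit and unlit frames from the luminance of a monitored pixel."""
    on = sum(1 for f in frames if f["pixel_luminance"] > threshold)
    return on, len(frames) - on

def choose_phase(capture_with_phase, candidate_phases):
    """Pick the phase whose schedule yields the expected mix of lit and unlit
    frames, i.e. the short-interval frame pairs straddle the off/on edge.
    `capture_with_phase` is an assumed hook returning the captured frames."""
    best_phase, best_error = None, float("inf")
    for phase in candidate_phases:
        on, off = count_on_off(capture_with_phase(phase))
        error = abs(off / max(on + off, 1) - LIGHT_OFF_DUTY)
        if error < best_error:
            best_phase, best_error = phase, error
    return best_phase
```

The point of the scheme is that, once the phase is locked, the two frames separated by the short second interval fall on either side of the off-to-on switch, so an off-state image and an on-state image are obtained almost simultaneously.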
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022097965 | 2022-06-17 | | |
JP2022-097965 | 2022-06-17 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023243162A1 (en) | 2023-12-21 |
Family
ID=89192563
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/008918 WO2023243162A1 (en) | Camera control device and vehicle control device | 2022-06-17 | 2023-03-08 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023243162A1 (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007076429A (en) * | 2005-09-13 | 2007-03-29 | Koito Mfg Co Ltd | Head lamp system |
JP2010235045A (en) * | 2009-03-31 | 2010-10-21 | Toyota Central R&D Labs Inc | Lighting control device and program |
JP2011205619A (en) * | 2010-01-29 | 2011-10-13 | Shinsedai Kk | Image processing apparatus, image processing method, computer program, and electronic device |
JP2011209961A (en) * | 2010-03-29 | 2011-10-20 | Kyocera Corp | Onboard imaging apparatus |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10552688B2 (en) | Method and device for detecting objects in the surroundings of a vehicle | |
JP4253271B2 (en) | Image processing system and vehicle control system | |
CN105270254B (en) | Method and device for controlling the light emission of at least one headlight of a vehicle | |
JP4538468B2 (en) | Image processing apparatus, image processing method, and image processing system | |
KR20170102820A (en) | Anti-dazzle headlamp | |
CN113544011B (en) | Method and device for controlling motor vehicle headlights | |
CN103249597A (en) | Vehicle light distribution control device and method | |
US9586515B2 (en) | Method and device for recognizing an illuminated roadway ahead of a vehicle | |
EP2525302A1 (en) | Image processing system | |
RU2760074C1 (en) | Headlamp control method and headlamp control device | |
JP2023516994A (en) | Automotive ambient monitoring system | |
CN114514143A (en) | Light distribution control device, vehicle position detection device, vehicle lamp system, light distribution control method, and vehicle position detection method | |
JP5251680B2 (en) | Lighting control apparatus and program | |
JP7381388B2 (en) | Signal lamp status identification device, signal lamp status identification method, computer program for signal lamp status identification, and control device | |
WO2023243162A1 (en) | Camera control device and vehicle control device | |
CN110774976B (en) | Device and method for controlling a vehicle headlight | |
JP6335065B2 (en) | Outside environment recognition device | |
CN112565618B (en) | Exposure control device | |
JP6539191B2 (en) | Outside environment recognition device | |
JP6549974B2 (en) | Outside environment recognition device | |
JP6853890B2 (en) | Object detection system | |
JP7084223B2 (en) | Image processing equipment and vehicle lighting equipment | |
CN110161523B (en) | Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle | |
WO2019156087A1 (en) | Image processing device and vehicle light fixture | |
JP6278217B1 (en) | Vehicle headlight control device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23823466, Country of ref document: EP, Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2024528298, Country of ref document: JP, Kind code of ref document: A |