WO2020121758A1 - Imaging unit control device - Google Patents

Imaging unit control device

Info

Publication number
WO2020121758A1
WO2020121758A1 (PCT/JP2019/045356)
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
image
unit
control device
moving
Prior art date
Application number
PCT/JP2019/045356
Other languages
French (fr)
Japanese (ja)
Inventor
一貴 湯浅
門司 竜彦
野中 進一
Original Assignee
Hitachi Automotive Systems, Ltd. (日立オートモティブシステムズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems, Ltd. (日立オートモティブシステムズ株式会社)
Priority to JP2020559891A (granted as JP7122394B2)
Priority to CN201980079112.1A (granted as CN113170057B)
Published as WO2020121758A1

Classifications

    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00: Special procedures for taking photographs; Apparatus therefor
    • G03B7/00: Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08: Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091: Digital circuits
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/70: Circuitry for compensating brightness variation in the scene

Definitions

  • The present disclosure relates to an imaging unit control device.
  • The moving body imaging device described in Patent Document 1 includes a plurality of imaging units, a system control unit, and a display/recording unit, and is mounted on a moving body.
  • The imaging unit includes at least an imaging device, a signal processing unit that generates a video signal from the output signal of the imaging device, and an imaging control unit that controls the imaging device and the signal processing unit.
  • The system control unit controls the plurality of imaging units.
  • The display/recording unit displays or records the video signals output from the plurality of imaging units.
  • The system control unit is characterized in that it controls the plurality of imaging units with a time difference from each other, based on position information of the moving body and map information including the range in which the moving body is moving (see that document, claim 1, etc.).
  • However, an extremely dark area or an extremely bright area may exist in part of the image ahead of the moving body captured by the imaging unit, and the exposure of the imaging unit may then be corrected with reference to that area.
  • As a result, the image captured by the imaging unit may suffer whiteout due to overexposure or blackout due to underexposure.
  • The present disclosure therefore provides an imaging unit control device that can adjust the exposure of the imaging unit more appropriately than before.
  • One aspect of the present disclosure is an imaging unit control device that controls an imaging unit mounted on a moving body, the device comprising: a moving direction calculation unit that calculates a moving direction of the moving body; an image region setting unit that, according to the moving direction, sets an image region serving as a reference for exposure correction in the image captured by the imaging unit; and an exposure correction unit that performs exposure correction of the imaging unit based on image information of the image region.
  • According to the present disclosure, an imaging unit control device capable of adjusting the exposure of the imaging unit more appropriately than the related art can be provided.
  • FIG. 1 is a block diagram of an imaging unit control device according to an embodiment of the present disclosure.
  • FIG. 2 is a flowchart showing an example of a method of controlling the imaging unit by the imaging unit control device of FIG. 1.
  • FIG. 3 is a diagram showing an example of an image area set by the image area setting unit in FIG. 1.
  • FIGS. 4A to 4C are graphs illustrating exposure compensation by the exposure compensation unit in FIG. 1.
  • FIGS. 5 to 7 show modified examples of the image area and of the method of controlling the imaging unit by the imaging unit control device of FIG. 1.
  • FIGS. 8A and 8B are diagrams showing an example of an image region serving as a reference for exposure correction, and of the resulting whiteout, in an imaging apparatus of a comparative form.
  • FIG. 1 is a block diagram of an image capturing unit controller 100 according to an embodiment of the present disclosure. Although details will be described later, the imaging unit control device 100 of the present embodiment is mainly characterized by the following configuration.
  • The imaging unit control device 100 is a device that controls the imaging unit 11 mounted on the moving body 10.
  • The imaging unit control device 100 includes a moving direction calculation unit 110 that calculates the moving direction of the moving body 10, an image region setting unit 120 that, according to that moving direction, sets an image region serving as a reference for exposure correction in the image captured by the imaging unit 11, and an exposure correction unit 130 that performs exposure correction of the imaging unit 11 based on image information of the image region.
  • the moving body 10 is, for example, a vehicle such as an automobile, and includes an imaging unit 11, a yaw rate sensor 12, a steering angle sensor 13, a wheel speed sensor 14, and a direction indicator 15.
  • the moving body 10 is equipped with, for example, a power generation device, a transmission device, an electronic control unit (ECU), a traveling device, instruments, a headlight, a horn, and the like.
  • the power generation device includes, for example, at least one of an engine and an electric motor.
  • the traveling device includes, for example, a frame, a suspension, a steering system, a braking system, wheels, tires, and the like.
  • the image pickup unit 11 is, for example, a stereo camera device including a right camera and a left camera or a monocular camera device including a monocular camera.
  • The imaging unit 11 includes, for example, a lens barrel, a lens, an iris, a shutter, an image sensor such as a CMOS (Complementary Metal-Oxide-Semiconductor) sensor or a CCD (Charge-Coupled Device) sensor, and an image processing device that processes the image data.
  • the imaging unit control device 100 may be a stereo camera device or a monocular camera device including the imaging unit 11.
  • A stereo camera device includes, for example, a right camera and a left camera whose optical axes are parallel to each other and which are separated by a distance (the baseline length). There is a difference, called parallax, between the coordinates of an object in the image captured by the right camera and the coordinates of the same object in the image captured by the left camera.
  • The stereo camera device can apply template matching to the right and left camera images to obtain the parallax, and can determine the position of an object in three-dimensional space from the obtained parallax by triangulation.
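The triangulation step mentioned above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: for a rectified stereo pair, depth Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the parallax (disparity) in pixels. The parameter values in the example are assumptions.

```python
# Illustrative sketch (not from the patent): recovering depth from stereo
# parallax by triangulation, Z = f * B / d for a rectified stereo pair.
# `focal_px` and `baseline_m` are assumed example values.

def depth_from_parallax(parallax_px: float, focal_px: float, baseline_m: float) -> float:
    """Return the depth in meters of a point seen with the given parallax."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a finite depth")
    return focal_px * baseline_m / parallax_px

# Example: f = 1000 px, baseline = 0.35 m, parallax = 70 px -> Z = 5.0 m
print(depth_from_parallax(70.0, 1000.0, 0.35))  # -> 5.0
```

Note the inverse relationship: distant objects produce small parallax, so depth resolution degrades with distance.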
  • the imaging unit 11 outputs the captured image and the parallax information to the imaging unit control device 100.
  • the yaw rate sensor 12 is, for example, a gyro sensor for an automobile, and measures the yaw rate of the moving body 10 and outputs it to the image pickup unit control device 100.
  • the steering angle sensor 13 is attached to, for example, the steering shaft of the moving body 10 that is an automobile, measures the steering angle of the front wheels of the moving body 10, and outputs it to the imaging unit control device 100.
  • the wheel speed sensor 14 is attached to, for example, a hub or a knuckle of the moving body 10 which is an automobile, and outputs a signal corresponding to the rotation speed of each wheel of the moving body 10 to the imaging unit control device 100.
  • the direction indicator 15 is operated, for example, by an occupant driving the moving body 10 in accordance with the planned moving direction of the moving body 10, and outputs the right or left planned moving direction to the imaging unit control device 100.
  • Each unit of the imaging unit control device 100 is configured by, for example, a microcomputer including a central processing unit (CPU), a large-scale integrated circuit (LSI), a storage device, and a program. As described above, when the imaging unit 11 is a stereo camera device or a monocular camera device, the imaging unit control device 100 may be a part of an ECU mounted on the moving body 10.
  • the moving direction calculation unit 110 calculates the moving direction of the moving body 10.
  • the moving direction is, for example, three directions of a straight traveling direction, a rightward direction, and a leftward direction.
  • The moving direction calculation unit 110 is configured, for example, to predict the moving direction of the moving body 10 based on the position information of the moving body 10 input from the position information acquisition unit 140 and the moving route of the moving body 10 input from the moving route calculation unit 150. The moving direction calculation unit 110 is also configured to calculate the actual moving direction of the moving body 10 based on the behavior information of the moving body 10 acquired by the behavior information acquisition unit 160.
  • the image area setting unit 120 sets an image area IR, which is a reference for exposure correction, in a part of the image IM captured by the image capturing unit 11 according to the moving direction of the moving body 10 calculated by the moving direction calculating unit 110.
  • The image region setting unit 120 sets the image region IR, for example, in one of the central region CR including the vanishing point of the image IM, the right region RR to the right of the central region CR, or the left region LR to the left of the central region CR, according to whether the moving direction is the straight direction S, the right direction R, or the left direction L.
  • the image area setting unit 120 may set the image area IR based on the external world information recognized by the external world information recognition unit 170, as described later.
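The region selection just described can be sketched as follows. This is a hypothetical sketch: the patent does not specify region sizes, so the split of the image into thirds and the vertical band around the horizon are assumptions for illustration.

```python
# Hypothetical sketch of the image-region selection: depending on the moving
# direction (straight "S", right "R", left "L"), the exposure-reference
# region IR is the central region CR, the right region RR, or the left
# region LR. Region widths are assumed, not specified by the patent.

def select_image_region(direction: str, img_w: int, img_h: int):
    """Return (x, y, w, h) of the exposure-reference region IR in pixels."""
    region_w = img_w // 3            # assumed: split the image into thirds
    y, h = img_h // 4, img_h // 2    # assumed vertical band around the horizon
    if direction == "L":
        x = 0                        # left region LR
    elif direction == "R":
        x = 2 * region_w             # right region RR
    else:                            # straight: central region CR
        x = region_w
    return (x, y, region_w, h)

print(select_image_region("L", 1920, 1080))  # -> (0, 270, 640, 540)
```

In practice the region could also be shifted or resized based on external world information, as the modification described later suggests.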
  • the exposure correction unit 130 performs the exposure correction of the image pickup unit 11 based on the image information of the image area IR set by the image area setting unit 120. More specifically, the exposure correction unit 130 determines, for example, red, green, and blue (RGB) color information and brightness of each pixel as the image information of the image region IR. Further, the exposure correction unit 130 controls the iris, the gain, and the shutter speed of the image pickup unit 11 to perform the exposure correction of the image pickup unit 11, for example. In addition, the exposure correction unit 130 may perform the exposure correction of the imaging unit 11 by electronically correcting the image captured by the imaging unit 11, for example.
  • The position information acquisition unit 140 acquires the position information of the moving body 10. More specifically, the position information acquisition unit 140 acquires the position information of the moving body 10 using, for example, the Global Positioning System (GPS) or a Global Navigation Satellite System (GNSS). Alternatively, the position information acquisition unit 140 may acquire the position information of the moving body 10 by calculating the moving direction and the movement amount of the moving body 10 based on, for example, the yaw rate, the steering angle, and the rotation speed of each wheel acquired by the behavior information acquisition unit 160.
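The sensor-based alternative above amounts to dead reckoning. The following is an illustrative sketch under assumed conventions (planar motion, heading in radians), not the patent's formula: speed comes from the wheel rotation speeds and heading change from the yaw rate.

```python
# Illustrative dead-reckoning sketch (assumed, not the patent's formula):
# updating the position of the moving body from vehicle speed and yaw rate
# when no satellite fix is available.
import math

def dead_reckon(x: float, y: float, heading_rad: float,
                speed_mps: float, yaw_rate_rps: float, dt: float):
    """Advance (x, y, heading) by one time step dt (planar motion assumed)."""
    heading_rad += yaw_rate_rps * dt
    x += speed_mps * math.cos(heading_rad) * dt
    y += speed_mps * math.sin(heading_rad) * dt
    return x, y, heading_rad

# Example: 10 m/s straight ahead (heading 0, no yaw) for 0.1 s -> x advances 1 m
print(dead_reckon(0.0, 0.0, 0.0, 10.0, 0.0, 0.1))  # -> (1.0, 0.0, 0.0)
```

Errors accumulate over time with this method, which is why it is typically used only to bridge gaps in satellite positioning.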
  • The moving route calculation unit 150 calculates the moving route of the moving body 10 from the position information of the moving body 10 acquired by the position information acquisition unit 140. More specifically, the moving route calculation unit 150 is provided so that it can access road map information, and calculates the moving route from the current position of the moving body 10 to the destination based on the road map information, the position information of the moving body 10, and the destination position information input by an occupant of the moving body 10.
  • The behavior information acquisition unit 160 acquires the yaw rate of the moving body 10 as behavior information from the yaw rate sensor 12 mounted on the moving body 10, and outputs the acquired yaw rate to the moving direction calculation unit 110. It likewise acquires the steering angle of the moving body 10 as behavior information from the steering angle sensor 13 and outputs it to the moving direction calculation unit 110.
  • The behavior information acquisition unit 160 also acquires, for example, the rotation speed of each wheel from the wheel speed sensor 14 as behavior information and outputs it to the moving direction calculation unit 110. Further, it acquires the planned moving direction of the moving body 10 as behavior information from the direction indicator 15 and outputs the acquired planned moving direction to the moving direction calculation unit 110.
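The four behavior signals above can be combined into a direction estimate as sketched below. This is an assumed decision rule, not the patent's algorithm: the thresholds, the sign convention (positive yaw/steer = rightward), and the priority given to the turn signal are all hypothetical example choices.

```python
# Illustrative sketch (assumptions, not the patent's exact logic): deriving
# the moving direction from behavior information. Thresholds and sign
# conventions are hypothetical example values.

YAW_RATE_THRESH = 5.0      # deg/s, assumed
STEER_THRESH = 30.0        # deg, assumed
WHEEL_DIFF_THRESH = 0.3    # rev/s difference between left/right wheels, assumed

def moving_direction(yaw_rate, steering_angle,
                     wheel_speed_left, wheel_speed_right, indicator=None):
    """Return 'L', 'R', or 'S' (straight). Positive yaw/steer = rightward (assumed)."""
    if indicator in ("L", "R"):          # turn signal states the planned direction
        return indicator
    if yaw_rate > YAW_RATE_THRESH or steering_angle > STEER_THRESH:
        return "R"
    if yaw_rate < -YAW_RATE_THRESH or steering_angle < -STEER_THRESH:
        return "L"
    # in a turn the outer wheels rotate faster than the inner wheels
    diff = wheel_speed_left - wheel_speed_right
    if diff > WHEEL_DIFF_THRESH:
        return "R"                       # left wheels faster -> turning right
    if diff < -WHEEL_DIFF_THRESH:
        return "L"
    return "S"

print(moving_direction(-8.0, -2.0, 10.0, 10.0, None))  # -> L
```

The turn signal is checked first here because, unlike the other signals, it predicts the turn before the vehicle dynamics reflect it.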
  • the outside world information recognition unit 170 recognizes outside world information including obstacle information around the moving body 10 and road information from the image captured by the image capturing unit 11. More specifically, the outside world information recognition unit 170 performs an object detection process using the image data captured by the imaging unit 11 and recognizes outside world information including obstacle information and road information.
  • the obstacle information includes, for example, the position, size, and shape of vehicles, pedestrians, buildings, guardrails, median strips, curbs, and utility poles.
  • the road information includes, for example, road shapes, white lines, road markings, and road signs.
  • The imaging device of the comparative form differs from the imaging unit control device 100 of the present embodiment in that, for example, it does not include the image region setting unit 120 that sets the image region IR according to the moving direction of the moving body 10.
  • FIG. 8A is a diagram showing an example of an image region IR serving as a reference for exposure correction of the image capturing unit 11 in the image capturing apparatus of the comparative mode.
  • FIG. 8B is a diagram showing an example of whiteout WO generated in the image IM of the image pickup unit 11 in the image pickup apparatus of the comparative form.
  • The imaging device of the comparative form does not have the image region setting unit 120 that sets the image region IR according to the moving direction of the moving body 10. Therefore, in the imaging apparatus of the comparative form, the image region IR serving as the reference for the exposure correction of the imaging unit 11 is fixed at the center of the image IM, including its vanishing point, regardless of whether the moving body 10 is moving in the straight direction S, the right direction R, or the left direction L.
  • the image IM of the imaging unit 11 includes an extremely dark region DR such as a recessed portion of the building B where sunlight is hard to reach.
  • the extremely dark region DR may be included in the image region IR that serves as a reference for the exposure correction of the imaging unit 11. Then, in the image pickup apparatus of the comparative form, the exposure of the image pickup unit 11 is corrected with reference to the image region IR including the extremely dark region DR, so that the exposure of the image pickup unit 11 increases.
  • the image area IR at the center of the image IM may include an extremely bright area. Then, in the image pickup apparatus of the comparative mode, the exposure of the image pickup unit 11 is corrected with the image region IR including an extremely bright region as a reference, so that the exposure of the image pickup unit 11 is reduced.
  • As a result, the extremely bright region at the center of the image IM can be easily recognized.
  • In the image IM of the imaging apparatus of the comparative form, problems such as blackout due to underexposure, contrary to the whiteout example shown in FIG. 8B, may then occur in the regions to the left and right of the image region IR shown in FIG. 8A that serves as the reference for the exposure correction of the imaging unit 11.
  • In an advanced driver assistance system (ADAS) or in automated driving, a technique of recognizing information on obstacles and roads from the image IM of the imaging unit 11 is used.
  • Defects such as whiteout or blackout can occur in the image IM of the imaging unit 11 at the intersection ahead of the moving body 10 when the moving body 10 moves in the right direction R or the left direction L to make a right or left turn. It is therefore required to suppress these problems in the image IM of the imaging unit 11.
  • FIG. 2 is a flowchart showing an example of a control method of the image pickup unit 11 by the image pickup unit control apparatus 100 of the present embodiment.
  • FIG. 3 is a diagram showing an example of the image region IR set in the image IM of the image pickup unit 11 by the image region setting unit 120 of the image pickup unit control apparatus 100 of the present embodiment.
  • In step S11, the imaging unit control device 100 calculates the movement route of the moving body 10. More specifically, in the imaging unit control device 100, the position information acquisition unit 140 acquires the position information of the moving body 10, and the moving route calculation unit 150 calculates the moving route of the moving body 10 from the position information.
  • In step S12, the imaging unit control device 100 calculates the moving direction of the moving body 10. More specifically, the imaging unit control device 100 calculates the moving direction of the moving body 10 based on the position information and the moving route obtained in step S11. Thereby, as shown in FIG. 3, the imaging unit control device 100 predicts, for example, that the moving direction of the moving body 10 at the intersection ahead will be the left direction L among the straight direction S, the right direction R, and the left direction L.
  • In step S13, the imaging unit control device 100 acquires the behavior information of the moving body 10. More specifically, the behavior information acquisition unit 160 acquires, as the behavior information, the yaw rate of the moving body 10 from the yaw rate sensor 12, the steering angle from the steering angle sensor 13, the rotation speeds of the plurality of wheels from the wheel speed sensor 14, and the planned moving direction from the direction indicator 15, all mounted on the moving body 10.
  • In step S14, the imaging unit control device 100 determines whether the left direction L, which is the moving direction of the moving body 10 calculated in step S12, matches the behavior information of the moving body 10 acquired in step S13. More specifically, the moving direction calculation unit 110 calculates the actual moving direction of the moving body 10 at the intersection shown in the image IM of the imaging unit 11, based on at least one item of behavior information among the yaw rate of the moving body 10, the steering angle, the difference in rotation speed between the left and right wheels, and the planned moving direction.
  • The moving direction calculation unit 110 then determines whether the left direction L predicted in step S12 and the actual moving direction calculated from the behavior information of the moving body 10 coincide.
  • When the moving direction of the moving body 10 and the behavior information match in step S14 (YES), the process proceeds to step S15, and the moving direction calculation unit 110 outputs the left direction L, the moving direction of the moving body 10 calculated in step S12, to the image area setting unit 120.
  • In step S16, the imaging unit control device 100 sets the image region IR serving as the reference for exposure correction in the image IM captured by the imaging unit 11, according to the moving direction of the moving body 10. More specifically, the image region setting unit 120 sets the image region IR in the left region LR, which corresponds to the left direction L that is the moving direction of the moving body 10, among the central region CR, the right region RR, and the left region LR of the image IM.
  • In step S17, the imaging unit control device 100 performs exposure correction of the imaging unit 11 based on the image information of the image area IR set according to the moving direction of the moving body 10. More specifically, the exposure correction unit 130 performs the exposure correction of the imaging unit 11 based on the image information of the image region IR set in the left region LR of the image IM according to the left direction L, the moving direction of the moving body 10. When the imaging unit 11 includes two cameras, the exposure correction is applied to both cameras simultaneously.
  • FIGS. 4A to 4C are graphs for explaining exposure correction by the exposure correction unit 130 of the imaging unit control device 100.
  • the exposure correction unit 130 performs the exposure correction of the imaging unit 11 based on the image information of the image area IR. More specifically, the exposure correction unit 130 uses the brightness information as the image information of the image region IR, for example, and determines the exposure correction value of the imaging unit 11 according to the histogram created based on the brightness information.
  • When the brightness of the image region IR is excessively high, the exposure correction unit 130 determines a negative exposure correction value so as to reduce the exposure.
  • When the brightness of the image region IR is appropriate, the exposure correction unit 130 sets the exposure correction value to zero so as to maintain the exposure.
  • When the brightness of the image region IR is excessively low, the exposure correction unit 130 determines a positive exposure correction value so as to increase the exposure.
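The three cases above can be sketched as a simple rule on the luminance statistics of the region IR. This is an assumed minimal version: the patent uses a histogram, while this sketch reduces it to a mean, and the thresholds and step size are hypothetical example values.

```python
# Illustrative sketch (assumed thresholds, not from the patent): deriving an
# exposure correction value from the brightness of the image region IR,
# corresponding to the three cases described above.

def exposure_correction(luminances, bright_thresh=180, dark_thresh=60):
    """Return a correction in EV-like steps: negative darkens, positive brightens."""
    mean = sum(luminances) / len(luminances)   # luminance values in 0..255
    if mean > bright_thresh:
        return -1.0   # region too bright: reduce exposure
    if mean < dark_thresh:
        return +1.0   # region too dark: increase exposure
    return 0.0        # exposure is appropriate: keep it

print(exposure_correction([200, 210, 230]))  # -> -1.0
print(exposure_correction([100, 120, 140]))  # -> 0.0
```

A real implementation would likely examine the full histogram (e.g. the fraction of clipped pixels) rather than only the mean, and would apply the correction via iris, gain, and shutter speed as the text describes.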
  • When the moving direction of the moving body 10 and the behavior information do not match in step S14 (NO), the process proceeds to step S18, and the moving direction calculation unit 110 outputs to the image area setting unit 120 the straight direction S or the right direction R, that is, the actual moving direction based on the behavior information of the moving body 10 acquired in step S13. This is the case, for example, when the occupant driving the moving body 10 selects a route different from the moving route calculated by the moving route calculation unit 150.
  • The imaging unit control device 100 then causes the image region setting unit 120 to set the image region IR in the central region CR or the right region RR, corresponding to the straight direction S or the right direction R that is the moving direction of the moving body 10, among the central region CR, the right region RR, and the left region LR of the image IM. In this way, by comparing the moving direction of the moving body 10 with the behavior information in step S14, it becomes possible to respond to the case where the occupant driving the moving body 10 changes the moving route.
  • In step S17, the imaging unit control device 100 causes the exposure correction unit 130 to perform the exposure correction of the imaging unit 11 based on the image information of the image region IR set in the central region CR or the right region RR of the image IM according to the moving direction of the moving body 10.
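The branching logic of steps S14, S15, and S18 described above can be condensed into a single decision, sketched below. The function is a hypothetical stand-in for illustration; in the device, the predicted direction comes from the route (step S12) and the actual direction from the behavior information (step S13).

```python
# Sketch of the decision in steps S14/S15/S18: choose which direction places
# the exposure-reference region IR. A hypothetical stand-in, not the
# patent's implementation.

def control_step(predicted_direction: str, actual_direction: str) -> str:
    """If the route-based prediction (S12) matches the behavior-based
    direction (S13), use it; otherwise trust the behavior information,
    e.g. when the driver leaves the planned route."""
    if predicted_direction == actual_direction:   # S14: YES
        return predicted_direction                # S15
    return actual_direction                       # S18

# Driver follows the planned left turn -> region IR goes to the left region LR
print(control_step("L", "L"))  # -> L
# Driver goes straight instead -> region IR follows the actual behavior
print(control_step("L", "S"))  # -> S
```

Preferring the behavior-derived direction on a mismatch is what lets the device cope with a driver deviating from the calculated route, as the text notes.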
  • As described above, the imaging unit control device 100 of the present embodiment is a device that controls the imaging unit 11 mounted on the moving body 10.
  • The imaging unit control device 100 includes the moving direction calculation unit 110 that calculates the moving direction of the moving body 10, the image area setting unit 120 that, according to that moving direction, sets the image region IR serving as the reference for exposure correction in the image captured by the imaging unit 11, and the exposure correction unit 130 that performs the exposure correction of the imaging unit 11 based on the image information of the image region IR.
  • With this configuration, the exposure correction can be performed with reference to the image region IR set in the left region LR of the image IM according to the left direction L, the moving direction of the moving body 10. Therefore, even if, as shown in FIG. 8A, an extremely dark region DR or an extremely bright region exists in the central portion of the image IM in the straight direction S, which differs from the moving direction (the left direction L), the exposure correction can be performed on the basis of the image region IR set in the left region LR, which does not include that region.
  • Consequently, an imaging unit control device 100 that can adjust the exposure of the imaging unit 11 more appropriately than conventional devices can be provided.
  • The imaging unit control device 100 of the present embodiment further includes the position information acquisition unit 140 that acquires the position information of the moving body 10, the movement route calculation unit 150 that calculates the moving route of the moving body 10 from that position information, and the behavior information acquisition unit 160 that acquires the behavior information of the moving body 10.
  • the image capturing unit control apparatus 100 can control the image capturing unit 11 using the position information, the moving route, and the behavior information of the moving body 10.
  • the moving direction calculation unit 110 is configured to calculate the moving direction based on the position information of the moving body 10 and the moving route. With this configuration, the imaging unit control device 100 can predict the moving direction of the moving body 10 at an intersection in front of the moving body 10, for example.
  • the behavior information acquisition unit 160 is configured to acquire the yaw rate of the moving body 10 from the yaw rate sensor 12 mounted on the moving body 10 as the behavior information.
  • the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 based on the yaw rate of the moving body 10.
  • the image capturing unit control apparatus 100 can set the image region IR that serves as a reference for the exposure correction according to the actual moving direction of the moving body 10 based on the yaw rate of the moving body 10.
  • the behavior information acquisition unit 160 is configured to acquire the steering angle of the moving body 10 as the behavior information from the steering angle sensor 13 mounted on the moving body 10.
  • the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 based on the steering angle of the moving body 10. With this configuration, the imaging unit control device 100 can set the image region IR that is the reference of the exposure correction, according to the actual moving direction of the moving body 10 based on the steering angle of the moving body 10.
  • In the imaging unit control device 100 of the present embodiment, the behavior information acquisition unit 160 is configured to acquire the rotation speeds of the plurality of wheels of the moving body 10 from the wheel speed sensor 14 mounted on the moving body 10 as the behavior information. Further, the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 based on the difference in the rotation speeds of the plurality of wheels. With this configuration, the imaging unit control device 100 can set the image region IR serving as the reference for the exposure correction according to the actual moving direction of the moving body 10, based on the difference in the rotation speeds of the wheels of the moving body 10.
  • The behavior information acquisition unit 160 is configured to acquire the planned moving direction of the moving body 10 as the behavior information from the direction indicator 15 mounted on the moving body 10.
  • The moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 based on the planned moving direction of the moving body 10. With this configuration, the imaging unit control device 100 can set the image region IR that serves as the reference for the exposure correction according to the predicted moving direction of the moving body 10, based on the operation of the direction indicator 15 of the moving body 10.
  • By setting the image region IR that serves as the reference for the exposure correction according to the moving direction of the moving body 10, it is possible to provide an imaging unit control device 100 that can adjust the exposure of the imaging unit 11 more appropriately than before.
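Combining the behavior signals described above, the direction decision might look as follows. The thresholds, the sign conventions, and the priority given to the direction indicator are illustrative assumptions, not values specified by this disclosure:

```python
YAW_RATE_THRESH = 2.0     # deg/s; at or below this, treated as straight (assumed)
STEER_ANGLE_THRESH = 5.0  # deg;   at or below this, treated as straight (assumed)

def moving_direction(yaw_rate, steer_angle, turn_signal=None):
    """Classify the moving direction as 'straight', 'right', or 'left'.

    yaw_rate and steer_angle are signed, positive = rightward (an assumed
    convention); turn_signal is 'right'/'left' from the direction
    indicator, or None when it is not operated.
    """
    # The indicator announces the planned direction, so it can switch the
    # exposure-reference region before the vehicle actually starts turning.
    if turn_signal in ('right', 'left'):
        return turn_signal
    if abs(yaw_rate) <= YAW_RATE_THRESH and abs(steer_angle) <= STEER_ANGLE_THRESH:
        return 'straight'
    if abs(yaw_rate) > YAW_RATE_THRESH:
        return 'right' if yaw_rate > 0 else 'left'
    return 'right' if steer_angle > 0 else 'left'
```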
  • The imaging unit control device is not limited to the configuration of the imaging unit control device 100 according to the above-described embodiment.
  • Some modified examples of the imaging unit control device 100 according to the above-described embodiment will be described with reference to FIGS. 5 to 7.
  • FIG. 5 is a diagram showing a modified example of the image region IR set by the image region setting unit 120 of the imaging unit control device 100 according to the above-described embodiment.
  • As described above, the imaging unit control device 100 includes the outside world information recognition unit 170 that recognizes outside world information, including obstacle information around the moving body 10 and road information, from the image captured by the imaging unit 11. Further, in the present modification, the image region setting unit 120 is configured to set the image region IR based on the outside world information. With this configuration, an appropriate image region IR can be set in the image IM even when the mounting position of the camera of the imaging unit 11 is displaced or the diameter of the wheels of the moving body 10 has changed.
  • The outside world information recognition unit 170 obtains the position of the center of gravity G of a moving object, such as a vehicle V or a pedestrian, recognized in the image IM based on the image coordinates of the imaging unit 11, and records the trajectory of the center of gravity G. After recording a predetermined number or more of trajectories, the outside world information recognition unit 170 plots on the image IM the centers of gravity G that move in the lateral direction of the image IM. As a result, the outside world information recognition unit 170 can recognize the region in the image IM through which moving objects such as vehicles and pedestrians pass.
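The centroid-trajectory idea above might be sketched as follows; the grid of horizontal bins, the lateral-motion test, and all numeric parameters are hypothetical choices for illustration, not part of this disclosure:

```python
class PassingRegionLearner:
    """Accumulate centroid tracks and report the image columns they cross."""

    def __init__(self, image_width, n_bins=16, min_tracks=20):
        self.bin_w = image_width / n_bins
        self.counts = [0] * n_bins
        self.min_tracks = min_tracks
        self.n_tracks = 0

    def add_track(self, centroids):
        """centroids: list of (x, y) positions of one object's center of gravity G."""
        xs = [x for x, _ in centroids]
        # Only tracks that actually move laterally across the image count.
        if max(xs) - min(xs) < 2 * self.bin_w:
            return
        self.n_tracks += 1
        for x in xs:
            self.counts[min(int(x // self.bin_w), len(self.counts) - 1)] += 1

    def passing_bins(self):
        """Bins moving objects pass through, once enough tracks are recorded."""
        if self.n_tracks < self.min_tracks:
            return []
        peak = max(self.counts)
        return [b for b, c in enumerate(self.counts) if c >= peak / 2]
```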
  • The image region setting unit 120 sets the image region IR in the region through which moving objects such as vehicles and pedestrians recognized by the outside world information recognition unit 170 pass, in accordance with the moving direction of the moving body 10. More specifically, the image region setting unit 120 sets the image region IR in the central region CR of the image IM when the moving body 10 travels straight, in the right region RR of the image IM when the moving body 10 turns right, and in the left region LR of the image IM when the moving body 10 turns left.
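The direction-to-region mapping just described could look like this; the region width (one third of the image) and the pixel-coordinate convention are assumptions for illustration:

```python
def select_image_region(direction, image_width, image_height):
    """Return (x0, y0, x1, y1) of the exposure-reference image region IR.

    'straight' -> central region CR, 'right' -> right region RR,
    'left' -> left region LR.
    """
    w = image_width // 3  # assumed region width: one third of the image
    if direction == 'straight':
        x0 = (image_width - w) // 2
    elif direction == 'right':
        x0 = image_width - w
    else:  # 'left'
        x0 = 0
    return (x0, 0, x0 + w, image_height)
```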
  • For example, when the moving body 10 is determined to be traveling straight on a straight road and the curvature of the white lines drawn on the road is equal to or less than a threshold value, the image region setting unit 120 may set the image region IR in a central region CR of arbitrary size centered on the vanishing point of the white lines on both sides.
  • The vanishing point of the white lines can be recognized by, for example, the outside world information recognition unit 170.
  • Whether the moving body 10 is traveling straight on a straight road can be determined using, for example, the behavior information acquired by the behavior information acquisition unit 160. For example, if both the steering angle and the yaw rate of the moving body 10 are equal to or less than their threshold values, it can be determined that the moving body 10 is moving straight ahead.
  • FIG. 6 is a flowchart showing a modified example of the method of controlling the imaging unit 11 by the imaging unit control device 100 according to the above-described embodiment.
  • FIG. 7 is a diagram showing the image region IR set by the image region setting unit 120 of the imaging unit control device 100 according to this modification.
  • In this modification, the imaging unit control device 100 does not have to include the position information acquisition unit 140, the moving route calculation unit 150, and the outside world information recognition unit 170.
  • In step S21, the imaging unit control device 100 according to the present modification acquires the behavior information of the moving body 10 by the behavior information acquisition unit 160, as in step S13 described above.
  • In step S22, the imaging unit control device 100 according to the present modification causes the moving direction calculation unit 110 to calculate the moving direction of the moving body 10 based on the behavior information of the moving body 10.
  • When the moving body 10 is traveling straight, the moving direction calculation unit 110 calculates the moving direction of the moving body 10 as the straight traveling direction S. When the moving body 10 is traveling on a left curve or turning left at an intersection, the steering angle and the yaw rate are equal to or greater than their threshold values. In this case, the moving direction calculation unit 110 calculates the moving direction of the moving body 10 as the left direction L shown in FIG. 7.
  • In step S23, the imaging unit control device 100 sets the image region IR in the image IM by the image region setting unit 120, according to the moving direction of the moving body 10.
  • When the moving body 10 is traveling straight, the image region setting unit 120 sets the image region IR in the central region CR shown in FIG. 7.
  • When the moving body 10 is turning left, the image region setting unit 120 sets the image region IR in the left region LR shown in FIG. 7.
  • The image region IR may be set dynamically by the image region setting unit 120, for example, according to the temporal change of the behavior information of the moving body 10.
  • In step S24, the imaging unit control device 100 according to the present modification performs the exposure correction of the imaging unit 11 based on the image information of the image region IR set according to the moving direction of the moving body 10, as in step S17 described above.
  • In this modification as well, by setting the image region IR that serves as the reference for the exposure correction according to the moving direction of the moving body 10, it is possible to provide an imaging unit control device 100 that can adjust the exposure of the imaging unit 11 more appropriately than before.
  • The embodiment of the imaging unit control device according to the present disclosure and its modifications have been described above in detail with reference to the drawings.
  • The specific configuration of the imaging unit control device according to the present disclosure is not limited to the above-described embodiment and its modifications; design changes and the like within a range not departing from the gist of the present disclosure are also included in the present disclosure.

Abstract

An imaging unit control device is provided which can more suitably adjust the exposure of an imaging unit. This imaging unit control device 100 controls an imaging unit 11 mounted on a moving body 10. The imaging unit control device 100 is provided with a movement direction calculation unit 110 which calculates the direction of movement of the moving body 10, an image region setting unit 120 which, depending on the direction of movement of the moving body 10, sets an image region, which acts as a reference for exposure correction, in an image captured by the imaging unit 11, and an exposure correction unit 130 which corrects exposure of the imaging unit 11 on the basis of image information about the image region.

Description

Imaging unit control device
The present disclosure relates to an imaging unit control device.
Inventions relating to imaging apparatuses using an image sensor have been known (see Patent Document 1 below). The moving body imaging device described in Patent Document 1 includes a plurality of imaging units, a system control unit, and a display/recording unit, and is mounted on a moving body. Each imaging unit includes at least an image sensor, a signal processing unit that generates a video signal from the output signal of the image sensor, and an imaging control unit that controls the image sensor and the signal processing unit.
The system control unit controls the plurality of imaging units. The display/recording unit displays or records the video signals output from the plurality of imaging units. This conventional moving body imaging device is characterized in that the system control unit controls the plurality of imaging units with a time difference from one another, based on position information of the moving body and map information including the range in which the moving body is moving (see the same document, claim 1, etc.).
According to this conventional invention, each imaging unit can be appropriately controlled in response to the influence that a change in the surroundings, predicted to occur next from the current surroundings of the moving body, exerts on the plurality of imaging units in time series, so the visibility of the video for the operator of the moving body is improved (see the same document, paragraph 0020, etc.).
Patent Document 1: JP 2007-081695 A
In a conventional imaging apparatus of this kind, an extremely dark region or an extremely bright region may exist in part of the image of the area ahead of the moving body captured by the imaging unit, and the exposure of the imaging unit may be corrected based on that region. In this case, if the moving body changes direction and moves in a direction different from that of the image region used for the exposure correction, defects such as whiteout due to overexposure or blackout due to underexposure may occur in the image captured by the imaging unit.
The present disclosure provides an imaging unit control device that can adjust the exposure of an imaging unit more appropriately than before.
One aspect of the present disclosure is an imaging unit control device that controls an imaging unit mounted on a moving body, the device including: a moving direction calculation unit that calculates a moving direction of the moving body; an image region setting unit that sets, according to the moving direction, an image region serving as a reference for exposure correction in an image captured by the imaging unit; and an exposure correction unit that performs exposure correction of the imaging unit based on image information of the image region.
According to the above aspect of the present disclosure, by setting the image region serving as the reference for exposure correction according to the moving direction of the moving body, it is possible to provide an imaging unit control device that can adjust the exposure of the imaging unit more appropriately than before.
FIG. 1 is a block diagram of an imaging unit control device according to an embodiment of the present disclosure.
FIG. 2 is a flowchart showing an example of a method of controlling the imaging unit by the imaging unit control device of FIG. 1.
FIG. 3 is a diagram showing an example of an image region set by the image region setting unit of FIG. 1.
FIG. 4A is a graph illustrating exposure correction by the exposure correction unit of FIG. 1.
FIG. 4B is a graph illustrating exposure correction by the exposure correction unit of FIG. 1.
FIG. 4C is a graph illustrating exposure correction by the exposure correction unit of FIG. 1.
FIG. 5 is a diagram showing a modified example of the image region set by the image region setting unit of FIG. 1.
FIG. 6 is a flowchart showing a modified example of the method of controlling the imaging unit by the imaging unit control device of FIG. 1.
FIG. 7 is a diagram showing a modified example of the image region set by the image region setting unit of FIG. 1.
FIG. 8A is a diagram showing an example of an image region serving as a reference for exposure correction in an imaging apparatus of a comparative form.
FIG. 8B is a diagram showing an example of whiteout that occurred in the imaging apparatus of the comparative form.
Hereinafter, an embodiment of the imaging unit control device according to the present disclosure will be described with reference to the drawings.
FIG. 1 is a block diagram of the imaging unit control device 100 according to the embodiment of the present disclosure. Although details will be described later, the imaging unit control device 100 of the present embodiment is mainly characterized by the following configuration.
The imaging unit control device 100 is a device that controls the imaging unit 11 mounted on the moving body 10. The imaging unit control device 100 includes the moving direction calculation unit 110 that calculates the moving direction of the moving body 10, the image region setting unit 120 that sets, according to the moving direction of the moving body 10, an image region serving as a reference for exposure correction in an image captured by the imaging unit 11, and the exposure correction unit 130 that performs exposure correction of the imaging unit 11 based on image information of that image region.
Hereinafter, the configuration of each part of the moving body 10 on which the imaging unit control device 100 is mounted, and the configuration of each part of the imaging unit control device 100, will be described in detail.
The moving body 10 is, for example, a vehicle such as an automobile, and includes the imaging unit 11, a yaw rate sensor 12, a steering angle sensor 13, a wheel speed sensor 14, and a direction indicator 15. Although not shown, the moving body 10 also includes, for example, a power generation device, a transmission, an electronic control unit (ECU), a traveling device, instruments, headlights, and a horn. The power generation device includes, for example, at least one of an engine and an electric motor. The traveling device includes, for example, a frame, a suspension, a steering system, a brake system, wheels, and tires.
The imaging unit 11 is, for example, a stereo camera device including a right camera and a left camera, or a monocular camera device including a monocular camera. The imaging unit 11 includes, for example, a lens barrel, a lens, an iris, a shutter, an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) image sensor, and an image processing device that processes image data. When the imaging unit 11 is a right camera and a left camera or a monocular camera, the imaging unit control device 100 may be a stereo camera device or a monocular camera device including the imaging unit 11.
A stereo camera device includes, for example, a right camera and a left camera whose optical axes are parallel and which are separated by a distance (baseline length), and photographs an object with both cameras simultaneously. A difference called parallax arises between the coordinates of the object in the image captured by the right camera and the coordinates of the same object in the image captured by the left camera. The stereo camera device can obtain the parallax by applying template matching to the images of the right and left cameras, and can obtain the position of the object in three-dimensional space from the obtained parallax by triangulation. The imaging unit 11 outputs the captured images and the parallax information to the imaging unit control device 100.
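For a rectified pinhole stereo pair, the triangulation step described above reduces to Z = f·B/d. A minimal sketch (the parameter values in the usage note below are made up for illustration):

```python
def stereo_distance(disparity_px, baseline_m, focal_px):
    """Distance to an object from its stereo disparity.

    For rectified cameras with focal length focal_px (in pixels) and
    baseline baseline_m (in meters): Z = focal_px * baseline_m / disparity_px.
    """
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: invalid match or point at infinity")
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed 0.35 m baseline and 1000 px focal length, a 10 px disparity corresponds to 35 m; the smaller the disparity, the farther the object.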
The yaw rate sensor 12 is, for example, an automotive gyro sensor, and measures the yaw rate of the moving body 10 and outputs it to the imaging unit control device 100. The steering angle sensor 13 is attached to, for example, the steering shaft of the moving body 10, which is an automobile, and measures the steering angle of the front wheels of the moving body 10 and outputs it to the imaging unit control device 100.
The wheel speed sensor 14 is attached to, for example, a hub or a knuckle of the moving body 10, and outputs a signal corresponding to the rotation speed of each wheel of the moving body 10 to the imaging unit control device 100. The direction indicator 15 is operated by, for example, an occupant driving the moving body 10 according to the planned moving direction of the moving body 10, and outputs the right or left planned moving direction to the imaging unit control device 100.
Each part of the imaging unit control device 100 is configured by, for example, a microcomputer including a central processing unit (CPU), an integrated circuit (LSI), a storage device, and programs. As described above, when the imaging unit 11 is a stereo camera device or a monocular camera device, the imaging unit control device 100 may be part of an ECU mounted on the moving body 10.
The moving direction calculation unit 110 calculates the moving direction of the moving body 10. Here, the moving direction is, for example, one of three directions: the straight traveling direction, the right direction, and the left direction. The moving direction calculation unit 110 is configured, for example, to perform a calculation that predicts the moving direction of the moving body 10 based on the position information of the moving body 10 input from the position information acquisition unit 140 and the moving route of the moving body 10 input from the moving route calculation unit 150. The moving direction calculation unit 110 is also configured to calculate the actual moving direction of the moving body 10 based on the behavior information of the moving body 10 acquired by the behavior information acquisition unit 160.
The image region setting unit 120 sets the image region IR serving as the reference for exposure correction in part of the image IM captured by the imaging unit 11, according to the moving direction of the moving body 10 calculated by the moving direction calculation unit 110 (see FIG. 3). More specifically, the image region setting unit 120 sets the image region IR in one of a central region CR including the vanishing point of the image IM, a right region RR to the right of the central region CR, and a left region LR to the left of the central region CR, according to whether the moving direction is the straight traveling direction S, the right direction R, or the left direction L. As described later, the image region setting unit 120 may also set the image region IR based on the outside world information recognized by the outside world information recognition unit 170.
The exposure correction unit 130 performs the exposure correction of the imaging unit 11 based on the image information of the image region IR set by the image region setting unit 120. More specifically, the exposure correction unit 130 obtains, as the image information of the image region IR, for example, the red, green, and blue (RGB) color information and the luminance of each pixel. The exposure correction unit 130 performs the exposure correction of the imaging unit 11 by, for example, controlling the iris, gain, and shutter speed of the imaging unit 11. The exposure correction unit 130 may also perform the exposure correction of the imaging unit 11 by, for example, electronically correcting the image captured by the imaging unit 11.
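As a toy illustration of feedback from the region's luminance (the target level, the step size, and the luma weights are assumptions; the actual device adjusts iris, gain, and shutter speed rather than a single scalar factor):

```python
def exposure_adjustment(region_pixels, target_luma=118.0, step=0.1):
    """Return a multiplicative exposure factor nudging the mean luma of the
    image region IR toward a target level.

    region_pixels: iterable of (r, g, b) tuples sampled from the region IR.
    """
    # Rec. 601 luma weights (an assumption; any luminance estimate works).
    lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in region_pixels]
    mean_luma = sum(lumas) / len(lumas)
    if mean_luma < target_luma:
        return 1.0 + step   # region too dark: raise exposure
    if mean_luma > target_luma:
        return 1.0 - step   # region too bright: lower exposure
    return 1.0
```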
The position information acquisition unit 140 acquires the position information of the moving body 10. More specifically, the position information acquisition unit 140 acquires the position information of the moving body 10 using, for example, the Global Positioning System (GPS) or a Global Navigation Satellite System (GNSS). The position information acquisition unit 140 may also acquire the position information of the moving body 10 by calculating the moving direction and the amount of movement of the moving body 10 based on, for example, the yaw rate, the steering angle, and the rotation speed of each wheel of the moving body 10 acquired by the behavior information acquisition unit 160.
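Dead reckoning from the behavior signals, as mentioned above, might be sketched as follows; the planar single-track model, the signal units, and the simple Euler integration are assumptions for illustration:

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance a planar pose (x, y, heading) by one time step.

    speed: forward speed in m/s (e.g. derived from the wheel speeds);
    yaw_rate: turn rate in rad/s from the yaw rate sensor; dt: seconds.
    """
    heading = heading + yaw_rate * dt
    x = x + speed * dt * math.cos(heading)
    y = y + speed * dt * math.sin(heading)
    return x, y, heading
```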
The moving route calculation unit 150 calculates the moving route of the moving body 10 from the position information of the moving body 10 acquired by the position information acquisition unit 140. More specifically, the moving route calculation unit 150 is provided, for example, with access to road map information, and calculates the moving route from the current position of the moving body 10 to a destination based on the road map information, the position information of the moving body 10, and the position information of the destination input by an occupant of the moving body 10.
The behavior information acquisition unit 160 acquires, for example, the yaw rate of the moving body 10 as behavior information of the moving body 10 from the yaw rate sensor 12 mounted on the moving body 10, and outputs the acquired yaw rate to the moving direction calculation unit 110. The behavior information acquisition unit 160 also acquires, for example, the steering angle of the moving body 10 as behavior information from the steering angle sensor 13, and outputs the acquired steering angle to the moving direction calculation unit 110.
Further, the behavior information acquisition unit 160 acquires, for example, the rotation speed of each wheel as behavior information of the moving body 10 from the wheel speed sensor 14, and outputs the acquired rotation speeds to the moving direction calculation unit 110. The behavior information acquisition unit 160 also acquires, for example, the planned moving direction of the moving body 10 as behavior information from the direction indicator 15, and outputs the acquired planned moving direction to the moving direction calculation unit 110.
The outside world information recognition unit 170 recognizes outside world information, including obstacle information around the moving body 10 and road information, from the image captured by the imaging unit 11. More specifically, the outside world information recognition unit 170 performs object detection processing on the image data captured by the imaging unit 11 to recognize the outside world information including the obstacle information and the road information. The obstacle information includes, for example, the positions, sizes, and shapes of vehicles, pedestrians, buildings, guardrails, median strips, curbs, and utility poles. The road information includes, for example, road shapes, white lines, road markings, and road signs.
Hereinafter, problems of an imaging apparatus of a comparative form that differs from the imaging unit control device according to the present disclosure will be described, and the operation of the imaging unit control device 100 of the present embodiment will be explained in contrast with that comparative imaging apparatus. The imaging apparatus of the comparative form differs from the imaging unit control device 100 of the present embodiment in that, for example, it does not include the image region setting unit 120 that sets the image region IR according to the moving direction of the moving body 10. In the following, configurations common to the comparative imaging apparatus and the imaging unit control device 100 of the present embodiment are given the same reference numerals, and their description is omitted.
FIG. 8A is a diagram showing an example of the image region IR serving as the reference for the exposure correction of the imaging unit 11 in the comparative imaging apparatus. FIG. 8B is a diagram showing an example of whiteout WO that occurred in the image IM of the imaging unit 11 in the comparative imaging apparatus. The comparative imaging apparatus does not include the image region setting unit 120 that sets the image region IR according to the moving direction of the moving body 10. Therefore, in the comparative imaging apparatus, the image region IR serving as the reference for the exposure correction of the imaging unit 11 is fixed, for example, at the central portion of the image IM including its vanishing point, regardless of whether the moving body 10 moves in the straight traveling direction S, the right direction R, or the left direction L.
Here, as shown in FIG. 8A, while the moving body 10 travels in the straight traveling direction S, an extremely dark region DR, such as a recessed portion of a building B that sunlight hardly reaches, may exist in the image IM of the imaging unit 11. That extremely dark region DR may also be included in the image region IR serving as the reference for the exposure correction of the imaging unit 11. In that case, in the comparative imaging apparatus, the exposure correction of the imaging unit 11 is performed with reference to the image region IR including the extremely dark region DR, so the exposure of the imaging unit 11 increases.
This makes it easier to recognize the recessed portion of the building B, as shown in FIG. 8B. However, in the image IM of the comparative imaging apparatus, defects such as whiteout WO due to overexposure may occur, as shown in FIG. 8B, in the regions to the left and right of the image region IR of FIG. 8A that serves as the reference for the exposure correction of the imaging unit 11.
Contrary to the example shown in FIG. 8A, while the moving body 10 travels in the straight traveling direction S, the image region IR at the center of the image IM may include an extremely bright region. In that case, in the comparative imaging apparatus, the exposure correction of the imaging unit 11 is performed with reference to the image region IR including the extremely bright region, so the exposure of the imaging unit 11 decreases.
As a result, similarly to the example shown in FIG. 8B, the extremely bright region at the center of the image IM becomes easier to recognize. However, in the image IM of the comparative imaging apparatus, contrary to the example shown in FIG. 8B, defects such as blackout due to underexposure may occur in the regions to the left and right of the image region IR of FIG. 8A that serves as the reference for the exposure correction of the imaging unit 11.
For example, advanced driver assistance systems (ADAS) and automated driving use techniques for recognizing obstacle and road information from the image IM of the imaging unit 11. However, if defects such as whiteout or blackout occur in the image IM of the imaging unit 11, they may hinder the recognition of obstacles such as vehicles and pedestrians in an intersection ahead of the moving body 10 when the moving body 10 turns right or left, that is, moves in the right direction R or the left direction L. It is therefore required to suppress such defects occurring in the image IM of the imaging unit 11.
FIG. 2 is a flowchart showing an example of the method by which the imaging unit control device 100 of the present embodiment controls the imaging unit 11. FIG. 3 shows an example of the image region IR set in the image IM of the imaging unit 11 by the image region setting unit 120 of the imaging unit control device 100.
First, in step S11, the imaging unit control device 100 calculates the movement route of the moving body 10. More specifically, the position information acquisition unit 140 acquires the position information of the moving body 10, and the movement route calculation unit 150 calculates the movement route of the moving body 10 from that position information.
Next, in step S12, the imaging unit control device 100 calculates the moving direction of the moving body 10. More specifically, it calculates the moving direction on the basis of the position information and movement route obtained in step S11. As shown in FIG. 3, for example, the imaging unit control device 100 thereby predicts that the moving direction of the moving body 10 at the intersection ahead of it will be the left direction L, out of the straight direction S, the right direction R, and the left direction L.
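The route-based prediction of step S12 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the helper name `predict_turn_direction`, the flat 2D coordinates, and the 15-degree straight-ahead threshold are all assumptions; a real system would work on map-matched route geometry.

```python
# Hypothetical sketch: predict the turn direction at the next route node by
# comparing the bearings of the incoming and outgoing route segments.
import math

def predict_turn_direction(prev_pt, node_pt, next_pt, threshold_deg=15.0):
    """Return 'S' (straight), 'R' (right turn), or 'L' (left turn).

    Points are (x, y) pairs with x pointing east and y pointing north,
    so a positive heading change means a left turn.
    """
    bearing_in = math.atan2(node_pt[1] - prev_pt[1], node_pt[0] - prev_pt[0])
    bearing_out = math.atan2(next_pt[1] - node_pt[1], next_pt[0] - node_pt[0])
    # Signed heading change, normalized into (-180, 180] degrees.
    delta = math.degrees(bearing_out - bearing_in)
    delta = (delta + 180.0) % 360.0 - 180.0
    if abs(delta) < threshold_deg:
        return "S"
    return "L" if delta > 0 else "R"
```

For the left-turn scenario of FIG. 3, the outgoing segment would bend left relative to the incoming one, and the function would return "L".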
Next, in step S13, the imaging unit control device 100 acquires the behavior information of the moving body 10. More specifically, the behavior information acquisition unit 160 acquires, as behavior information: the yaw rate of the moving body 10 from the yaw rate sensor 12 mounted on the moving body 10; the steering angle of the moving body 10 from the steering angle sensor 13; the rotation speeds of the plurality of wheels of the moving body 10 from the wheel speed sensor 14; and the planned moving direction of the moving body 10 from the direction indicator 15.
Next, in step S14, the imaging unit control device 100 determines whether the left direction L calculated as the moving direction of the moving body 10 in step S12 agrees with the behavior information acquired in step S13. More specifically, the moving direction calculation unit 110 calculates the actual moving direction of the moving body 10 at the intersection shown in the image IM of the imaging unit 11 from at least one piece of behavior information among the yaw rate, the steering angle, the difference between the rotation speeds of the left and right wheels, and the planned moving direction. The moving direction calculation unit 110 then determines whether the left direction L predicted in step S12 matches this actual moving direction.
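The consistency check of step S14 can be sketched as below. All thresholds, sign conventions (positive values meaning a left turn), and function names here are illustrative assumptions; the patent only states that at least one behavior signal is used.

```python
# Hypothetical sketch of the step-S14 check: derive the actual direction from
# behavior signals, then compare it with the route-based prediction.
def direction_from_behavior(yaw_rate, steering_angle, wheel_speed_diff,
                            turn_signal=None,
                            yaw_th=0.05, steer_th=0.1, wheel_th=0.2):
    """Return 'S', 'R', or 'L'. Positive signal values are taken as leftward."""
    if turn_signal in ("L", "R"):
        return turn_signal  # an active direction indicator states intent directly
    if yaw_rate > yaw_th or steering_angle > steer_th or wheel_speed_diff > wheel_th:
        return "L"
    if yaw_rate < -yaw_th or steering_angle < -steer_th or wheel_speed_diff < -wheel_th:
        return "R"
    return "S"

def directions_match(predicted, yaw_rate, steering_angle,
                     wheel_speed_diff, turn_signal=None):
    """True when the predicted direction agrees with the observed behavior."""
    return predicted == direction_from_behavior(
        yaw_rate, steering_angle, wheel_speed_diff, turn_signal)
```

When `directions_match` is false, the behavior-derived direction is used instead of the prediction, mirroring the NO branch into step S18.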
If the moving direction of the moving body 10 and the behavior information match in step S14 (YES), the process proceeds to step S15, and the moving direction calculation unit 110 outputs the left direction L calculated as the moving direction in step S12 to the image region setting unit 120.
Next, in step S16, the imaging unit control device 100 sets, in the image IM captured by the imaging unit 11, the image region IR used as the exposure-correction reference according to the moving direction of the moving body 10. More specifically, out of the central region CR, the right region RR, and the left region LR of the image IM, the image region setting unit 120 sets the image region IR in the left region LR corresponding to the left direction L in which the moving body 10 is moving.
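The direction-to-region mapping of step S16 can be expressed as a small lookup. The equal one-third split of the image width is an assumption for illustration only; the patent does not fix the geometry of the regions CR, RR, and LR.

```python
# Minimal sketch of step S16: choose the left, central, or right third of the
# image as the exposure-reference region IR from the moving direction.
def select_exposure_region(direction, image_width, image_height):
    """Return (x0, y0, x1, y1) of region IR for direction 'L', 'S', or 'R'."""
    third = image_width // 3
    x0 = {"L": 0, "S": third, "R": 2 * third}[direction]
    return (x0, 0, x0 + third, image_height)
```

For the left-turn case of FIG. 3, `select_exposure_region("L", ...)` places IR in the left region LR of the image.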
Next, in step S17, the imaging unit control device 100 performs exposure correction of the imaging unit 11 on the basis of the image information of the image region IR set according to the moving direction of the moving body 10. More specifically, the exposure correction unit 130 corrects the exposure of the imaging unit 11 using the image information of the image region IR set in the left region LR of the image IM according to the left direction L. When the imaging unit 11 includes two cameras, the exposure correction is applied to both cameras simultaneously.
FIGS. 4A to 4C are graphs illustrating the exposure correction performed by the exposure correction unit 130 of the imaging unit control device 100. The exposure correction unit 130 corrects the exposure of the imaging unit 11 on the basis of the image information of the image region IR. More specifically, it uses luminance information as the image information of the image region IR, for example, and determines the exposure correction value of the imaging unit 11 from a histogram built from that luminance information.
For example, when the number of high-luminance pixels in the image region IR is extremely large, as shown in FIG. 4A, overexposure is expected, so the exposure correction unit 130 determines a negative exposure correction value to reduce the exposure. When the pixel counts are distributed relatively evenly from low to high luminance, as shown in FIG. 4B, the contrast is adequate and the exposure is expected to be appropriate, so the exposure correction unit 130 sets the exposure correction value to zero to maintain the exposure. When the number of low-luminance pixels is extremely large, as shown in FIG. 4C, underexposure is expected, so the exposure correction unit 130 determines a positive exposure correction value to increase the exposure.
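The three-way decision of FIGS. 4A to 4C can be sketched directly from the histogram description above. The sign of the returned value follows the patent (negative for an overexposed FIG. 4A histogram, positive for an underexposed FIG. 4C one, zero for FIG. 4B); the 8-bit bin edges, the 0.5 dominance ratio, and the unit correction step are assumptions of this sketch.

```python
# Illustrative sketch of the exposure-correction rule of Figs. 4A-4C: classify
# the luminance histogram of region IR and return a signed correction value.
def exposure_correction_value(luminances, dark_edge=64, bright_edge=192,
                              dominance=0.5, step=1.0):
    """luminances: iterable of 8-bit pixel values sampled from the region IR."""
    pixels = list(luminances)
    n = len(pixels)
    dark = sum(1 for v in pixels if v < dark_edge)
    bright = sum(1 for v in pixels if v >= bright_edge)
    if bright > dominance * n:   # Fig. 4A: bright pixels dominate, overexposure
        return -step
    if dark > dominance * n:     # Fig. 4C: dark pixels dominate, underexposure
        return +step
    return 0.0                   # Fig. 4B: balanced histogram, keep exposure
```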
On the other hand, if the moving direction of the moving body 10 and the behavior information do not match in step S14 (NO), the process proceeds to step S18, and the moving direction calculation unit 110 outputs to the image region setting unit 120 the traveling direction derived from the behavior information acquired in step S13, namely the straight direction S or the right direction R. This happens, for example, when the occupant driving the moving body 10 selects a route different from the movement route calculated by the movement route calculation unit 150.
In this case, in step S16, the image region setting unit 120 sets the image region IR in the central region CR or the right region RR of the image IM, corresponding to the straight direction S or the right direction R in which the moving body 10 is actually moving. Comparing the moving direction and the behavior information in step S14 thus allows the device to handle cases such as the occupant changing the movement route.
Next, in step S17, the imaging unit control device 100 causes the exposure correction unit 130 to correct the exposure of the imaging unit 11, as described above, on the basis of the image information of the image region IR set in the central region CR or the right region RR of the image IM according to the moving direction of the moving body 10.
As described above, the imaging unit control device 100 of the present embodiment is a device that controls the imaging unit 11 mounted on the moving body 10. It comprises the moving direction calculation unit 110 that calculates the moving direction of the moving body 10, the image region setting unit 120 that sets, according to that moving direction, the image region IR used as the exposure-correction reference in the image captured by the imaging unit 11, and the exposure correction unit 130 that corrects the exposure of the imaging unit 11 on the basis of the image information of the image region IR.
With this configuration, as shown in FIG. 3, for example, exposure correction can be performed on the basis of the image region IR set in the left region LR of the image IM according to the left direction L in which the moving body 10 is traveling. Even if, as in FIG. 8A, an extremely dark region DR or an extremely bright region exists at the center of the image IM in the straight direction S, which differs from the left direction L in which the moving body 10 is moving, the exposure can be corrected on the basis of an image region IR set in the left region LR that excludes that region.
The exposure of the imaging unit 11 can thereby be corrected more appropriately than in the comparative imaging device described above, suppressing defects such as whiteout and crushed blacks in the image IM. According to the present embodiment, therefore, setting the exposure-correction reference image region IR according to the moving direction of the moving body 10 provides an imaging unit control device 100 that can adjust the exposure of the imaging unit 11 more appropriately than before.
The imaging unit control device 100 of the present embodiment further comprises the position information acquisition unit 140 that acquires the position information of the moving body 10, the movement route calculation unit 150 that calculates the movement route of the moving body 10 from that position information, and the behavior information acquisition unit 160 that acquires the behavior information of the moving body 10. This configuration allows the imaging unit control device 100 to control the imaging unit 11 using the position information, movement route, and behavior information of the moving body 10.
In the imaging unit control device 100 of the present embodiment, the moving direction calculation unit 110 is configured to calculate the moving direction on the basis of the position information and the movement route of the moving body 10. This allows the imaging unit control device 100 to predict the moving direction of the moving body 10 at an intersection ahead of it, for example.
In the imaging unit control device 100 of the present embodiment, the behavior information acquisition unit 160 is configured to acquire, as behavior information, the yaw rate of the moving body 10 from the yaw rate sensor 12 mounted on the moving body 10, and the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 on the basis of that yaw rate. This allows the imaging unit control device 100 to set the exposure-correction reference image region IR according to the actual moving direction of the moving body 10 based on its yaw rate.
In the imaging unit control device 100 of the present embodiment, the behavior information acquisition unit 160 is configured to acquire, as behavior information, the steering angle of the moving body 10 from the steering angle sensor 13 mounted on the moving body 10, and the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 on the basis of that steering angle. This allows the imaging unit control device 100 to set the exposure-correction reference image region IR according to the actual moving direction of the moving body 10 based on its steering angle.
In the imaging unit control device 100 of the present embodiment, the behavior information acquisition unit 160 is configured to acquire, as behavior information, the rotation speeds of the plurality of wheels of the moving body 10 from the wheel speed sensor 14 mounted on the moving body 10, and the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 on the basis of the difference between those rotation speeds. This allows the imaging unit control device 100 to set the exposure-correction reference image region IR according to the actual moving direction of the moving body 10 based on the difference in its wheel rotation speeds.
In the imaging unit control device 100 of the present embodiment, the behavior information acquisition unit 160 is configured to acquire, as behavior information, the planned moving direction of the moving body 10 from the direction indicator 15 mounted on the moving body 10, and the moving direction calculation unit 110 is configured to calculate the moving direction of the moving body 10 on the basis of that planned moving direction. This allows the imaging unit control device 100 to set the exposure-correction reference image region IR according to the moving direction of the moving body 10 predicted from the operation of its direction indicator 15.
As described above, according to the present embodiment, setting the exposure-correction reference image region IR according to the moving direction of the moving body 10 provides an imaging unit control device 100 that can adjust the exposure of the imaging unit 11 more appropriately than before.
Note that the imaging unit control device according to the present disclosure is not limited to the configuration of the imaging unit control device 100 of the embodiment described above. Some modifications of that embodiment are described below with reference to FIGS. 5 to 7.
FIG. 5 shows a modification of the image region IR set by the image region setting unit 120 of the imaging unit control device 100 of the embodiment described above.
As described above, the imaging unit control device 100 comprises the external information recognition unit 170, which recognizes external information, including information on obstacles around the moving body 10 and road information, from the image captured by the imaging unit 11. In this modification, the image region setting unit 120 is configured to set the image region IR on the basis of that external information. With this configuration, an appropriate image region IR can be set in the image IM even if the mounting position of a camera of the imaging unit 11 shifts or the wheel diameter of the moving body 10 changes.
More specifically, the external information recognition unit 170 determines, from the coordinates of the image IM of the imaging unit 11, the position of the center of gravity G of each moving object recognized in the image IM, such as a vehicle V or a pedestrian, and records the trajectory of that center of gravity G. After recording a predetermined number of trajectories or more, the external information recognition unit 170 plots on the image IM the centers of gravity G that move laterally across it. The external information recognition unit 170 can thereby identify the region of the image IM through which moving objects such as vehicles and pedestrians pass.
The image region setting unit 120 then sets the image region IR within the region through which moving objects such as vehicles and pedestrians pass, as recognized by the external information recognition unit 170, again according to the moving direction of the moving body 10. More specifically, it sets the image region IR in the central region CR of the image IM when the moving body 10 goes straight, in the right region RR when the moving body 10 turns right, and in the left region LR when the moving body 10 turns left.
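The trajectory-recording step above can be sketched as follows. This is a speculative illustration of the idea, not the patent's method: the function name, the minimum sample count, and the 10% outlier trimming are all assumptions introduced here.

```python
# Hypothetical sketch of the Fig. 5 modification: collect the image-plane
# centroids G of recognized moving objects and, once enough samples exist,
# estimate the vertical band of rows they pass through. The region IR can
# then be placed inside that band.
def crossing_band(centroids, min_samples=50):
    """centroids: list of (x, y) pixel positions. Return (y_min, y_max) or None."""
    if len(centroids) < min_samples:
        return None                      # not enough trajectories recorded yet
    ys = sorted(y for _, y in centroids)
    # Trim 10% of samples at each end so a stray detection does not widen the band.
    k = len(ys) // 10
    trimmed = ys[k:len(ys) - k] if k else ys
    return (trimmed[0], trimmed[-1])
```

Because the band is estimated from observed traffic rather than fixed coordinates, it adapts if the camera mounting position shifts or the wheel diameter changes, matching the motivation stated above.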
Alternatively, when it is determined that the moving body 10 is traveling straight on a straight road and the curvature of the white lines marked on the road is at or below a threshold, the image region setting unit 120 may set the image region IR in an arbitrary central region CR centered on the vanishing point of the white lines on both sides of the road. The vanishing point of the white lines can be recognized by the external information recognition unit 170, for example. Whether the moving body 10 is traveling straight on a straight road can be determined using the behavior information acquired by the behavior information acquisition unit 160: for example, if both the steering angle and the yaw rate of the moving body 10 are at or below their thresholds, the moving body 10 can be judged to be traveling straight.
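The combined condition above reduces to a simple predicate. The threshold values below are illustrative assumptions; only the structure of the test (straight driving and low white-line curvature) comes from the description.

```python
# Minimal sketch of the condition for using a vanishing-point-centered region:
# the vehicle must be judged to be going straight (|steering angle| and
# |yaw rate| at or below thresholds) AND the lane curvature must be small.
def use_vanishing_point_region(steering_angle, yaw_rate, lane_curvature,
                               steer_th=0.05, yaw_th=0.02, curv_th=0.001):
    going_straight = abs(steering_angle) <= steer_th and abs(yaw_rate) <= yaw_th
    return going_straight and abs(lane_curvature) <= curv_th
```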
FIG. 6 is a flowchart showing a modification of the method by which the imaging unit control device 100 of the embodiment described above controls the imaging unit 11. FIG. 7 shows the image region IR set by the image region setting unit 120 of the imaging unit control device 100 in this modification.
In this modification, the imaging unit control device 100 need not include the position information acquisition unit 140, the movement route calculation unit 150, or the external information recognition unit 170.
First, in step S21, the imaging unit control device 100 of this modification acquires the behavior information of the moving body 10 with the behavior information acquisition unit 160, as in step S13 of FIG. 2. Next, in step S22, the moving direction calculation unit 110 calculates the moving direction of the moving body 10 on the basis of that behavior information.
For example, when the moving body 10 is traveling straight on a straight road, the steering angle and yaw rate acquired by the behavior information acquisition unit 160 are at or below their thresholds, and the moving direction calculation unit 110 calculates the moving direction of the moving body 10 as the straight direction S. When the moving body 10 is driving through a left curve or turning left at an intersection, the steering angle and yaw rate are at or above their thresholds, and the moving direction calculation unit 110 calculates the moving direction of the moving body 10 as the left direction L shown in FIG. 7.
Next, in step S23, the image region setting unit 120 of this modification sets the image region IR in the image IM according to the moving direction of the moving body 10. For example, when the moving direction of the moving body 10 is the straight direction S, the image region setting unit 120 sets the image region IR in the central region CR shown in FIG. 7; when it is the left direction L, the image region setting unit 120 sets the image region IR in the left region LR shown in FIG. 7. The image region setting unit 120 may also set the image region IR dynamically, for example, following temporal changes in the behavior information of the moving body 10.
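One way the dynamic setting just mentioned could be realized is with a debounce on the direction decision, so that the region IR does not flicker on noisy behavior signals. This is purely an illustrative sketch; the hold-frame mechanism and its 5-frame default are assumptions, as the patent only says the region may follow temporal changes in the behavior information.

```python
# Hypothetical sketch: re-evaluate the direction every frame and move the
# region IR only after the same new direction has been observed for several
# consecutive frames.
class RegionUpdater:
    def __init__(self, hold_frames=5):
        self.hold_frames = hold_frames
        self.current = "S"       # direction currently used for region IR
        self._candidate = "S"
        self._count = 0

    def update(self, direction):
        """Feed this frame's direction estimate; return the debounced one."""
        if direction == self.current:
            self._candidate, self._count = direction, 0
        elif direction == self._candidate:
            self._count += 1
            if self._count >= self.hold_frames:
                self.current = direction
                self._count = 0
        else:
            self._candidate, self._count = direction, 1
        return self.current
```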
Next, in step S24, the imaging unit control device 100 of this modification performs exposure correction of the imaging unit 11 on the basis of the image information of the image region IR set according to the moving direction of the moving body 10, as in step S17 of FIG. 2. Like the embodiment described above, this modification also provides an imaging unit control device 100 that can adjust the exposure of the imaging unit 11 more appropriately than before by setting the exposure-correction reference image region IR according to the moving direction of the moving body 10.
The embodiment of the imaging unit control device according to the present disclosure and its modifications have been described in detail above with reference to the drawings. However, the specific configuration of the imaging unit control device according to the present disclosure is not limited to that embodiment and those modifications; design changes and the like that do not depart from the gist of the present disclosure are also included in the present disclosure.
10  Moving body
11  Imaging unit
12  Yaw rate sensor
13  Steering angle sensor
14  Wheel speed sensor
15  Direction indicator
100 Imaging unit control device
110 Moving direction calculation unit
120 Image region setting unit
130 Exposure correction unit
140 Position information acquisition unit
150 Movement route calculation unit
160 Behavior information acquisition unit
170 External information recognition unit
IM  Image
IR  Image region

Claims (8)

  1.  An imaging unit control device that controls an imaging unit mounted on a moving body, the device comprising:
     a moving direction calculation unit that calculates a moving direction of the moving body;
     an image region setting unit that sets, according to the moving direction, an image region serving as a reference for exposure correction in an image captured by the imaging unit; and
     an exposure correction unit that performs exposure correction of the imaging unit based on image information of the image region.
  2.  The imaging unit control device according to claim 1, further comprising:
     a position information acquisition unit that acquires position information of the moving body;
     a movement route calculation unit that calculates a movement route of the moving body from the position information; and
     a behavior information acquisition unit that acquires behavior information of the moving body.
  3.  The imaging unit control device according to claim 2, wherein the moving direction calculation unit calculates the moving direction based on the position information and the movement route.
  4.  The imaging unit control device according to claim 2, wherein
     the behavior information acquisition unit acquires, as the behavior information, a yaw rate of the moving body from a yaw rate sensor mounted on the moving body, and
     the moving direction calculation unit calculates the moving direction of the moving body based on the yaw rate.
  5.  The imaging unit control device according to claim 2, wherein
     the behavior information acquisition unit acquires, as the behavior information, a steering angle of the moving body from a steering angle sensor mounted on the moving body, and
     the moving direction calculation unit calculates the moving direction of the moving body based on the steering angle.
  6.  The imaging unit control device according to claim 2, wherein
     the behavior information acquisition unit acquires, as the behavior information, rotation speeds of a plurality of wheels of the moving body from a wheel speed sensor mounted on the moving body, and
     the moving direction calculation unit calculates the moving direction of the moving body based on a difference in the rotation speeds of the plurality of wheels.
  7.  The imaging unit control device according to claim 2, wherein
     the behavior information acquisition unit acquires, as the behavior information, a planned moving direction of the moving body from a direction indicator mounted on the moving body, and
     the moving direction calculation unit calculates the moving direction of the moving body based on the planned moving direction.
  8.  The imaging unit control device according to claim 1, further comprising an external world information recognition unit that recognizes, from an image captured by the imaging unit, external world information including obstacle information and road information around the moving body,
     wherein the image region setting unit sets the image region based on the external world information.
PCT/JP2019/045356 2018-12-12 2019-11-20 Imaging unit control device WO2020121758A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020559891A JP7122394B2 (en) 2018-12-12 2019-11-20 Imaging unit controller
CN201980079112.1A CN113170057B (en) 2018-12-12 2019-11-20 Imaging unit control device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-232134 2018-12-12
JP2018232134 2018-12-12

Publications (1)

Publication Number Publication Date
WO2020121758A1

Family

ID=71075625

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/045356 WO2020121758A1 (en) 2018-12-12 2019-11-20 Imaging unit control device

Country Status (3)

Country Link
JP (1) JP7122394B2 (en)
CN (1) CN113170057B (en)
WO (1) WO2020121758A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022064970A1 (en) * 2020-09-25 2022-03-31 ソニーセミコンダクタソリューションズ株式会社 Image capture control device, image capture control method, and mobile object

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07302325A (en) * 1994-04-30 1995-11-14 Suzuki Motor Corp On-vehicle image recognizing device
WO2006118076A1 (en) * 2005-04-28 2006-11-09 Aisin Seiki Kabushiki Kaisha System for monitoring periphery of vehicle
JP2011065442A (en) * 2009-09-17 2011-03-31 Hitachi Automotive Systems Ltd Vehicle shadow recognition device
JP2014143547A (en) * 2013-01-23 2014-08-07 Denso Corp Exposure controller


Also Published As

Publication number Publication date
CN113170057A (en) 2021-07-23
JPWO2020121758A1 (en) 2021-10-14
JP7122394B2 (en) 2022-08-19
CN113170057B (en) 2023-06-13

Similar Documents

Publication Publication Date Title
EP2950521B1 (en) Camera capable of reducing motion blur in a low luminance environment and vehicle including the same
EP2955915B1 (en) Around view provision apparatus and vehicle including the same
CN106981082B (en) Vehicle-mounted camera calibration method and device and vehicle-mounted equipment
US9308917B2 (en) Driver assistance apparatus capable of performing distance detection and vehicle including the same
US20150341620A1 (en) Stereo camera and driver assistance apparatus and vehicle including the same
JP5880703B2 (en) Lane marking indicator, driving support system
US10704957B2 (en) Imaging device and imaging method
JP2020501423A (en) Camera means and method for performing context-dependent acquisition of a surrounding area of a vehicle
JP2005332107A (en) Lane marking recognizing device for vehicle
JP6471522B2 (en) Camera parameter adjustment device
JP6209825B2 (en) Parallax detection device and parallax detection method
JP2005332106A (en) Lane marking recognizing device for vehicle
JP2012073927A (en) Driving support apparatus
US10455159B2 (en) Imaging setting changing apparatus, imaging system, and imaging setting changing method
WO2020121758A1 (en) Imaging unit control device
JP5716944B2 (en) In-vehicle camera device
JP3722485B1 (en) Vehicle lane marking recognition device
JP4539400B2 (en) Stereo camera correction method and stereo camera correction device
US11410288B2 (en) Image processing apparatus
JP2023115753A (en) Remote operation system, remote operation control method, and remote operator terminal
JP2022157399A (en) Image processing device and image processing method, vehicle control device, and program
KR20160092403A (en) Driver assistance apparatus and Control Method Thereof
KR20200119790A (en) Recognition device, recognition method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19896594; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020559891; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19896594; Country of ref document: EP; Kind code of ref document: A1)