WO2022190166A1 - Image processing device, image processing method, and image processing program - Google Patents

Image processing device, image processing method, and image processing program

Info

Publication number
WO2022190166A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
road surface
luminance
surface lighting
unit
Prior art date
Application number
PCT/JP2021/008973
Other languages
French (fr)
Japanese (ja)
Inventor
竜也 山田
Original Assignee
Mitsubishi Electric Corporation
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2021557564A (JP7031799B1)
Priority to PCT/JP2021/008973
Publication of WO2022190166A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60Q: ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/26: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present invention relates to an image processing device, an image processing method, and an image processing program.
  • when a road surface image is projected while the camera is capturing, the captured image with the road surface image reflected in it is displayed on the display screen.
  • for example, the road surface image may show a warning,
  • in which case the image showing the warning is displayed on the display screen inside the vehicle.
  • it is difficult for the occupant to tell whether the warning information displayed on the display screen is intended for the occupant or for pedestrians outside the vehicle. Therefore, the occupant may feel annoyed that the captured image with the road surface image reflected in it is displayed on the display screen.
  • the present invention is intended to solve the above problems, and aims to reduce this annoyance even when an image obtained by capturing a road surface image projected onto the road surface around the vehicle is displayed on a display screen.
  • a display processing device according to the present invention includes a projection control unit that projects onto the road surface around a vehicle an image whose luminance changes with time in a predetermined pattern, and an acquisition unit that acquires a captured image of the surroundings of the vehicle.
  • it further includes a storage unit that stores the captured image acquired by the acquisition unit, a luminance adjustment unit that identifies, from the captured image stored in the storage unit, an area in which the luminance changes with time according to the pattern and adjusts the luminance value of the captured image in that area, and a display output unit that outputs the captured image adjusted by the luminance adjustment unit to a display unit.
  • the present invention has the effect of reducing annoyance even when an image obtained by capturing a road surface image projected onto the road surface around the vehicle is displayed on the display screen.
  • FIG. 1 is a diagram showing the configuration of an image processing apparatus 100 according to this embodiment.
  • FIG. 2 is a conceptual diagram showing software operating on the image processing apparatus 100.
  • FIG. 3 is a conceptual diagram of the memory 122 of the camera ECU 120.
  • FIG. 4 is a conceptual diagram showing an aspect of road surface lighting.
  • FIG. 5 is an example of captured data stored in the frame buffer 171 when a conventional road surface lighting image 201 is photographed by the camera 125.
  • FIG. 6 is a conceptual diagram showing an example of a luminance change pattern.
  • FIG. 7 is a flowchart of processing of the image processing apparatus 100.
  • FIG. 8 is a conceptual diagram showing captured images stored in the frame buffer 171.
  • FIG. 9 is a conceptual diagram showing captured images with adjusted luminance values stored in the frame buffer 171.
  • FIG. 10 is a functional block diagram of the image processing apparatus 100.
  • FIG. 11 is a diagram showing luminance change patterns of road surface lighting associated with identification information.
  • FIG. 12 is a flowchart of processing of the image processing apparatus 100.
  • FIG. 13 is a conceptual diagram of the memory 132 of the main SoC 130.
  • FIG. 14 is a flowchart of processing of the image processing apparatus 100.
  • FIG. 15 is a diagram showing a situation in which a plurality of road surface lighting images are captured.
  • FIG. 16 is an example of captured data stored in a frame buffer when a plurality of road surface lighting images are photographed by the camera 125.
  • FIG. 17 is a flowchart of processing of the image processing apparatus 100.
  • FIG. 18 is a conceptual diagram showing captured images with adjusted luminance values stored in the frame buffer 440.
  • FIG. 1 is a diagram showing the configuration of an image processing apparatus 100 according to this embodiment.
  • the image processing apparatus 100 includes a road surface lighting ECU (Electronic Control Unit) 110, a camera ECU 120, a main SoC (System-on-a-Chip) 130, and an in-vehicle communication bus 101.
  • the road surface lighting ECU 110, the camera ECU 120, and the main SoC 130 are each connected to the in-vehicle communication bus 101. The present invention can also be implemented with other connection topologies for the in-vehicle communication bus 101.
  • the road surface lighting ECU 110 includes a CPU (Central Processing Unit) 111 (which may have multiple cores), a memory 112, an interface 113, and a storage 114.
  • the interface 113 is connected to the projection unit 115.
  • the projection unit 115 is, for example, a projector.
  • the projection unit 115 projects onto the road surface around the vehicle an image showing information to be conveyed to people around the vehicle (pedestrians, occupants of other vehicles), such as the traveling direction of the vehicle or a warning.
  • the image projected on the road surface around the vehicle is referred to as a "road surface lighting image", and projecting the road surface lighting image onto the road surface is referred to as "road surface lighting".
  • the road surface lighting image may be a still image and/or a moving image.
  • the CPU 111 is an arithmetic unit that controls the road surface lighting ECU 110.
  • the memory 112 is, for example, a volatile storage device, and stores programs executed by the CPU 111, stacks, variables, and the like.
  • the storage 114 is, for example, a non-volatile storage device such as a flash memory, and stores data such as programs and image data representing road surface lighting images.
  • the image processing apparatus 100 includes the projection unit 115 as part of its configuration, but the projection unit 115 may instead be an external component.
  • the camera ECU 120 includes a CPU 121, a memory 122, an interface 123, and a storage 124.
  • the interface 123 is connected to the camera 125.
  • the camera 125 captures images of the surroundings of the vehicle to generate captured data. Therefore, when a road surface lighting image is projected during capture, the camera 125 generates a captured image with the road surface lighting image reflected in it.
  • the image processing apparatus 100 includes the camera 125 as part of its configuration, but the camera 125 may instead be an external component. The interface 123 may also be connected to multiple cameras 125.
  • the main SoC 130 includes a CPU 131, a memory 132, an interface 133, and a storage 134.
  • the interface 133 is connected to the display unit 135.
  • the display unit 135 is, for example, a liquid crystal screen provided in a car navigation system, a rear-view mirror, or the like.
  • the display unit 135 displays a captured image corresponding to the captured data.
  • the image processing apparatus 100 includes the display unit 135 as part of its configuration, but the display unit 135 may instead be an external component.
  • FIG. 2 is a conceptual diagram showing software that operates on the image processing apparatus 100.
  • the CPU 111 of the road surface lighting ECU 110 starts an operating system 141 and runs, on it, a communication library 142, a road surface light driver 143, and a road surface lighting application 140.
  • the road surface lighting application 140 drives the road surface light driver 143 to perform road surface lighting.
  • the CPU 121 of the camera ECU 120 starts an operating system 151 and runs, on it, a communication library 152, a camera driver 153, and a camera application 150. The camera application 150 drives the camera driver 153 to capture the surroundings of the vehicle, and sequentially transmits the captured data generated by capturing to the main SoC 130.
  • the CPU 131 of the main SoC 130 starts an operating system 161 and runs, on it, a communication library 162, a display driver 163, and a driving support application 160.
  • the driving support application 160 receives the captured data transmitted from the camera ECU 120, and drives the display driver 163 to display the captured image corresponding to the captured data on the display unit 135.
  • the present invention can be implemented regardless of whether the respective operating systems and communication libraries are of the same or different types.
  • FIG. 3 is a conceptual diagram of the memory 122 of the camera ECU 120.
  • the camera driver 153 uses part of the memory 122 as a frame buffer 171.
  • the frame buffer 171 is used for temporary storage until the captured images (still images) captured by the camera 125 are encoded as video (a moving image).
  • the camera application 150 sequentially stores the captured data in the frame buffer 171 one frame at a time. When it has stored captured data up to the end of the frame buffer 171, it returns to the beginning of the frame buffer 171 and resumes storing. The camera application 150 also encodes the captured data stored in the frame buffer 171 as video, one frame or several frames at a time, using common encoding techniques. In the present disclosure, the data before and after encoding are both referred to as "captured data" without distinction.
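  • the ring-buffer behavior described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes single-channel 8-bit frames, and the class and method names are hypothetical.

```python
import numpy as np

class FrameRingBuffer:
    """Minimal ring buffer: frames are stored one at a time and the
    write position wraps to the start once the end is reached."""

    def __init__(self, num_frames, height, width):
        self.frames = np.zeros((num_frames, height, width), dtype=np.uint8)
        self.next_slot = 0   # where the next frame will be written
        self.filled = 0      # how many slots currently hold valid data

    def store(self, frame):
        # Step 306: store one frame, wrapping after the last slot.
        self.frames[self.next_slot] = frame
        self.next_slot = (self.next_slot + 1) % len(self.frames)
        self.filled = min(self.filled + 1, len(self.frames))

    def is_full(self):
        # Step 309: "no free space" corresponds to a full buffer.
        return self.filled == len(self.frames)

    def oldest(self):
        # Once full, next_slot points at the oldest stored frame.
        return self.frames[self.next_slot]
```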
  • FIG. 4 is a conceptual diagram showing the road surface lighting image 201 projected from the projection unit 115 onto the road surface around the own vehicle 200.
  • the lower part of FIG. 4 shows the front of the own vehicle 200, and the upper part shows its rear.
  • the projection unit 115 and the camera 125 are provided at the rear of the own vehicle 200.
  • the imaging range 202 indicates the range captured by the camera 125.
  • the non-illuminated area 203 indicates an area that is included in the imaging range 202 but is not illuminated by the road surface lighting image 201.
  • the luminance value measured in the area illuminated by the road surface lighting image 201 is higher than that measured in the non-illuminated area 203.
  • the occupant may therefore feel annoyed that the captured image with the road surface image reflected in it is displayed on the display screen.
  • the road surface lighting image 201 shows the shape projected when the own vehicle 200 makes a left turn while reversing, but the projected image and its timing are not limited to this.
  • FIG. 5 is an example of captured data stored in the frame buffer 171 when a conventional road surface lighting image 201 is photographed by the camera 125.
  • frames 403A to 403E store captured data generated over time, one frame each.
  • a conventional road surface lighting ECU projects the road surface lighting image with a constant luminance, so frames 403A to 403E store captured images containing a road surface lighting image with a constant luminance value regardless of the display cycle. In the following description, when frames 403A to 403E need not be distinguished, the trailing letters are omitted.
  • the road surface lighting ECU 110 of this embodiment does not project the road surface lighting image 201 with a constant luminance as in the conventional art; instead, it projects a road surface lighting image 201 whose luminance changes over time in a predetermined pattern (hereinafter, a luminance change pattern).
  • the camera ECU 120 identifies, from the captured images, the area whose luminance changes over time according to the luminance change pattern, and adjusts the captured data so that the luminance value of the identified area becomes low.
  • the main SoC 130 displays on the display unit 135 the captured image corresponding to the captured data whose luminance value has been adjusted.
  • as a result, the display unit 135 displays a captured image whose luminance value has been adjusted so that the road surface lighting image 201 is difficult to see.
  • FIG. 6 is a conceptual diagram showing an example of a luminance change pattern. The conventional luminance change pattern 410 is set to a constant (high) luminance regardless of the period.
  • in the luminance change pattern 411, the luminance value for projecting the road surface lighting image 201 is set for each section 412.
  • a section 412 is the minimum unit that makes up the luminance change pattern.
  • the luminance change pattern 411 is, for example, high luminance, high luminance, low luminance, high luminance, low luminance.
  • the road surface lighting ECU 110 of the present disclosure repeats this luminance change pattern 411 as one period while the road surface lighting image 201 is being projected.
  • the camera 125 can capture a predetermined number of frames per second (its frame rate). For example, when using a 30 fps (frames per second) camera 125, it is desirable to set each section 412 to approximately 33 milliseconds (one frame interval) or longer.
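  • as a rough sketch of this timing constraint (a 30 fps camera is assumed; the pattern values and variable names are illustrative, not from the patent):

```python
# One period of the example pattern 411: high, high, low, high, low
# (True = high-luminance section, False = low-luminance section).
PATTERN_411 = [True, True, False, True, False]

FPS = 30.0
FRAME_INTERVAL_MS = 1000.0 / FPS   # ~33.3 ms between frames at 30 fps

# Each section 412 should span at least one frame interval, so that
# the camera captures at least one frame in every section.
SECTION_MS = 34.0
assert SECTION_MS >= FRAME_INTERVAL_MS

# Frames the camera ECU must buffer to cover one full period of the
# pattern (used when securing the frame buffer in step 304).
frames_per_period = len(PATTERN_411) * max(1, round(SECTION_MS / FRAME_INTERVAL_MS))
print(f"buffer at least {frames_per_period} frames for one period")
```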
  • FIG. 7 is a flowchart of processing of the image processing apparatus 100.
  • the road surface lighting ECU 110 is in the state before road surface lighting begins.
  • the camera ECU 120 is in the state before capturing begins.
  • the main SoC 130 is in a state in which it can display on the display unit 135 an image corresponding to the captured data transmitted from the camera ECU 120.
  • when the road surface lighting ECU 110 receives a trigger signal that triggers the start of road surface lighting, it notifies the camera ECU 120 of the luminance change pattern 411 of the road surface lighting (step 301). This communication is performed over the in-vehicle communication bus 101. In-vehicle Ethernet is assumed, but other methods may be used.
  • the road surface lighting ECU 110 starts road surface lighting (step 302), and projects the road surface lighting image 201 whose luminance is changed according to the luminance change pattern 411 notified in step 301 (step 303).
  • the camera ECU 120 receives the luminance change pattern 411 from the road surface lighting ECU 110 (step 304).
  • to identify the area whose luminance changes over time according to the luminance change pattern 411, the camera ECU 120 secures in the memory 122 a frame buffer 171 area capable of storing captured data for one cycle or more of the luminance change pattern 411.
  • the camera ECU 120 starts capturing with the camera 125 (step 305), and stores the captured data generated by capturing in the frame buffer 171 (step 306).
  • FIG. 8 is a conceptual diagram showing captured images stored in the frame buffer 171.
  • the frame buffer 171 stores captured data obtained by continuously capturing road surface lighting images whose luminance changes with time according to the luminance change pattern 411.
  • frames 403A, 403B, and 403D store captured images of the high-luminance road surface lighting images 201A, 201B, and 201D.
  • frames 403C and 403E store captured images of the low-luminance road surface lighting images 201C and 201E.
  • the camera ECU 120 measures the change in the luminance value of the captured image stored in each frame 403, and identifies the area whose luminance changes over time according to the luminance change pattern 411 notified in step 301, that is, the area of the road surface lighting images 201A to 201E (step 307). If no such area can be identified in step 307, the camera ECU 120 proceeds to step 309. If the area is identified in step 307, the camera ECU 120 adjusts the luminance value of the captured image in that area.
  • for example, the camera ECU 120 calculates the difference between the luminance values of the identified area in frame 403A, which corresponds to a high-luminance section of the luminance change pattern 411, and in frame 403C, which corresponds to a low-luminance section. The camera ECU 120 then lowers the luminance values of the captured images stored in frames 403A, 403B, and 403D, which correspond to high-luminance sections, by that difference (step 308).
  • the manner of adjusting the luminance value is not limited to this; any technique may be used as long as the camera ECU 120 lowers the luminance value of the area by a larger amount the higher the luminance of the image projected by the road surface lighting ECU 110.
  • alternatively, the camera ECU 120 may calculate the difference between the luminance value of the area of frame 403A in which the high-luminance road surface lighting image 201A appears and the luminance value of the other areas, and lower the luminance value of the area in which the high-luminance road surface lighting image 201A appears by that difference.
  • the camera ECU 120 may also simply lower the luminance value of the area of frame 403A in which the high-luminance road surface lighting image 201A appears by a predetermined value.
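  • a minimal sketch of steps 307 and 308 under simplifying assumptions: grayscale frames already aligned one-per-section with the pattern, and a simple per-pixel amplitude test. The function names and the threshold are illustrative; the patent does not prescribe a specific algorithm.

```python
import numpy as np

def identify_pattern_region(frames, pattern, threshold=30):
    """Step 307: find pixels whose luminance follows the notified
    pattern, i.e. brighter in high-luminance sections than in low ones.

    frames  -- uint8 array of shape (len(pattern), H, W)
    pattern -- booleans, True = high-luminance section
    Returns a boolean region mask and the high/low difference map.
    """
    pattern = np.asarray(pattern, dtype=bool)
    high_mean = frames[pattern].mean(axis=0).astype(np.int16)
    low_mean = frames[~pattern].mean(axis=0).astype(np.int16)
    diff = high_mean - low_mean
    # Pixels whose high/low swing exceeds the threshold are taken to
    # belong to the road surface lighting image.
    return diff > threshold, diff

def adjust_luminance(frames, pattern, region, diff):
    """Step 308: lower the identified region in the high-luminance
    frames by the measured high/low difference."""
    adjusted = frames.astype(np.int16)
    for i, is_high in enumerate(pattern):
        if is_high:
            adjusted[i][region] -= diff[region]
    return np.clip(adjusted, 0, 255).astype(np.uint8)
```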
  • FIG. 9 is a conceptual diagram showing the captured images, stored in the frame buffer 171, whose luminance values have been adjusted.
  • the frame buffer 171 stores captured data in which the luminance values of the road surface lighting images 201A, 201B, and 201D, which were illuminated with high luminance, have been adjusted to be approximately the same as those of the road surface lighting images 201C and 201E, which were illuminated with low luminance.
  • the camera ECU 120 then checks whether the frame buffer 171 has free space (step 309).
  • step 309 is implemented so that the number of frames required to detect the luminance change pattern in step 307 is temporarily retained in the frame buffer 171. If the frame buffer 171 has free space (step 309: No), the camera ECU 120 returns to step 306 and stores the captured data generated by capturing in the frame buffer 171. If the frame buffer 171 has no free space (step 309: Yes), the camera ECU 120 transmits the oldest captured data stored in the frame buffer 171 to the main SoC 130, deletes it from the frame buffer 171, and makes the area reusable (step 310). The main SoC 130 displays the captured image corresponding to the received captured data on the display unit 135 (step 311).
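  • pulling the pieces together, the loop of steps 306 to 311 might look like the sketch below. It reuses the hypothetical FrameRingBuffer, identify_pattern_region, and adjust_luminance helpers from the earlier sketches; camera.read() and send_to_main_soc() stand in for the camera driver and the in-vehicle bus transfer and are not APIs from the patent.

```python
def capture_loop(camera, ring, pattern, send_to_main_soc):
    while True:
        ring.store(camera.read())                    # step 306
        if ring.filled < len(pattern):
            continue  # less than one period buffered so far
        # Simplification: treats the first slots as one period and
        # ignores the wrap-around ordering of the ring buffer.
        window = ring.frames[:len(pattern)]
        region, diff = identify_pattern_region(window, pattern)  # step 307
        if region.any():                                         # step 308
            ring.frames[:len(pattern)] = adjust_luminance(
                window, pattern, region, diff)
        if ring.is_full():                           # step 309
            send_to_main_soc(ring.oldest())          # step 310 (displayed in 311)
```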
  • FIG. 10 is a functional block diagram of the image processing apparatus 100.
  • the image processing apparatus 100 includes a projection control unit 11, an acquisition unit 12, a storage unit 13, a luminance adjustment unit 14, and a display output unit 15.
  • when the projection control unit 11 receives a trigger signal that triggers the start of road surface lighting from an in-vehicle device (not shown),
  • it projects an image whose luminance changes with time in a predetermined pattern (the road surface lighting image) onto the road surface around the vehicle.
  • the projection control unit 11 is realized by the CPU 111 of the road surface lighting ECU 110 in FIG. 1.
  • the acquisition unit 12 acquires a captured image of the surroundings of the vehicle.
  • the acquisition unit 12 is realized by the CPU 121 of the camera ECU 120 in FIG. 1.
  • the storage unit 13 stores the captured image acquired by the acquisition unit 12.
  • the storage unit 13 is realized by the memory 122 of the camera ECU 120.
  • the luminance adjustment unit 14 identifies, from the captured image stored in the storage unit 13, the area whose luminance changes with time according to the pattern, and adjusts the luminance value of the captured image in that area. The luminance adjustment unit 14 lowers the luminance value of the area by a larger amount the higher the luminance of the image projected by the projection control unit 11.
  • the luminance adjustment unit 14 is realized by the CPU 121 of the camera ECU 120.
  • the display output unit 15 outputs the captured image adjusted by the luminance adjustment unit 14 to the display unit 135.
  • the display output unit 15 is realized by the CPU 131 of the main SoC 130.
  • as described above, the image processing apparatus 100 can display the captured image on the display unit 135 with the luminance value of the area corresponding to the road surface lighting image 201 adjusted to be low. Therefore, even when a captured image containing the road surface lighting image 201 is displayed on the display unit 135, it is difficult for the occupant of the own vehicle 200 to distinguish the area corresponding to the road surface lighting image 201 from the other areas. That is, the image processing apparatus 100 can reduce the occupant's annoyance even when an image capturing the road surface lighting image 201 is displayed on the display unit 135, and it does so without preventing the occupant from recognizing the surrounding situation.
  • moreover, the image processing apparatus 100 does not change the position or size at which the road surface lighting image 201 is projected.
  • therefore, unlike the case where the projected position or size of the road surface lighting image 201 is changed, the attention-calling effect obtained by projecting the road surface lighting image 201 is not reduced.
  • in general, in order to remove a specific object from an image captured by a camera, it is necessary to recognize and extract the object from the captured image. Recently, machine learning has been used to recognize objects of arbitrary shape. However, even when a trained model is used for object recognition, processing with a relatively high computational cost, such as matrix multiplication and summation, is required, so performing it in real time is difficult without a high-performance SoC.
  • in contrast, the image processing apparatus 100 merely projects the road surface lighting image 201 with a luminance change pattern in which the luminance changes over time, and displays captured images in which the luminance value of the area whose luminance changes over time according to that pattern has been adjusted. Therefore, the image processing apparatus 100 can realize, without using a high-cost, high-performance SoC, real-time recognition and removal of the object from the images, fast enough for the occupant to recognize, judge, and act.
  • in Embodiment 1, the road surface lighting ECU 110 notified the camera ECU 120 of the road surface lighting pattern, and the camera ECU 120 identified from the captured images the area whose luminance changes over time according to the luminance change pattern
  • and adjusted the captured data so that the luminance value of the identified area becomes low.
  • depending on the luminance change pattern, however, the camera ECU 120 may not be able to secure enough frame buffer capacity. Therefore, Embodiment 2 describes a mode in which the camera ECU 120 notifies the road surface lighting ECU 110 of identification information indicating the luminance change pattern of the road surface lighting.
  • the hardware configuration (FIG. 1), software configuration (FIG. 2), and functional configuration (FIG. 10) are the same as those of the first embodiment.
  • FIG. 11 is a diagram showing luminance change patterns of road surface lighting associated with identification information.
  • the storage 114 of the road surface lighting ECU 110 and the storage 124 of the camera ECU 120 store luminance change pattern information 420, which indicates luminance change patterns of road surface lighting associated with identification information.
  • in Embodiment 2, the luminance change pattern is selected not by the road surface lighting ECU 110 but by the camera ECU 120. The luminance change pattern information 420 therefore contains only patterns that have been confirmed in advance to be practicable given the performance of the road surface lighting ECU 110.
  • the luminance change pattern information 420 only needs to be shared by the road surface lighting ECU 110 and the camera ECU 120, and may be stored in a storage other than the storages 114 and 124.
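  • conceptually, the luminance change pattern information 420 is a pre-shared lookup table from identification information to patterns. A hypothetical example (the IDs and pattern values are illustrative, not from the patent):

```python
# Pre-shared table held in both storage 114 and storage 124:
# identification information -> luminance change pattern
# (True = high-luminance section, False = low-luminance section).
LUMINANCE_PATTERN_INFO_420 = {
    1: [True, True, False, True, False],
    2: [True, False, True, False, False],
    3: [True, True, True, False, True],
}

# Step 312: the camera ECU selects an ID whose pattern it can buffer
# and transmits only the ID; the road surface lighting ECU looks up
# the same pattern on receipt (steps 302 and 303).
selected_id = 2
pattern = LUMINANCE_PATTERN_INFO_420[selected_id]
```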
  • FIG. 12 is a flowchart of processing of the image processing apparatus 100.
  • the camera ECU 120 selects identification information of a previously shared luminance change pattern of road surface lighting and transmits it to the road surface lighting ECU 110 (step 312).
  • the road surface lighting ECU 110 receives the identification information indicating the luminance change pattern of the road surface lighting from the camera ECU 120.
  • the road surface lighting ECU 110 starts road surface lighting with the luminance change pattern associated with the identification information (step 302). That is, the road surface lighting ECU 110 projects a road surface lighting image whose luminance is changed according to the luminance change pattern associated with the identification information (step 303).
  • the camera ECU 120 secures a frame buffer area capable of storing captured data for one cycle of the luminance change pattern associated with the identification information (step 304). The camera ECU 120 then starts capturing with the camera 125 (step 305). The subsequent operation is the same as in Embodiment 1, so its explanation is omitted.
  • since the camera ECU 120 selects the identification information of a previously shared luminance change pattern and transmits it to the road surface lighting ECU 110, the possibility that the frame buffer capacity cannot be secured is suppressed.
  • as a result, the image processing apparatus 100 can display on the display unit 135 the captured image adjusted so that the luminance value of the area corresponding to the road surface lighting image 201 is low, even without a high-performance camera ECU 120.
  • in Embodiment 1, the camera ECU 120 secured in the memory 122 a frame buffer 171 capable of storing captured data for one cycle or more of the luminance change pattern 411, in order to identify the area whose luminance changes over time according to the pattern.
  • in Embodiment 3, the main SoC 130 instead identifies the area whose luminance changes over time according to the luminance change pattern and adjusts the luminance value of the identified area. This embodiment can be applied when the performance of the camera ECU 120 is low and the memory capacity required for the frame buffer cannot be secured on it.
  • the hardware configuration (FIG. 1), software configuration (FIG. 2), and functional configuration (FIG. 10) are the same as those of the first embodiment.
  • in Embodiment 3, the storage unit 13 is realized by the memory 132 of the main SoC 130.
  • the luminance adjustment unit 14 is realized by the CPU 131 of the main SoC 130.
  • FIG. 13 is a conceptual diagram of the memory 132 of the main SoC 130.
  • the communication library 162 uses part of the memory 132 as a frame buffer 172.
  • the frame buffer 172 is used for temporary storage until the captured images (still images) transmitted from the camera ECU 120 are encoded as video (a moving image).
  • the main SoC 130 sequentially stores the received captured data in the frame buffer 172 one frame at a time. When it has stored captured data up to the end of the frame buffer 172, it returns to the beginning of the frame buffer 172 and resumes storing.
  • the driving support application 160 encodes the captured data stored in the frame buffer 172 as video, one frame or several frames at a time, using common encoding techniques. In the present disclosure, the data before and after encoding are both referred to as "captured data" without distinction.
  • FIG. 14 is a flowchart of processing of the image processing apparatus 100.
  • the road surface lighting ECU 110 is in the state before road surface lighting begins.
  • the camera ECU 120 is in the state before capturing begins.
  • the main SoC 130 is in a state in which it can receive captured data transmitted from the camera ECU 120.
  • the road surface lighting ECU 110 notifies the main SoC 130 of the luminance change pattern of the road surface lighting (step 301). This communication is performed over the in-vehicle communication bus 101. In-vehicle Ethernet is assumed, but other methods may be used.
  • the road surface lighting ECU 110 starts road surface lighting (step 302), and projects a road surface lighting image whose luminance is changed according to the content of the luminance change pattern notified in step 301 (step 303).
  • the main SoC 130 receives the luminance change pattern from the road surface lighting ECU 110 (step 320).
  • to identify the area whose luminance changes over time according to the luminance change pattern, the main SoC 130 secures in the memory 132 a frame buffer 172 area capable of storing captured data for one cycle or more of the pattern. After securing the frame buffer 172 area in the memory 132, the main SoC 130 transmits to the camera ECU 120 a receivable notification indicating that preparations for receiving captured data are complete (step 321).
  • the camera ECU 120 starts capturing with the camera 125 (step 305). After receiving the receivable notification from the main SoC 130, the camera ECU 120 sequentially transmits the captured data generated by capturing to the main SoC 130 without buffering (step 323).
  • the main SoC 130 stores the received captured data in the frame buffer 172 (step 324).
  • the main SoC 130 measures changes in the luminance value of the captured images stored in the frame buffer 172, and identifies the area whose luminance changes over time according to the luminance change pattern notified in step 301 (step 325). If no such area can be identified in step 325, the main SoC 130 proceeds to step 327.
  • if the area is identified, the main SoC 130 adjusts the luminance value of the captured image in that area (step 326). For example, the main SoC 130 replaces the luminance value of the high-luminance region of the oldest frame in the frame buffer with the luminance value of the low-luminance region that appeared immediately before it. The main SoC 130 may also adjust the luminance value of the captured image using the same method as the camera ECU 120 in Embodiment 1.
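  • a minimal sketch of that replacement in step 326, assuming grayscale frames and that the immediately preceding low-luminance frame is available (the function name is illustrative):

```python
import numpy as np

def replace_with_previous_low(oldest_frame, prev_low_frame, region):
    """Step 326 (Embodiment 3): within the identified region, replace
    the luminance of the oldest (high-luminance) frame with the values
    of the low-luminance frame that appeared immediately before it."""
    out = oldest_frame.copy()
    out[region] = prev_low_frame[region]
    return out
```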
  • step 327 is implemented so that the number of frames required to detect the luminance change pattern in step 325 is temporarily retained in the frame buffer 172. If the frame buffer 172 has free space (step 327: No), the main SoC 130 returns to step 324 and stores the captured data generated by capturing in the frame buffer 172. If the frame buffer 172 has no free space (step 327: Yes), the main SoC 130 displays on the display unit 135 the captured image corresponding to the oldest captured data stored in the frame buffer 172, deletes that data from the frame buffer, and makes the area reusable (step 328). The main SoC 130 may display the captured image corresponding to the captured data in accordance with the specifications of the display unit 135.
  • as described above, the image processing apparatus 100 moves the buffering from the camera ECU 120 to the main SoC 130, so that the captured image adjusted so that the luminance value of the area corresponding to the road surface lighting image 201 is low can be displayed on the display unit 135 even without a high-performance camera ECU 120.
  • Embodiment 4. In Embodiment 1, only the case where road surface lighting is performed by the own vehicle was assumed. With the spread of road surface lighting, however, it is conceivable that other vehicles close to the own vehicle 200 also use road surface lighting with similar luminance change patterns.
  • FIG. 15 is a diagram showing a situation in which a plurality of road surface lighting images are captured.
  • in this situation, the camera 125 of the own vehicle 200 captures not only the road surface lighting image 201 projected from the own vehicle 200 but also the road surface lighting image 211 projected from the other vehicle 210.
  • FIG. 16 is an example of photographed data stored in the frame buffer when a plurality of road surface lighting images are photographed by the camera 125.
  • the frame buffer 440 stores captured data obtained by continuously capturing the road surface lighting image 211 of the other vehicle 210 together with the road surface lighting image 201 of the own vehicle 200.
  • if both vehicles use the same luminance change pattern, the image processing apparatus 100 cannot distinguish the road surface lighting images from each other. As a result, as in the captured images stored in buffer 442, the luminance values of the areas corresponding to both road surface lighting images are adjusted to be low, which is a problem.
  • FIG. 17 is a flowchart regarding processing of the image processing apparatus 100 .
  • Embodiment 4 differs from Embodiment 1 in that the image processing apparatus 100 determines that the road surface lighting of another vehicle 210 has been captured when the proportion of the area whose luminance changes over time according to the luminance change pattern 411 increases, and then changes the luminance change pattern 411 of the road surface lighting of the own vehicle 200.
  • the road surface lighting ECU 110 is in the state before road surface lighting begins.
  • the camera ECU 120 is in the state before capturing begins.
  • the main SoC 130 is in a state in which it can display on the display unit 135 an image corresponding to the captured data transmitted from the camera ECU 120.
  • the road surface lighting ECU 110 notifies the camera ECU 120 of the luminance change pattern 411 (step 301).
  • the road surface lighting ECU 110 starts road surface lighting (step 302), and projects the road surface lighting image 201 whose luminance is changed according to the luminance change pattern 411 notified in step 301 (step 303).
  • the camera ECU 120 receives the luminance change pattern 411 from the road surface lighting ECU 110.
  • to identify the area whose luminance changes over time according to the luminance change pattern 411, the camera ECU 120 secures in the memory 122 a frame buffer 440 capable of storing captured data for one cycle or more of the pattern (step 304).
  • the camera ECU 120 starts capturing with the camera 125 (step 305), and stores the captured data generated by capturing in the frame buffer 440 (step 306).
  • the camera ECU 120 measures changes in the luminance value of the captured images stored in the frame buffer 440, and identifies whether there is an area whose luminance changes over time according to the luminance change pattern 411 notified in step 301 (step 307).
  • the camera ECU 120 then determines whether the proportion of the area corresponding to the luminance change pattern 411 in the captured images stored in the frame buffer 440 is greater than a preset threshold (step 331).
  • this threshold is set according to product specifications related to automobile safety standards, using parameters such as the size of the road surface lighting image, the height at which the projection unit 115 is mounted, the projection angle, the projection range, and the permissible proximity between the own vehicle and other vehicles. The threshold may also be set, for example, according to the past average proportion of the area corresponding to the luminance change pattern 411 in the captured images stored in the frame buffer 440.
  • if the proportion is greater than the threshold, the camera ECU 120 determines that, in addition to the road surface lighting of the own vehicle 200, it has captured the road surface lighting of another vehicle 210 projected with the same luminance change pattern 411. The camera ECU 120 then requests the road surface lighting ECU 110 to change the luminance change pattern (step 332). After transmitting all the captured data stored in the frame buffer 440 to the main SoC 130, the camera ECU 120 deletes all the captured data stored in the frame buffer 440 (step 333).
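  • steps 331 to 333 amount to a simple ratio test; a sketch under the same grayscale assumptions as before (the threshold value and function names are illustrative):

```python
def other_vehicle_detected(region, ratio_threshold=0.25):
    """Step 331: compare the proportion of pixels that match the
    luminance change pattern against a preset threshold."""
    ratio = region.mean()  # fraction of pixels in the matching area
    return ratio > ratio_threshold

# Steps 332 and 333 (sketch): on detection, ask the road surface
# lighting ECU for a new pattern and flush the frame buffer.
# if other_vehicle_detected(region):
#     request_pattern_change()  # hypothetical bus message
#     flush_frame_buffer()      # hypothetical buffer reset
```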
  • the road surface lighting ECU 110 changes the luminance change pattern of the own vehicle 200 in accordance with the change request from the camera ECU 120 (step 330). The road surface lighting ECU 110 then notifies the camera ECU 120 of the changed luminance change pattern (step 301), and restarts the road surface lighting (step 302).
  • if no luminance change pattern is detected in step 307 (step 307: No), or if the proportion is determined in step 331 to be below the threshold (step 331: No), the camera ECU 120 executes steps 308 to 310 as in Embodiment 1. The main SoC 130 then displays the captured image corresponding to the received captured data on the display unit 135 (step 311).
  • FIG. 18 is a conceptual diagram showing the captured images, stored in the frame buffer 440, whose luminance values have been adjusted.
  • by changing the luminance change pattern of the own vehicle 200, the image processing apparatus 100 can adjust only the luminance value of the area corresponding to the road surface lighting image 201 of the own vehicle 200, even when road surface lighting images projected from both the own vehicle 200 and the other vehicle 210 are captured. That is, the image processing apparatus 100 can reduce the annoyance the occupant feels when an image containing the road surface lighting image projected from the own vehicle 200 is displayed on the display screen.
  • since Embodiment 4 is independent of Embodiments 2 and 3, it can be combined with them.
  • in Embodiments 1 to 4, it was assumed that the camera ECU 120 and the main SoC 130 are separate devices, but the present invention is also feasible when they are integrated into a single device. In that case, communication over the in-vehicle communication bus is replaced with communication over a bus within the SoC.

Abstract

A road surface lighting ECU 110 projects a road surface lighting image 201, the luminance of which changes with time in a prescribed luminance change pattern, instead of projecting a road surface lighting image 201 with a constant luminance as in the past. A camera ECU 120 then identifies an area in a captured image, the luminance of which changes with time in the luminance change pattern, and adjusts image capture data so that the luminance value of the identified area decreases. A main SoC 130 then displays, on a display unit 135, a captured image corresponding to the image capture data adjusted for the luminance value. As a result, even if the road surface lighting image 201 is reflected in a captured image, the luminance value of the captured image as displayed on the display unit 135 is adjusted so as to make it difficult to visually recognize the road surface lighting image 201.

Description

Image processing device, image processing method, and image processing program
 The present invention relates to an image processing device, an image processing method, and an image processing program.
 Conventionally, there is a technology that assists the driver by displaying images of the outside of the vehicle, captured by a camera, on a display screen inside the vehicle. There is also a technology that conveys information to pedestrians around the vehicle by projecting an image showing, for example, the traveling direction of the vehicle or a warning onto the road surface around the vehicle (Patent Document 1). Hereinafter, an image projected on the road surface around the vehicle is referred to as a "road surface image", and an image of the outside of the vehicle captured by a camera is referred to as a "captured image".
JP 2018-149876 A
 When a road surface image is projected while the camera is capturing, the captured image with the road surface image reflected in it is displayed on the display screen. For example, when the road surface image shows a warning, the image showing the warning is displayed on the display screen inside the vehicle. In this case, it is difficult for the occupant to tell whether the warning information displayed on the display screen is intended for the occupant or for pedestrians outside the vehicle. The occupant may therefore feel annoyed that the captured image with the road surface image reflected in it is displayed on the display screen.
 The present invention is intended to solve the above problems, and aims to reduce this annoyance even when an image obtained by capturing a road surface image projected onto the road surface around the vehicle is displayed on a display screen.
 A display processing device according to the present invention comprises: a projection control unit that projects onto the road surface around a vehicle an image whose luminance changes with time in a predetermined pattern; an acquisition unit that acquires a captured image of the surroundings of the vehicle; a storage unit that stores the captured image acquired by the acquisition unit; a luminance adjustment unit that identifies, from the captured image stored in the storage unit, an area in which the luminance changes with time according to the pattern, and adjusts the luminance value of the captured image in that area; and a display output unit that outputs the captured image adjusted by the luminance adjustment unit to a display unit.
 The present invention has the effect of reducing annoyance even when an image obtained by capturing a road surface image projected onto the road surface around the vehicle is displayed on a display screen.
 FIG. 1 shows the configuration of an image processing apparatus 100 according to an embodiment. FIG. 2 shows software operating on the image processing apparatus 100. FIG. 3 is a conceptual diagram of the memory 122 of the camera ECU 120. FIG. 4 shows an aspect of road surface lighting. FIG. 5 is an example of captured data stored in the frame buffer 171 when a conventional road surface lighting image 201 is photographed by the camera 125. FIG. 6 shows an example of a luminance change pattern. FIG. 7 is a flowchart of processing of the image processing apparatus 100. FIG. 8 shows captured images stored in the frame buffer 171. FIG. 9 shows captured images with adjusted luminance values stored in the frame buffer 171. FIG. 10 is a functional block diagram of the image processing apparatus 100. FIG. 11 shows luminance change patterns of road surface lighting associated with identification information. FIG. 12 is a flowchart of processing of the image processing apparatus 100. FIG. 13 is a conceptual diagram of the memory 132 of the main SoC 130. FIG. 14 is a flowchart of processing of the image processing apparatus 100. FIG. 15 shows a situation in which a plurality of road surface lighting images are captured. FIG. 16 is an example of captured data stored in a frame buffer when a plurality of road surface lighting images are photographed by the camera 125. FIG. 17 is a flowchart of processing of the image processing apparatus 100. FIG. 18 shows captured images with adjusted luminance values stored in the frame buffer 440.
 It is assumed that the image processing device in the present disclosure is installed in an automobile. In the following description, an automobile equipped with the image processing device is referred to as a "vehicle".
Embodiment 1.
 FIG. 1 is a diagram showing the configuration of an image processing apparatus 100 according to this embodiment. The image processing apparatus 100 includes a road surface lighting ECU (Electronic Control Unit) 110, a camera ECU 120, a main SoC (System-on-a-Chip) 130, and an in-vehicle communication bus 101. The road surface lighting ECU 110, the camera ECU 120, and the main SoC 130 are each connected to the in-vehicle communication bus 101. The present invention can also be implemented with other connection topologies for the in-vehicle communication bus 101.
 The road surface lighting ECU 110 includes a CPU (Central Processing Unit) 111 (which may have multiple cores), a memory 112, an interface 113, and a storage 114. The interface 113 is connected to the projection unit 115. The projection unit 115 is, for example, a projector. The projection unit 115 projects onto the road surface around the vehicle an image showing information to be conveyed to people around the vehicle (pedestrians, occupants of other vehicles), such as the traveling direction of the vehicle or a warning. Hereinafter, the image projected on the road surface around the vehicle is referred to as a "road surface lighting image", and projecting the road surface lighting image onto the road surface is referred to as "road surface lighting". The road surface lighting image may be a still image and/or a moving image. The CPU 111 is an arithmetic unit that controls the road surface lighting ECU 110. The memory 112 is, for example, a volatile storage device, and stores programs executed by the CPU 111, stacks, variables, and the like. The storage 114 is, for example, a non-volatile storage device such as a flash memory, and stores data such as programs and image data representing road surface lighting images. In the present embodiment, the image processing apparatus 100 includes the projection unit 115 as part of its configuration, but the projection unit 115 may instead be an external component.
 The camera ECU 120 includes a CPU 121, a memory 122, an interface 123, and a storage 124. The interface 123 is connected to the camera 125. The camera 125 captures images of the surroundings of the vehicle to generate captured data. Therefore, when a road surface lighting image is projected during capture, the camera 125 generates a captured image with the road surface lighting image reflected in it. In the present embodiment, the image processing apparatus 100 includes the camera 125 as part of its configuration, but the camera 125 may instead be an external component. The interface 123 may also be connected to multiple cameras 125.
 The main SoC 130 includes a CPU 131, a memory 132, an interface 133, and a storage 134. The interface 133 is connected to the display unit 135. The display unit 135 is, for example, a liquid crystal screen provided in a car navigation system, a rear-view mirror, or the like. The display unit 135 displays a captured image corresponding to the captured data. In the present embodiment, the image processing apparatus 100 includes the display unit 135 as part of its configuration, but the display unit 135 may instead be an external component.
 FIG. 2 is a conceptual diagram showing software that operates on the image processing apparatus 100. The CPU 111 of the road surface lighting ECU 110 starts an operating system 141 and runs, on it, a communication library 142, a road surface light driver 143, and a road surface lighting application 140. The road surface lighting application 140 drives the road surface light driver 143 to perform road surface lighting.
 The CPU 121 of the camera ECU 120 starts an operating system 151 and runs, on it, a communication library 152, a camera driver 153, and a camera application 150. The camera application 150 drives the camera driver 153 to capture the surroundings of the vehicle, and sequentially transmits the captured data generated by capturing to the main SoC 130.
 The CPU 131 of the main SoC 130 starts an operating system 161 and runs, on it, a communication library 162, a display driver 163, and a driving support application 160. The driving support application 160 receives the captured data transmitted from the camera ECU 120, and drives the display driver 163 to display the captured image corresponding to the captured data on the display unit 135.
 The present invention can be implemented regardless of whether the respective operating systems and communication libraries are of the same or different types.
 図3は、カメラECU120のメモリ122の概念図である。
 カメラドライバ153は、メモリ122の一部分をフレームバッファ171として使用する。フレームバッファ171は、カメラ125によって撮影された撮影画像(静止画像)を映像(動画像)としてエンコードするまでの一時保存のために使用される。カメラアプリ150は、撮影された1コマ単位の撮影データをフレームバッファ171に順次格納する。また、カメラアプリ150は、フレームバッファ171の終端まで撮影データを格納した場合には、フレームバッファ171の先頭に戻って撮影データの格納を再開する。さらに、カメラアプリ150は、フレームバッファ171に格納されている撮影データを1コマもしくは複数コマ単位で映像としてエンコードする。このエンコード手法には一般的な技術が使用される。なお、本開示においては、エンコード前のデータとエンコード後のデータを区別せずに「撮影データ」と称する。
FIG. 3 is a conceptual diagram of the memory 122 of the camera ECU 120. As shown in FIG.
Camera driver 153 uses part of memory 122 as frame buffer 171 . The frame buffer 171 is used for temporary storage until a photographed image (still image) photographed by the camera 125 is encoded as a video (moving image). The camera application 150 sequentially stores the photographed data in units of one frame in the frame buffer 171 . Further, when the camera application 150 stores the photographed data up to the end of the frame buffer 171, it returns to the beginning of the frame buffer 171 and resumes storing the photographed data. Furthermore, the camera application 150 encodes the shooting data stored in the frame buffer 171 as video in units of one frame or multiple frames. Common techniques are used for this encoding method. In the present disclosure, the data before encoding and the data after encoding are referred to as "capture data" without distinguishing between them.
 FIG. 4 is a conceptual diagram showing a road surface lighting image 201 projected from the projection unit 115 onto the road surface around the own vehicle 200. The lower part of FIG. 4 corresponds to the front of the own vehicle 200, and the upper part corresponds to the rear. The projection unit 115 and the camera 125 are provided at the rear of the own vehicle 200. An imaging range 202 indicates the range captured by the camera 125. A non-illuminated area 203 indicates an area that is included in the imaging range 202 but is not illuminated by the road surface lighting image 201. The luminance value measured in the area illuminated by the road surface lighting image 201 is higher than the luminance value measured in the non-illuminated area 203. For this reason, the occupant may feel annoyed when a captured image containing the road surface image is displayed on the display screen. Although the road surface lighting image 201 shows the shape used when the own vehicle 200 turns left while backing up, the projected image and its timing are not limited to this.
 FIG. 5 shows an example of the photographed data stored in the frame buffer 171 when a conventional road surface lighting image 201 is captured by the camera 125. Frames 403A to 403E store the photographed data generated over time, one frame each. A conventional road surface lighting ECU projects the road surface lighting image at a constant luminance, so frames 403A to 403E store captured images containing a road surface lighting image of a constant luminance value regardless of the display cycle. In the following description, the trailing letters are omitted when the frames 403A to 403E need not be distinguished from one another.
 The road surface lighting ECU 110 of the present embodiment does not project the road surface lighting image 201 at a constant luminance as in the conventional art; instead, it projects the road surface lighting image 201 while changing its luminance over time in a predetermined pattern (hereinafter referred to as a luminance change pattern). The camera ECU 120 then identifies, from the captured images, the area whose luminance changes over time according to the luminance change pattern and adjusts the photographed data so that the luminance value of the identified area becomes lower. The main SoC 130 displays, on the display unit 135, the captured images corresponding to the photographed data whose luminance values have been adjusted. As a result, even when the road surface lighting image 201 appears in the captured images, the display unit 135 shows captured images in which the luminance values have been adjusted so that the road surface lighting image 201 is difficult to perceive.
 In FIG. 6, the conventional luminance change pattern 410 is set to a constant (high) luminance regardless of the period. In contrast, in the luminance change pattern 411 of the present disclosure, the luminance value at which the road surface lighting image 201 is projected is set for each section 412, the minimum section constituting the luminance change pattern. The luminance change pattern 411 is expressed, for example, as high, high, low, high, low. The road surface lighting ECU 110 of the present disclosure treats this luminance change pattern 411 as one period and repeats it only while the road surface lighting image 201 is being projected. In general, the camera 125 has a fixed number of frames captured per second (frame rate). For example, when a 30 fps (frames per second) camera 125 is used, each section 412 is desirably set to approximately 33 milliseconds or longer, so that at least one frame is captured per section.
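 The relationship between the frame rate and the minimum section length can be made concrete with a short sketch; the function name and the pattern values are illustrative assumptions, not taken from the patent:

```python
import math

def min_section_ms(frame_rate_fps: float) -> float:
    """Shortest pattern section that guarantees at least one camera frame.

    A section must span at least one frame interval; e.g. a 30 fps camera
    gives 1000 / 30, approximately 33.3 ms, matching the value in the text.
    """
    return 1000.0 / frame_rate_fps

# Hypothetical pattern 411: one luminance level per section, repeated cyclically.
HIGH, LOW = 1.0, 0.2                         # illustrative luminance levels
pattern_411 = [HIGH, HIGH, LOW, HIGH, LOW]
section_ms = math.ceil(min_section_ms(30))   # 34 ms for a 30 fps camera
```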
 FIG. 7 is a flowchart of the processing of the image processing apparatus 100.
 At the start, the road surface lighting ECU 110 is in the state before road surface lighting begins, and the camera ECU 120 is in the state before capturing begins. The main SoC 130 is in a state in which it can display, on the display unit 135, images corresponding to the photographed data transmitted from the camera ECU 120.
 When the road surface lighting ECU 110 receives a trigger signal that triggers the start of road surface lighting, it notifies the camera ECU 120 of the luminance change pattern 411 of the road surface lighting (step 301). This communication takes place over the in-vehicle communication bus 101; in-vehicle Ethernet is assumed, but other methods may be used. The road surface lighting ECU 110 starts road surface lighting (step 302) and projects the road surface lighting image 201 while changing its luminance according to the luminance change pattern 411 notified in step 301 (step 303).
 The camera ECU 120 receives the luminance change pattern 411 from the road surface lighting ECU 110 (step 304). To identify the area whose luminance changes over time according to the luminance change pattern 411, the camera ECU 120 reserves in the memory 122 a frame buffer 171 area capable of storing at least one period of photographed data for the luminance change pattern 411. The camera ECU 120 starts capturing with the camera 125 (step 305) and stores the photographed data generated by the capturing in the frame buffer 171 (step 306).
 FIG. 8 is a conceptual diagram showing the captured images stored in the frame buffer 171.
 The frame buffer 171 stores photographed data obtained by continuously capturing the road surface lighting image whose luminance changes over time according to the luminance change pattern 411. In FIG. 8, frames 403A, 403B, and 403D store captured images of the high-luminance road surface lighting images 201A, 201B, and 201D, while frames 403C and 403E store captured images of the low-luminance road surface lighting images 201C and 201E.
 Returning to FIG. 7, the camera ECU 120 measures the change in the luminance values of the captured images stored in each frame 403 and identifies the area whose luminance changes over time according to the luminance change pattern 411 notified in step 301, that is, the area corresponding to the road surface lighting images 201A to 201E (step 307). If no such area can be identified in step 307, the camera ECU 120 proceeds to step 309. If such an area is identified in step 307, the camera ECU 120 adjusts the luminance values of the captured images in that area. For example, the camera ECU 120 calculates the difference between the luminance values of the area identified in step 307 in frame 403A, which corresponds to the high-luminance part of the luminance change pattern 411, and in frame 403C, which corresponds to the low-luminance part. The camera ECU 120 then adjusts the luminance values of the captured images stored in frames 403A, 403B, and 403D, which correspond to the high-luminance part of the luminance change pattern 411, so that they are lowered by that difference (step 308).
 The manner of adjusting the luminance values is not limited to this; any technique may be used as long as the camera ECU 120 adjusts the area so that the higher the luminance of the image projected by the road surface lighting ECU 110, the larger the amount by which the luminance value of the area is lowered. For example, the camera ECU 120 may calculate the difference between the luminance value of the area of frame 403A in which the high-luminance road surface lighting image 201A was captured and the luminance value of the remaining area, and lower the luminance value of the former area by that difference. Alternatively, the camera ECU 120 may lower the luminance value of the area of frame 403A in which the high-luminance road surface lighting image 201A was captured by a predetermined value.
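 As an illustration of steps 307 and 308, the following sketch identifies the pattern-synchronous area and lowers its luminance by the high/low difference. It assumes the frames are NumPy luminance arrays, one per pattern section, with per-frame high/low labels known from the notified pattern containing both levels; none of these names come from the patent:

```python
import numpy as np

def adjust_lighting_area(frames, pattern, tolerance=10):
    """frames: list of HxW uint8 luminance images, one per pattern section.
    pattern: list of 'high'/'low' labels (the notified luminance change pattern).
    Returns frames with the pattern-synchronous area darkened (step 308)."""
    stack = np.stack([f.astype(np.int16) for f in frames])
    high = stack[[i for i, p in enumerate(pattern) if p == 'high']].mean(axis=0)
    low = stack[[i for i, p in enumerate(pattern) if p == 'low']].mean(axis=0)
    diff = high - low
    # Step 307: pixels whose luminance tracks the pattern (high clearly above low).
    region = diff > tolerance
    out = []
    for f, p in zip(frames, pattern):
        g = f.astype(np.int16)
        if p == 'high':
            # Step 308: lower the identified area by the high/low difference.
            g[region] -= diff[region].astype(np.int16)
        out.append(np.clip(g, 0, 255).astype(np.uint8))
    return out
```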
 FIG. 9 is a conceptual diagram showing the captured images with adjusted luminance values stored in the frame buffer 171. The frame buffer 171 stores photographed data adjusted so that the luminance values of the road surface lighting images 201A, 201B, and 201D, which were projected at high luminance, are about the same as the luminance values of the road surface lighting images 201C and 201E, which were projected at low luminance.
 Returning to FIG. 7, the camera ECU 120 checks whether the frame buffer 171 has free space (step 309). Step 309 is performed to temporarily retain, in the frame buffer 171, the number of frames required to detect the luminance change pattern in step 307. If the frame buffer 171 has free space (step 309: No), the camera ECU 120 returns to step 306 to store the photographed data generated by capturing in the frame buffer 171. If the frame buffer 171 has no free space (step 309: Yes), the camera ECU 120 transmits the oldest photographed data stored in the frame buffer 171 to the main SoC 130 and then deletes that photographed data so that the space can be reused (step 310). The main SoC 130 displays the captured image corresponding to the received photographed data on the display unit 135 (step 311).
 FIG. 10 is a functional block diagram of the image processing apparatus 100.
 The image processing apparatus 100 includes a projection control unit 11, an acquisition unit 12, a storage unit 13, a luminance adjustment unit 14, and a display output unit 15. When the projection control unit 11 receives, from an in-vehicle device (not shown), a trigger signal that triggers the start of road surface lighting, it projects onto the road surface around the vehicle an image whose luminance changes over time in a predetermined pattern (the road surface lighting image). The projection control unit 11 is realized by the CPU 111 of the road surface lighting ECU 110 in FIG. 1. The acquisition unit 12 acquires captured images of the surroundings of the vehicle and is realized by the CPU 121 of the camera ECU 120 in FIG. 1. The storage unit 13 stores the captured images acquired by the acquisition unit 12; in the present embodiment, it is realized by the memory 122 of the camera ECU 120. The luminance adjustment unit 14 identifies, from the captured images stored in the storage unit 13, the area whose luminance changes over time according to the pattern and adjusts the luminance values of the captured images in that area; it performs the adjustment so that the higher the luminance of the image projected by the projection control unit 11, the larger the amount by which the luminance value of the area is lowered. In the present embodiment, the luminance adjustment unit 14 is realized by the CPU 121 of the camera ECU 120. The display output unit 15 outputs the captured images adjusted by the luminance adjustment unit 14 to the display unit 135 and is realized by the CPU 131 of the main SoC 130.
 With the above processing, even when the road surface lighting image 201 is captured by the camera 125, the image processing apparatus 100 can display on the display unit 135 captured images adjusted so that the luminance value of the area corresponding to the road surface lighting image 201 is lowered. Therefore, even when a captured image containing the road surface lighting image 201 is displayed on the display unit 135, it is difficult for the occupant of the own vehicle 200 to distinguish the area corresponding to the road surface lighting image 201 from the other areas. In other words, the image processing apparatus 100 can reduce the occupant's annoyance even when an image containing the road surface lighting image 201 is displayed on the display unit 135, and it can avoid hindering the occupant's grasp of the surrounding situation. Moreover, the image processing apparatus 100 does not change the position or size at which the road surface lighting image 201 is projected; unlike approaches that change the projected position or size, it therefore does not weaken the alerting effect obtained by projecting the road surface lighting image 201. In general, removing a specific object from a captured image requires recognizing and extracting the object from the image. Recently, machine learning has come to be used for recognizing objects of arbitrary shapes. However, even when a trained model is used for object recognition, relatively computation-heavy processing such as matrix multiply-accumulate operations is performed, so recognition at near-real-time speed is difficult on low-cost, low-performance embedded SoCs. Furthermore, to recognize an object and remove it from the captured image shown on the display unit within the short time in which the driver must perceive, judge, and act, near-real-time object recognition processing would be required. In contrast, the image processing apparatus 100 merely projects the road surface lighting image 201 with a luminance change pattern whose luminance changes over time and displays captured images in which the luminance values of the area changing according to that pattern have been adjusted. The image processing apparatus 100 can therefore achieve object recognition and removal from the video with enough real-time performance for the occupant to perceive, judge, and act, without using an expensive, high-performance SoC.
Embodiment 2.
 In Embodiment 1, the road surface lighting ECU 110 notifies the camera ECU 120 of the luminance change pattern, and the camera ECU 120 identifies from the captured images the area whose luminance changes over time according to that pattern and adjusts the photographed data so that the luminance value of the identified area becomes lower. However, depending on the complexity of the luminance change pattern, the camera ECU 120 may be unable to secure enough frame buffer capacity. Embodiment 2 therefore describes a mode in which the camera ECU 120 notifies the road surface lighting ECU 110 of identification information indicating the luminance change pattern of the road surface lighting. The hardware configuration (FIG. 1), software configuration (FIG. 2), and functional configuration (FIG. 10) are the same as in Embodiment 1.
 FIG. 11 shows luminance change patterns of the road surface lighting associated with identification information. The storage 114 of the road surface lighting ECU 110 and the storage 124 of the camera ECU 120 store luminance change pattern information 420, which associates each piece of identification information with a luminance change pattern of the road surface lighting. In Embodiment 2, unlike Embodiment 1, the luminance change pattern is specified not by the road surface lighting ECU 110 but by the camera ECU 120. The luminance change pattern information 420 is therefore information that has been confirmed in advance to be feasible given the performance of the road surface lighting ECU 110. The luminance change pattern information 420 only needs to be shared between the road surface lighting ECU 110 and the camera ECU 120, and may be stored in storage other than the storages 114 and 124.
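 One way to realize the shared luminance change pattern information 420 is a simple table keyed by identification information, as sketched below with hypothetical IDs and luminance levels:

```python
# Luminance change pattern information 420: identification information -> pattern.
# Each pattern lists one luminance level per section 412; every entry is assumed
# to have been confirmed in advance as feasible for the road surface lighting ECU 110.
PATTERN_TABLE = {
    0x01: ["high", "high", "low", "high", "low"],
    0x02: ["high", "low", "high", "low", "low"],
    0x03: ["high", "low", "low", "high", "high"],
}

def frames_per_period(pattern_id: int) -> int:
    """Frame buffer capacity needed for one period, at one frame per section."""
    return len(PATTERN_TABLE[pattern_id])
```

 Holding the same table on both ECUs lets the camera ECU 120 choose a pattern whose period it can actually buffer, which is the point of this embodiment.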
 FIG. 12 is a flowchart of the processing of the image processing apparatus 100.
 The camera ECU 120 selects identification information for one of the previously shared luminance change patterns of the road surface lighting and transmits it to the road surface lighting ECU 110 (step 312).
 The road surface lighting ECU 110 receives the identification information indicating the luminance change pattern of the road surface lighting from the camera ECU 120 and starts road surface lighting with the luminance change pattern associated with that identification information (step 302). That is, the road surface lighting ECU 110 projects the road surface lighting image while changing its luminance according to the luminance change pattern associated with the identification information (step 303).
 The camera ECU 120 reserves a frame buffer area capable of storing one period of photographed data for the luminance change pattern associated with the identification information (step 304), and then starts capturing with the camera 125 (step 305). The subsequent operations are the same as in Embodiment 1, so their description is omitted.
 As described above, by selecting identification information for a previously shared luminance change pattern of the road surface lighting and transmitting it to the road surface lighting ECU 110, the camera ECU 120 reduces the possibility of being unable to secure frame buffer capacity. In other words, even without a high-performance camera ECU 120, the image processing apparatus 100 can display on the display unit 135 captured images adjusted so that the luminance value of the area corresponding to the road surface lighting image 201 is lowered.
Embodiment 3.
 In Embodiment 1, the camera ECU 120 reserved in the memory 122 a frame buffer 171 area capable of storing at least one period of photographed data for the luminance change pattern 411 in order to identify the area whose luminance changes over time according to that pattern. In the present embodiment, it is instead the main SoC 130 that identifies the area whose luminance changes over time according to the luminance change pattern and adjusts the luminance values of the identified area. This embodiment is applicable, for example, when the performance of the camera ECU 120 is too low to secure the memory capacity required for the frame buffer. The hardware configuration (FIG. 1), software configuration (FIG. 2), and functional configuration (FIG. 10) are the same as in Embodiment 1, except that in this embodiment the storage unit 13 is realized by the memory 132 of the main SoC 130 and the luminance adjustment unit 14 is realized by the CPU 131 of the main SoC 130.
 FIG. 13 is a conceptual diagram of the memory 132 of the main SoC 130.
 The communication library 162 uses part of the memory 132 as a frame buffer 172. The frame buffer 172 provides temporary storage until the captured images (still images) transmitted from the camera ECU 120 are encoded into video (moving images). The main SoC 130 sequentially stores the received photographed data in the frame buffer 172 one frame at a time; when the photographed data reaches the end of the frame buffer 172, it returns to the beginning of the frame buffer 172 and resumes storing the photographed data. The driving support application 160 encodes the photographed data stored in the frame buffer 172 into video in units of one frame or of multiple frames, using a common encoding technique. As noted above, the data before encoding and the data after encoding are both referred to as "photographed data" without distinction.
 FIG. 14 is a flowchart of the processing of the image processing apparatus 100.
 At the start, the road surface lighting ECU 110 is in the state before road surface lighting begins, and the camera ECU 120 is in the state before capturing begins. The main SoC 130 is in a state in which it can receive the photographed data transmitted from the camera ECU 120.
 The road surface lighting ECU 110 notifies the main SoC 130 of the luminance change pattern of the road surface lighting (step 301). This communication takes place over the in-vehicle communication bus 101; in-vehicle Ethernet is assumed, but other methods may be used. The road surface lighting ECU 110 starts road surface lighting (step 302) and projects the road surface lighting image while changing its luminance according to the luminance change pattern notified in step 301 (step 303).
 The main SoC 130 receives the luminance change pattern from the road surface lighting ECU 110 (step 320). To identify the area whose luminance changes over time according to the luminance change pattern, the main SoC 130 reserves in the memory 132 a frame buffer 172 area capable of storing at least one period of photographed data for the pattern. After reserving the frame buffer 172 area in the memory 132, the main SoC 130 transmits to the camera ECU 120 a ready-to-receive notification indicating that preparations for receiving photographed data are complete (step 321).
 The camera ECU 120 starts capturing with the camera 125 (step 305). After receiving the ready-to-receive notification from the main SoC 130, the camera ECU 120 sequentially transmits the photographed data generated by capturing to the main SoC 130 without buffering it (step 323).
 The main SoC 130 stores the received photographed data in the frame buffer 172 (step 324). The main SoC 130 measures the change in the luminance values of the captured images stored in the frame buffer 172 and identifies the area whose luminance changes over time according to the luminance change pattern notified in step 301 (step 325). If no such area can be identified in step 325, the main SoC 130 proceeds to step 327. If such an area is identified in step 325, the main SoC 130 adjusts the luminance values of the captured images in that area (step 326). For example, the main SoC 130 replaces the luminance values of the high-luminance area of the oldest frame in the frame buffer with the luminance values of the low-luminance area that appeared immediately before it. Alternatively, the main SoC 130 may adjust the luminance values of the captured images using the same method as the camera ECU 120.
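 A minimal sketch of the replacement in step 326 follows. It assumes NumPy luminance arrays, and it approximates the "low-luminance area that appeared immediately before" by the nearest low-luminance frame still retained in the buffer; the names are illustrative, not taken from the patent:

```python
import numpy as np

def replace_with_low(frames, labels, region):
    """frames: HxW uint8 luminance arrays, oldest first; labels: 'high'/'low' per frame.
    region: boolean HxW mask identified in step 325 (pattern-synchronous area).
    Replaces the region of the oldest frame, if it is a high-luminance frame,
    with the same region taken from the nearest low-luminance frame in the buffer."""
    if labels[0] != "high":
        return frames[0]                       # already low luminance; nothing to do
    low_idx = next((i for i, lb in enumerate(labels) if lb == "low"), None)
    if low_idx is None:
        return frames[0]                       # no low-luminance frame retained yet
    out = frames[0].copy()
    out[region] = frames[low_idx][region]      # step 326: substitute low-luminance values
    return out
```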
 Next, the main SoC 130 checks whether the frame buffer has free space (step 327). Step 327 is performed to temporarily retain, in the frame buffer 172, the number of frames required to detect the luminance change pattern in step 325. If the frame buffer 172 has free space (step 327: No), the main SoC 130 returns to step 324 to store the photographed data generated by capturing in the frame buffer 172. If the frame buffer 172 has no free space (step 327: Yes), the main SoC 130 displays on the display device the captured image corresponding to the oldest photographed data stored in the frame buffer 172 and deletes that frame buffer area so that it can be reused (step 328). The main SoC 130 displays the captured images corresponding to the photographed data on the display unit 135 in accordance with the specifications of the display unit 135.
 As described above, by having the main SoC 130 take over the buffering performed by the camera ECU 120, the image processing apparatus 100 can display on the display unit 135 captured images adjusted so that the luminance value of the area corresponding to the road surface lighting image 201 is lowered, even without a high-performance camera ECU 120.
Embodiment 4.
 Embodiment 1 assumed that road surface lighting is performed only by the own vehicle. As road surface lighting becomes widespread, however, another vehicle near the own vehicle 200 may use a similar luminance change pattern.
 FIG. 15 shows a situation in which a plurality of road surface lighting images are captured.
 When the own vehicle 200 and another vehicle 210 each project a road surface lighting image, the camera 125 of the own vehicle 200 captures not only the road surface lighting image 201 projected from the own vehicle 200 but also the road surface lighting image 211 projected from the other vehicle 210.
 FIG. 16 shows an example of the photographed data stored in the frame buffer when a plurality of road surface lighting images are captured by the camera 125. The frame buffer 440 stores photographed data obtained by continuously capturing the road surface lighting image 211 of the other vehicle 210 and the road surface lighting image 201 of the own vehicle 200. If both road surface lighting images are projected with the same luminance change pattern, the image processing apparatus 100 cannot distinguish them. This raises the problem that the image processing apparatus 100 lowers the luminance values of the areas corresponding to both road surface lighting images, as in the captured images stored in the buffer 442.
 As a countermeasure, a method is described below that replaces only the road surface lighting image 201 of the own vehicle 200 with low luminance, even when the camera captures not only the road surface lighting image 201 projected from the own vehicle 200 but also the road surface lighting image 211 projected from the other vehicle 210.
 The hardware configuration (FIG. 1), software configuration (FIG. 2), and functional configuration (FIG. 10) are the same as in Embodiment 1.
 FIG. 17 is a flowchart of the processing of the image processing apparatus 100.
 The difference is that, when the proportion occupied by the area whose luminance changes over time according to the luminance change pattern 411 increases, the image processing apparatus 100 judges that road surface lighting of another vehicle 210 has been captured and changes the luminance change pattern 411 of the road surface lighting of the own vehicle 200.
 At the start, the road surface lighting ECU 110 is in the state before road surface lighting begins, and the camera ECU 120 is in the state before capturing begins. The main SoC 130 is in a state in which it can display, on the display unit 135, images corresponding to the photographed data transmitted from the camera ECU 120.
 The road surface lighting ECU 110 notifies the camera ECU 120 of the luminance change pattern 411 (step 301), starts road surface lighting (step 302), and projects the road surface lighting image 201 while changing its luminance according to the luminance change pattern 411 notified in step 301 (step 303).
 The camera ECU 120 receives the luminance change pattern 411 from the road surface lighting ECU 110. To identify the area whose luminance changes over time according to the luminance change pattern 411, the camera ECU 120 reserves in the memory 122 a frame buffer 440 capable of storing at least one period of photographed data for the pattern (step 304). The camera ECU 120 starts capturing with the camera 125 (step 305) and stores the photographed data generated by the capturing in the frame buffer 440 (step 306).
 The camera ECU 120 measures the change in the luminance values of the captured images stored in the frame buffer 440 and determines whether there is an area whose luminance changes over time according to the luminance change pattern 411 notified in step 301 (step 307).
 The camera ECU 120 further determines whether the proportion of the captured images stored in the frame buffer 440 that is occupied by the area corresponding to the luminance change pattern 411 is larger than a preset threshold (step 331). This threshold is set in accordance with product specifications based on automotive safety standards, using as parameters the size of the road surface lighting image, the height at which the projection unit 115 is mounted, the projection angle, the projection range, and the allowable range of proximity between the own vehicle and other vehicles. The threshold may also be set, for example, according to the past average proportion of the area corresponding to the luminance change pattern 411 in the captured images stored in the frame buffer 440.
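 The occupancy check of step 331 reduces to comparing the share of pattern-synchronous pixels against the threshold, as in this sketch; the mask is assumed to be available from step 307, and the threshold value itself would come from the product specification described above:

```python
import numpy as np

def other_vehicle_suspected(region: np.ndarray, threshold: float) -> bool:
    """region: boolean HxW mask of pixels tracking the luminance change pattern 411.
    Returns True when the mask occupies a larger share of the image than the
    preset threshold, i.e. when another vehicle's road lighting is suspected."""
    ratio = float(region.sum()) / region.size
    return ratio > threshold
```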
 If the proportion is determined to be larger than the threshold (step 331: Yes), the camera ECU 120 judges that, in addition to the road surface lighting of the own vehicle 200, road surface lighting of another vehicle 210 projected with the same luminance change pattern 411 is being captured. The camera ECU 120 then notifies the road surface lighting ECU 110 of a change of luminance change pattern (step 332). After transmitting all the photographed data stored in the frame buffer 440 to the main SoC 130, the camera ECU 120 deletes all the photographed data stored in the frame buffer 440 (step 333).
 In accordance with the request from the camera ECU 120 to change the luminance change pattern, the road surface lighting ECU 110 changes the luminance change pattern of the own vehicle 200 (step 330). The road surface lighting ECU 110 then notifies the camera ECU 120 of the changed luminance change pattern (step 301) and resumes road surface lighting (step 302).
 If no luminance change pattern is detected in step 307 (step 307: No), or if the proportion is determined to be at or below the threshold in step 331 (step 331: No), the camera ECU 120 executes steps 308 to 310 as in Embodiment 1, and the main SoC 130 displays the captured images corresponding to the received photographed data on the display unit 135 (step 311).
 FIG. 19 is a conceptual diagram showing the captured images with adjusted luminance values stored in the frame buffer 440.
 As shown in the figure, by changing the luminance change pattern of the own vehicle 200, the image processing apparatus 100 can adjust only the luminance values of the area corresponding to the road surface lighting image 201 of the own vehicle 200, even when road surface lighting images projected from both the own vehicle 200 and the other vehicle 210 are captured. In other words, the image processing apparatus 100 can reduce the annoyance the occupant feels when an image corresponding to the road surface lighting image projected from the own vehicle 200 is displayed on the display screen.
 Since Embodiment 4 is independent of Embodiments 2 and 3, it can also be implemented in combination with either of them.
 Embodiments 1, 2, 3, and 4 assume that the camera ECU 120 and the main SoC 130 are separate devices; however, if an SoC with enough performance to integrate them is used, the present invention can also be implemented on a single SoC. In that case, communication over the in-vehicle communication bus is replaced by communication over a bus within the SoC.
 11 projection control unit, 12 acquisition unit, 13 storage unit, 14 luminance adjustment unit, 15 display output unit, 100 image processing apparatus, 101 in-vehicle communication bus, 112 memory, 113 interface, 114 storage, 115 projection unit, 122 memory, 123 interface, 124 storage, 125 camera, 132 memory, 133 interface, 134 storage, 135 display unit, 140 road surface lighting application, 141 operating system, 142 communication library, 143 road surface light driver, 150 camera application, 151 operating system, 152 communication library, 153 camera driver, 160 driving support application, 161 operating system, 162 communication library, 163 display driver, 171 frame buffer, 172 frame buffer, 200 own vehicle, 201 road surface lighting image, 202 imaging range, 203 non-illuminated area, 210 other vehicle, 211 road surface lighting image, 301 luminance change pattern, 410 luminance change pattern, 411 luminance change pattern, 412 section, 420 luminance change pattern information, 440 frame buffer, 442 buffer.

Claims (7)

  1. An image processing apparatus comprising:
     a projection control unit that projects, onto the road surface around a vehicle, an image whose luminance changes over time in a predetermined pattern;
     an acquisition unit that acquires captured images of the surroundings of the vehicle;
     a storage unit that stores the captured images acquired by the acquisition unit;
     a luminance adjustment unit that identifies, from the captured images stored in the storage unit, an area whose luminance changes over time according to the pattern and adjusts the luminance value of the captured images in the area; and
     a display output unit that outputs the captured images adjusted by the luminance adjustment unit to a display unit.
  2. The image processing apparatus according to claim 1, wherein the luminance adjustment unit performs the adjustment such that the higher the luminance of the image projected by the projection control unit, the larger the amount by which the luminance value of the area is lowered.
  3. The image processing apparatus according to claim 1 or 2, wherein the luminance adjustment unit notifies the projection control unit of a change of the pattern when the proportion of the captured images stored in the storage unit occupied by the area is larger than a threshold.
  4. The image processing apparatus according to any one of claims 1 to 3, wherein the projection control unit notifies the luminance adjustment unit of the pattern with which the luminance is changed, and the luminance adjustment unit identifies the area based on the pattern notified by the projection control unit.
  5. The image processing apparatus according to any one of claims 1 to 3, wherein the luminance adjustment unit notifies the projection control unit of the pattern with which the luminance is changed, and the projection control unit changes the luminance based on the pattern notified by the luminance adjustment unit.
  6. An image processing method comprising:
     a projection control step of projecting, onto the road surface around a vehicle, an image whose luminance changes over time in a predetermined pattern;
     an acquisition step of acquiring captured images of the surroundings of the vehicle;
     a storage step of storing the captured images acquired in the acquisition step;
     a luminance adjustment step of identifying, from the captured images stored in the storage step, an area whose luminance changes over time according to the pattern and adjusting the luminance value of the captured images in the area; and
     a display output step of outputting the captured images adjusted in the luminance adjustment step to a display unit.
  7. An image processing program causing a computer to execute:
     a projection control step of projecting, onto the road surface around a vehicle, an image whose luminance changes over time in a predetermined pattern;
     an acquisition step of acquiring captured images of the surroundings of the vehicle;
     a storage step of storing the captured images acquired in the acquisition step;
     a luminance adjustment step of identifying, from the captured images stored in the storage step, an area whose luminance changes over time according to the pattern and adjusting the luminance value of the captured images in the area; and
     a display output step of outputting the captured images adjusted in the luminance adjustment step to a display unit.
PCT/JP2021/008973 2021-03-08 2021-03-08 Image processing device, image processing method, and image processing program WO2022190166A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021557564A JP7031799B1 (en) 2021-03-08 2021-03-08 Image processing device, image processing method, and image processing program
PCT/JP2021/008973 WO2022190166A1 (en) 2021-03-08 2021-03-08 Image processing device, image processing method, and image processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/008973 WO2022190166A1 (en) 2021-03-08 2021-03-08 Image processing device, image processing method, and image processing program

Publications (1)

Publication Number Publication Date
WO2022190166A1 true WO2022190166A1 (en) 2022-09-15

Family

ID=81212888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008973 WO2022190166A1 (en) 2021-03-08 2021-03-08 Image processing device, image processing method, and image processing program

Country Status (2)

Country Link
JP (1) JP7031799B1 (en)
WO (1) WO2022190166A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023014606A (en) * 2021-07-19 2023-01-31 株式会社小糸製作所 Control device of vehicle

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120044090A1 (en) * 2010-08-18 2012-02-23 GM Global Technology Operations LLC Motor vehicle with digital projectors
JP2019051793A (en) * 2017-09-14 2019-04-04 株式会社Subaru Road surface drawing device for vehicle
JP2020121615A (en) * 2019-01-30 2020-08-13 クラリオン株式会社 Vehicular road sign recognition support device

Also Published As

Publication number Publication date
JPWO2022190166A1 (en) 2022-09-15
JP7031799B1 (en) 2022-03-08

Legal Events

ENP Entry into the national phase (Ref document number: 2021557564; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21930025; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21930025; Country of ref document: EP; Kind code of ref document: A1)