WO2023286806A1 - Stationary object information acquisition device, program, and stationary object information acquisition method - Google Patents
Stationary object information acquisition device, program, and stationary object information acquisition method
- Publication number
- WO2023286806A1 (PCT/JP2022/027575)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stationary object
- vehicle
- image
- image data
- object information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y10/00—Economic sectors
- G16Y10/40—Transportation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/10—Detection; Monitoring
Definitions
- the present disclosure relates to a stationary object information acquisition device, a program, and a stationary object information acquisition method.
- Patent Literature 1 describes detecting a forward vehicle and controlling forward light distribution.
- ADB light distribution control is based on target information sent from the vehicle.
- Each target is detected by a specific algorithm based on the data acquired by sensors such as cameras. In such detection, errors can occur: a detection result may be produced where no target should be detected (excessive detection), or a target may be detected even though it does not exist (false detection).
- on the road, there are stationary objects with high brightness, such as street lights and signs.
- these stationary objects may be mistakenly recognized as the vehicle ahead.
- conversely, the headlamps of the vehicle ahead may be erroneously recognized as street lights. If information on stationary objects such as street lights and signs on the road can be collected, it is useful because it can be used to reduce the possibility of such misrecognition.
- the purpose of this disclosure is to collect stationary object information on stationary objects such as street lights and signs on the road.
- a stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle; a specifying unit that specifies, based on the image data, stationary object information including at least one of (i) stationary object image data corresponding to an image, or to a portion of an image, in which one or more stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail exist, and (ii) stationary object position information, calculated based on the image data, indicating the position of the stationary object; and a transmission unit that transmits, to a storage unit, the stationary object information together with vehicle position information of the vehicle obtained from a position information acquisition unit mounted on the vehicle, the vehicle position information being that at the time when the image corresponding to the image data for which the stationary object information was specified was captured. The stationary object information acquisition device is installed in the vehicle.
- a program is executed in a computer device that includes a processor and is mounted on a vehicle. The program causes the processor to execute: an image acquisition step of acquiring image data of an image captured by a sensor unit mounted on the vehicle; a specifying step of specifying, based on the image data, stationary object information including at least one of (i) stationary object image data corresponding to an image, or to a portion of an image, in which one or more stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail exist, and (ii) stationary object position information, calculated based on the image data, indicating the position of the stationary object; and a transmission step of transmitting, to a storage unit, the stationary object information together with vehicle position information of the vehicle obtained from a position information acquisition unit mounted on the vehicle, the vehicle position information being that at the time when the image corresponding to the image data for which the stationary object information was specified was captured.
- a stationary object information acquisition method is executed by a computer device that includes a processor and is mounted on a vehicle.
- the stationary object information acquisition method includes: an image acquisition step of acquiring image data of an image captured by a sensor unit mounted on the vehicle; a specifying step of specifying, based on the image data, stationary object information including at least one of (i) stationary object image data corresponding to an image, or to a portion of an image, in which one or more stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail exist, and (ii) stationary object position information, calculated based on the image data, indicating the position of the stationary object; and a transmission step of transmitting, to a storage unit, the stationary object information together with vehicle position information of the vehicle obtained from a position information acquisition unit mounted on the vehicle, the vehicle position information being that at the time when the image corresponding to the image data for which the stationary object information was specified was captured.
- FIG. 1 is a schematic diagram showing an example of a system including a stationary object information acquisition device according to an embodiment of the present disclosure
- FIG. 1 is a block diagram showing an example of a system including a stationary object information acquisition device according to an embodiment of the present disclosure
- FIG. 6 is a flow chart showing an example of a method for acquiring stationary object information according to an embodiment of the present disclosure
- FIG. 4 is a flowchart showing an example of processing for identifying stationary object information shown in FIG. 3
- FIG. 5 is a schematic diagram for explaining stationary object position information
- FIGS. 4A and 4B are schematic diagrams showing imaging timings and the image data acquired at each imaging timing
- FIG. 7 is a flowchart showing an example of determination processing of stationary object information
- FIG. 10 is a schematic diagram showing an example of reference image data used for specifying processing of stationary object information
- FIG. 10 is a schematic diagram showing an example of image data used for specifying processing of stationary object information
- FIG. 10 is a schematic diagram showing an example of image data used for specifying processing of stationary object information
- FIG. 1 is a schematic diagram illustrating a system 1 according to one embodiment of the present disclosure.
- the system 1 includes a stationary object information storage device 200 and a plurality of vehicles 2 such as vehicles 2A and 2B on which stationary object information acquisition devices 100 are respectively mounted.
- the stationary object information storage device 200 and each vehicle 2 can be communicatively connected to each other by wireless communication.
- the stationary object information acquisition device 100 acquires stationary object information about stationary objects and transmits the stationary object information to the stationary object information storage device 200 .
- the stationary object information storage device 200 accumulates stationary object information received from each stationary object information acquisition device 100, for example.
- the stationary object information storage device 200 can, for example, analyze the received stationary object information to improve its accuracy, acquire more detailed information, or create a light distribution pattern based on the stationary object information.
- the stationary object information storage device 200 also transmits the stationary object information with improved accuracy to each vehicle 2 in response to a request from each vehicle 2, for example.
- each vehicle 2, for example, by using the improved-accuracy stationary object information received from the stationary object information storage device 200, can improve the accuracy and efficiency of target detection and appropriately perform light distribution control of its headlights.
- the "stationary object” in the present embodiment refers to an object that is fixed to the road and has a high brightness, and specifically includes a self-luminous body (for example, a street light, a traffic signal, etc.), a sign, a delineator, and guardrails. That is, the stationary object information acquiring apparatus 100 in this embodiment acquires stationary object information about various stationary objects given as the above specific examples. Note that, as another embodiment, the stationary object information acquisition apparatus 100 is an object that is not included in the above specific examples, but is fixed to the road, has high brightness, and can affect target detection. It may be configured to be identifiable as an object.
- FIG. 2 is a block diagram showing system 1 according to one embodiment of the present disclosure.
- the vehicle 2 includes a vehicle ECU (Electronic Control Unit) 10, a storage section 20, a sensor section 31, a position information acquisition section 32, an illuminance sensor 33, and a stationary object information acquisition device 100.
- the vehicle 2 can communicate with the stationary object information storage device 200 by wireless communication via the communication network 3 .
- the means of wireless communication is not particularly limited; for example, mobile communication systems such as automotive telematics, cooperation with smartphones, or in-vehicle Wi-Fi may be used.
- the vehicle ECU 10 controls various operations such as running of the vehicle 2 .
- the vehicle ECU 10 includes, for example, a processor such as an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a general-purpose CPU (Central Processing Unit).
- the storage unit 20 includes, for example, a ROM (Read Only Memory) storing various vehicle control programs and a RAM (Random Access Memory) temporarily storing various vehicle control data.
- the processor of the vehicle ECU 10 loads a program designated from among the various vehicle control programs stored in the ROM onto the RAM, and controls various operations of the vehicle 2 in cooperation with the RAM.
- the sensor unit 31 outputs image data of an image of the exterior of the vehicle 2.
- the sensor unit 31 includes, for example, one or more sensors of a visible camera, LiDAR, and millimeter wave radar. Image data output by LiDAR and millimeter wave radar can be three-dimensional image data.
- the position information acquisition unit 32 outputs vehicle position information indicating the current position of the vehicle 2 .
- the position information acquisition unit 32 includes, for example, a GPS (Global Positioning System) sensor.
- the illuminance sensor 33 detects and outputs the illuminance around the vehicle 2 .
- the stationary object information acquisition device 100 includes a control section 110 and a storage section 120 .
- the control unit 110 is configured by, for example, a processor such as a CPU.
- the control unit 110 can be configured, for example, as part of a lighting ECU that controls the operation of lighting such as headlights in the vehicle 2 .
- the control unit 110 may be configured as a part of the vehicle ECU 10, for example.
- the storage unit 120 is configured by, for example, a ROM, a RAM, or the like.
- the storage unit 120 may be configured as part of the storage unit 20 or a storage device provided for the lamp ECU.
- by reading the program 121 stored in the storage unit 120, the control unit 110 functions as an image acquisition unit 111, a specifying unit 112, a transmission/reception unit 113, and a determination unit 114. Some of these functions may be implemented by the vehicle ECU 10 or the lamp ECU. In such a configuration, the vehicle ECU 10 or the lamp ECU constitutes a part of the stationary object information acquisition device 100. Also, the program 121 may be recorded on a non-transitory computer-readable medium.
- the image acquisition unit 111 acquires the image data 122 of the image captured by the sensor unit 31.
- the acquired image data 122 is stored in the storage unit 120 .
- the image acquisition unit 111 acquires vehicle position information 124 (that is, imaging position information indicating the imaging position of the image) from the position information acquisition unit 32 when the image corresponding to the acquired image data 122 was captured.
- the vehicle position information 124 preferably includes information indicating the orientation of the vehicle 2 when the image was captured.
- the vehicle position information 124 may also include information indicating the position of the vehicle in the vehicle width direction. The position of the vehicle in the vehicle width direction can be calculated, for example, by detecting the driving lane and using that driving lane as a reference.
- the acquired vehicle position information 124 is stored in the storage unit 120 .
- the vehicle position information 124 is stored in the storage unit 120 in association with the corresponding image data 122, for example.
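As a rough sketch (not the disclosure's actual data layout), the association between the image data 122 and the vehicle position information 124 stored in the storage unit 120 might be modeled as follows; all names and field choices here are illustrative assumptions:

```python
import time
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CapturedFrame:
    """Hypothetical record pairing image data 122 with vehicle position info 124."""
    image_data: bytes                       # raw frame from the sensor unit 31
    latitude: float                         # vehicle position when the frame was captured
    longitude: float
    heading_deg: Optional[float] = None     # optional vehicle orientation at capture time
    captured_at: float = field(default_factory=time.time)  # optional time information

# Minimal in-memory stand-in for the storage unit 120:
storage = []
storage.append(CapturedFrame(image_data=b"...", latitude=35.68,
                             longitude=139.77, heading_deg=90.0))
```

Keeping position (and optionally orientation and time) on the same record is one simple way to guarantee that each frame can later be matched to its imaging position.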
- the image acquisition unit 111 may acquire time information indicating the time when the image was captured.
- the time information may include information indicating the date when the image was captured.
- the image acquisition unit 111 may also acquire lighting information regarding whether or not the headlights of the vehicle 2 were on when the image was captured.
- the time information and lighting information are stored in the storage unit 120 in association with the corresponding image data 122, for example.
- the image acquisition unit 111 can acquire, as reference image data, the image data 122 captured while the illuminance sensor 33 is outputting a signal indicating that the illuminance is equal to or higher than a predetermined value (for example, 1000 lux).
- the illuminance equal to or greater than the predetermined value is, for example, an illuminance at which the surroundings are determined to be in daytime. That is, the image acquisition unit 111 can store the image data 122 of an image captured during the day in the storage unit 120 as the reference image data.
- the image acquisition unit 111 may acquire, from the illuminance sensor 33, illuminance information indicating the illuminance around the vehicle 2 when the image was captured, and store the image data 122 and the illuminance information in the storage unit 120 in association with each other.
- in this case, the image data 122 whose associated illuminance information indicates an illuminance equal to or greater than the predetermined value can serve as the reference image data.
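The selection of reference image data described above can be sketched as a simple filter over stored frames; the pairing of image identifiers with illuminance readings is an illustrative assumption:

```python
# Illustrative filter: keep frames whose associated illuminance reading is at or
# above a daytime threshold (the disclosure gives 1000 lux as an example value).
DAYTIME_LUX = 1000.0

def select_reference_frames(frames):
    """frames: iterable of (image_id, illuminance_lux) pairs.
    Returns the ids of frames usable as reference image data."""
    return [image_id for image_id, lux in frames if lux >= DAYTIME_LUX]

frames = [("img_001", 25000.0), ("img_002", 40.0), ("img_003", 1000.0)]
refs = select_reference_frames(frames)  # img_002 was captured at night and is excluded
```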
- the identifying unit 112 identifies the stationary object information 123 based on the image data 122 .
- the stationary object information 123 specified by the specifying unit 112 is stored in the storage unit 120 .
- the "still object information” means an image in which a still object exists or still object image data corresponding to a part of the image, and a still object position indicating the position of the still object calculated based on the image data 122. is information including at least one of:
- the identifying unit 112, for example, detects a stationary object in an image by image analysis, and includes the image data 122 of the image in which the stationary object was detected in the stationary object information 123 as stationary object image data. The specifying unit 112 may also, for example, specify a region including the stationary object in that image as a stationary object region, and include the data corresponding to the stationary object region, which is a part of the image, in the stationary object information 123 as the stationary object image data. The identifying unit 112 may also calculate the position of the stationary object based on the image in which it was detected, and include stationary object position information indicating that position in the stationary object information 123.
- the stationary object position information may be, for example, information indicating the position of the stationary object in the image (for example, the coordinates and size of the stationary object in the image), or information indicating the distance and direction from the imaging position to the stationary object. Further, the identifying unit 112 may identify the type of the stationary object and include information indicating the type in the stationary object information 123.
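A minimal sketch of the shape of the stationary object information 123, under the assumption (stated in the definition above) that at least one of image data and position information must be present; field names are illustrative:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class StationaryObjectInfo:
    """Illustrative structure for stationary object information 123."""
    image_region: Optional[bytes] = None   # stationary object image data (whole or partial image)
    position_in_image: Optional[Tuple[int, int, int, int]] = None  # x, y, width, height
    distance_m: Optional[float] = None     # distance from the imaging position
    direction_deg: Optional[float] = None  # direction from the imaging position
    object_type: Optional[str] = None      # e.g. "street_light", "sign", "delineator"

    def is_valid(self) -> bool:
        # The definition requires at least one of image data and position information.
        return (self.image_region is not None
                or self.position_in_image is not None
                or self.distance_m is not None)
```

For example, an entry holding only a bounding box and a type would still be valid, while an entirely empty record would not.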
- the transmission/reception unit 113 transmits and receives information to and from the vehicle ECU 10 and the stationary object information storage device 200 . That is, the transmitting/receiving section 113 functions as a transmitting section and a receiving section.
- the transmitting/receiving unit 113 transmits, to the stationary object information storage device 200 having the storage unit 220, the stationary object information 123 and the vehicle position information 124 corresponding to the stationary object information 123 (that is, the vehicle position information at the time when the image corresponding to the image data 122 for which the stationary object information 123 was specified was captured).
- the transmitting/receiving unit 113 can transmit reference image data to the stationary object information storage device 200 . Further, the transmitting/receiving unit 113 can transmit/receive other information to/from the stationary object information storage device 200 as necessary.
- the determining unit 114 determines whether or not a stationary object exists at the position indicated by the stationary object position information calculated by the identifying unit 112, based on reference image data captured at the same position as the imaging position of the image corresponding to the image data 122 used to calculate the stationary object position.
- the reference image data used by the determination unit 114 is, for example, image data 122 of an image captured when the vehicle 2 passes the position indicated by the vehicle position information 124 corresponding to the image data 122 for which the stationary object information was specified by the specifying unit 112, while the illuminance sensor 33 is outputting a signal indicating that the illuminance is equal to or greater than the predetermined value; this image data is acquired by the image acquisition unit 111.
- the stationary object information storage device 200 includes a control section 210 and a storage section 220 .
- the stationary object information storage device 200 is a computer device that aggregates and accumulates information transmitted from a plurality of vehicles 2, and is installed in a data center, for example.
- the control unit 210 is configured by, for example, a processor such as a CPU.
- the storage unit 220 is configured by, for example, a ROM, a RAM, or the like.
- the control unit 210 functions as a transmission/reception unit 211, a recording unit 212, and a determination unit 213 by reading a program 221 stored in the storage unit 220.
- the program 221 may be recorded on a non-transitory computer-readable medium.
- the transmission/reception unit 211 transmits and receives information to and from the vehicle ECU 10 and the stationary object information acquisition device 100 .
- the transmitting/receiving section 211 receives the stationary object information 123 transmitted from the transmitting/receiving section 113 and the vehicle position information 124 corresponding to the stationary object information 123 . Further, the transmission/reception unit 211 can receive reference image data from the stationary object information acquisition device 100 . Further, the transmission/reception unit 211 can transmit/receive other information to/from the vehicle ECU 10 and the stationary object information acquisition device 100 as necessary.
- the recording unit 212 associates the stationary object information 123 received by the transmitting/receiving unit 211 with the vehicle position information 124 corresponding to the stationary object information 123 and records them in the stationary object database 222 .
- the recording unit 212 can update the stationary object database 222 based on the determination result of the determining unit 213 .
- Vehicle position information 124 and stationary object information 123 are associated and recorded in the stationary object database 222 .
- in the stationary object database 222, for example, a plurality of pieces of stationary object image data can be recorded for one imaging position indicated by the vehicle position information 124.
- stationary object image data and reference image data having the same imaging position can be associated and recorded.
- information such as the position and size of the stationary object, the distance and direction from the imaging position, and the type of the stationary object can be recorded in association with the imaging position.
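The recording scheme described above (multiple pieces of stationary object data keyed by one imaging position) can be sketched as follows; the dictionary-based layout and the rounded latitude/longitude key are illustrative assumptions, not the disclosure's schema:

```python
from collections import defaultdict

# Hypothetical in-memory stand-in for the stationary object database 222:
# several records can accumulate under one imaging position.
database = defaultdict(list)

def record(imaging_position, stationary_info):
    """Associate a stationary object record with its imaging position."""
    database[imaging_position].append(stationary_info)

pos = (35.6812, 139.7671)  # illustrative imaging-position key (lat, lon)
record(pos, {"image": b"...", "type": "street_light", "distance_m": 32.0})
record(pos, {"image": b"...", "type": "sign", "distance_m": 18.5})
```

Accumulating several observations per position is what later allows the determination unit 213 to cross-check reports from different vehicles or different passes.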
- the determining unit 213 determines whether or not a stationary object is included in the stationary object information 123, using an algorithm different from the algorithm used by the specifying unit 112 to specify the stationary object information 123. The algorithm used by the determination unit 213 preferably has higher stationary object detection accuracy than the algorithm used by the specifying unit 112.
- the determination unit 213, for example, uses the image corresponding to the stationary object image data included in the stationary object information 123 to determine whether or not the image contains a stationary object.
- the determination unit 213 may also use the image corresponding to the stationary object image data to specify detailed information such as the position and size of the stationary object in the image, the distance and direction from the imaging position of the image to the stationary object, and the type of the stationary object.
- the stationary object information storage device 200 may be mounted on the vehicle 2.
- control unit 210 and storage unit 220 may be provided separately from vehicle ECU 10 , control unit 110 , storage unit 20 , and storage unit 120 .
- the control unit 210 may be configured as a part of any one or more of the lamp ECU, the vehicle ECU 10, and the control unit 110, for example.
- part of the functions of the control unit 210 may be implemented by the vehicle ECU 10 or the lamp ECU.
- the storage unit 220 may be configured as a part of one or more of the storage unit 20, the storage unit 120, or a storage device provided for the lamp ECU, for example.
- when the stationary object information storage device 200 is mounted on the vehicle 2, the stationary object information acquisition device 100 and the stationary object information storage device 200 are configured to be connectable by wireless communication or wired communication.
- the stationary object information acquisition method is executed by the control unit 110 of the stationary object information acquisition apparatus 100 that has loaded the program 121, for example.
- the stationary object information acquisition device 100 may identify the stationary object information 123 using, for example, an image output by millimeter wave radar or LiDAR.
- FIG. 3 is a flow chart showing an example of the stationary object information acquisition method according to this embodiment. It should be noted that the order of the processes constituting each flowchart described in this specification may be changed as long as no contradiction or inconsistency arises in the processing content, and the processes may be executed in parallel.
- in step S10, the control unit 110 acquires image data and the like. Specifically, the control unit 110 acquires image data of an image captured by the visible camera. The control unit 110 also acquires the vehicle position information 124 corresponding to the image data.
- in step S10, the control unit 110 preferably acquires one or more of: time information indicating the time when the image was captured; lighting information regarding whether or not the headlights of the vehicle 2 were on when the image was captured; and illuminance information indicating the illuminance around the vehicle 2 when the image was captured. Acquiring these pieces of information makes it possible to appropriately compare images, which in turn improves the detection accuracy of stationary objects.
- the visible camera is controlled by the vehicle ECU, for example, so as to capture images of the exterior of the vehicle 2 at predetermined time intervals.
- the control unit 110 preferably acquires the image data 122 of the images captured at the predetermined time intervals by thinning them out, for example at time intervals longer than the capture interval (for example, 0.1 to 1 second) or at predetermined distance intervals between imaging positions (for example, 1 to 10 m). Thinning out the image data 122 suppresses an increase in the capacity used in the storage unit 120. In addition, since the number of targets of the specifying processing in step S30 described later is reduced, the burden on the control unit 110 can be reduced.
- alternatively, the control unit 110 may acquire all the image data 122 of the images captured at the predetermined time intervals, temporarily store them in the storage unit 120, and thin out the image data 122 at a predetermined timing, such as before the specifying processing in step S30.
- the image acquisition unit 111 may also thin out the image data 122 based on whether or not the image was captured in a place where the vehicle 2 usually travels. Specifically, the image acquisition unit 111 may thin out the image data 122 of images captured on roads where the number of times of travel in a past predetermined period is less than a predetermined number (for example, once or less in the past month). This is because identifying a stationary object in a place where the vehicle 2 does not normally travel is not very useful for the user of the vehicle 2. In particular, when the stationary object information storage device 200 is mounted on the vehicle 2, it is preferable to thin out the image data 122 based on the number of times the vehicle has traveled to the imaging position during a predetermined period.
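One way to implement the time- or distance-based thinning described above is to keep a frame only when it is sufficiently separated, in time or in travelled distance, from the last kept frame. This is a sketch, not the disclosure's algorithm; the 0.5 s and 5 m defaults simply fall inside the example ranges (0.1 to 1 second, 1 to 10 m):

```python
def thin_frames(frames, min_dt=0.5, min_dist_m=5.0):
    """frames: list of (t_seconds, x_m, y_m, image_id), in capture order.
    Keep a frame when enough time OR distance has passed since the last kept one."""
    kept = []
    last = None  # (t, x, y) of the most recently kept frame
    for t, x, y, image_id in frames:
        if last is None:
            kept.append(image_id)  # always keep the first frame
            last = (t, x, y)
            continue
        dt = t - last[0]
        dist = ((x - last[1]) ** 2 + (y - last[2]) ** 2) ** 0.5
        if dt >= min_dt or dist >= min_dist_m:
            kept.append(image_id)
            last = (t, x, y)
    return kept
```

With this predicate a stopped vehicle still yields one frame per `min_dt`, while a fast-moving vehicle is thinned by distance instead; either criterion alone could also be used.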
- in step S20, when the vehicle 2 is in the first state (Yes in step S20), the control unit 110 executes the processing of specifying the stationary object information 123 in step S30. On the other hand, if the vehicle 2 is not in the first state (No in step S20), the control unit 110 waits to execute the specifying processing of step S30 until the vehicle 2 enters the first state.
- the "first state” is a state in which the processing load on the vehicle ECU 10 or the lamp ECU is considered to be small.
- the “first state” includes, for example, a stopped state or a slow-moving state (for example, while traveling at a speed of 10 km/h or less).
- by executing the specifying processing of step S30 at a timing when the vehicle 2 is in the first state, the burden on the vehicle ECU 10 or the lamp ECU is reduced. Note that if the control unit 110 is configured independently of the vehicle ECU 10 and the lamp ECU, the determination in step S20 need not be performed.
- in step S30, the control unit 110 executes specifying processing for specifying the stationary object information 123 based on the image data 122. Details of the specifying processing will be described later with reference to FIG. 4 and the like.
- in step S40, when the vehicle 2 is in the second state (Yes in step S40), the control unit 110 transmits, in step S50, the stationary object information 123 and the vehicle position information 124 corresponding to the stationary object information 123 to the stationary object information storage device 200 having the storage unit 220, and the process ends. In step S50, time information, lighting information, illuminance information, and the like may also be transmitted together. On the other hand, if the vehicle 2 is not in the second state (No in step S40), the control unit 110 waits to execute the transmission processing of step S50 until the vehicle 2 enters the second state.
- the "second state” is a state in which the processing load on the vehicle ECU 10 or the lamp ECU is considered to be small.
- the “second state” includes, for example, a stopped state or a slow-moving state (for example, while traveling at a speed of 10 km/h or less).
- by executing the transmission processing of step S50 at a timing when the vehicle 2 is in the second state, the burden on the vehicle ECU 10 or the lamp ECU is reduced. It should be noted that if the control unit 110 is configured independently of the vehicle ECU 10 and the lamp ECU, the determination in step S40 need not be performed.
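The gating in steps S20 and S40 amounts to a low-load predicate on the vehicle state. A minimal sketch, assuming the stopped/slow-moving examples given above (names are illustrative):

```python
def in_low_load_state(speed_kmh: float, is_stopped: bool) -> bool:
    """Illustrative check for the 'first state'/'second state': the vehicle is
    stopped or slow-moving (the disclosure's example threshold is 10 km/h)."""
    return is_stopped or speed_kmh <= 10.0

# The acquisition flow defers heavy work until this predicate holds:
# run the specifying processing (S30) and the transmission (S50) only then.
```

In practice the same predicate can serve both steps, since the disclosure defines the first and second states identically by example.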
- the stationary object information 123 transmitted in step S50 may be the stationary object image data of the image identified as containing a stationary object, the stationary object position information calculated from that image, or both.
- the still object image data can be further examined in the still object information storage device 200 to obtain more accurate information.
- when the stationary object image data is not included in the transmitted stationary object information 123, there is an advantage in that the amount of data to be transmitted is small.
- FIG. 4 is a flowchart showing an example of processing for identifying the stationary object information 123.
- in step S31, the control unit 110 detects light spots in the image.
- a conventionally known technique can be used for the detection of the light spot, and for example, it can be performed by luminance analysis of the image.
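As a toy illustration of luminance-analysis light-spot detection (one of the conventionally known techniques mentioned above, not the disclosure's specific algorithm), bright pixels can be thresholded and grouped into connected components, each component counting as one light spot:

```python
def detect_light_spots(image, threshold=200):
    """Threshold a grayscale image and group bright pixels into 4-connected
    components; returns one (row, col) seed pixel per detected spot.
    `image` is a list of rows of 0-255 luminance values."""
    h, w = len(image), len(image[0])
    seen = set()
    spots = []
    for r in range(h):
        for c in range(w):
            if image[r][c] >= threshold and (r, c) not in seen:
                # flood-fill this bright component so it is counted once
                stack = [(r, c)]
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                spots.append((r, c))
    return spots
```

A real implementation would work on camera frames and tune the threshold to the exposure; the structure, however, is the same: binarize by luminance, then label connected regions.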
- step S32 the control unit 110 performs pattern recognition processing on the image.
- a conventionally known method can be used as a pattern recognition method.
- a machine learning model may be used to detect a stationary object, or a clustering method may be used to detect a stationary object.
- in step S33, the control unit 110 determines whether or not a stationary object exists in the image based on the results of the processing in steps S31 and/or S32. If it is determined that no stationary object exists in the image (No in step S33), the control unit 110 deletes the image data 122 corresponding to that image from the storage unit 120 in step S35, and the process ends.
- in step S34, the control unit 110 specifies the stationary object region or the stationary object position in the image.
- By specifying the stationary object region and using only the portion of the image that includes the stationary object region as the stationary object image data, the volume of data transmitted to the stationary object information storage device 200 can be reduced.
- The stationary object image data may also be processed to reduce the amount of data in areas other than the stationary object region.
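The data-volume reduction described above can be sketched as a simple crop of the full frame down to the stationary object region; the region format `(x, y, width, height)` and the function name are assumptions for illustration.

```python
import numpy as np

def crop_stationary_region(image: np.ndarray, region: tuple) -> np.ndarray:
    """Extract the portion of `image` covered by `region` = (x, y, width, height).

    Only this crop is kept as stationary object image data, so the payload
    sent to the storage device is far smaller than the full frame.
    """
    x, y, w, h = region
    return image[y:y + h, x:x + w].copy()

frame = np.arange(100 * 200, dtype=np.uint16).reshape(100, 200)  # mock full frame
crop = crop_stationary_region(frame, (50, 10, 40, 20))
print(frame.nbytes, crop.nbytes)  # the crop is a small fraction of the frame
```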
- The stationary object position is, for example, the position of a stationary object in the image.
- The stationary object position can be specified, for example, using an arbitrary coordinate system set in the image.
- The stationary object position may indicate, for example, the center point of the stationary object or the position of its outer edge.
- The stationary object position preferably also includes information about the size of the stationary object, specified using the same coordinate system.
- FIG. 5 is a schematic diagram for explaining stationary object position information.
- In the example of FIG. 5, a sign O1 and street lights O2 to O4 are identified as stationary objects.
- The stationary object position information can be defined by using coordinates along the x-axis and y-axis to specify the positions of areas Z1 to Z4, which contain the sign O1 and the street lights O2 to O4, respectively.
- The method of setting the coordinates is not particularly limited; for example, the center of the image may be set as the origin.
- In FIG. 5, the areas Z1 to Z4 do not include the post portions of the sign O1 and the street lights O2 to O4, but areas including those post portions may instead be set as the stationary object positions.
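One way to realize such a representation is sketched below: a pixel-space bounding box is converted into a centre-origin coordinate system (one of the options mentioned above), carrying size information alongside position. The class and function names are illustrative and not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class StationaryObjectPosition:
    # Bounding box in an image coordinate system whose origin is the image
    # centre; x grows to the right, y grows upward.
    cx: float      # centre of the object region
    cy: float
    width: float   # size information in the same coordinate system
    height: float

def from_pixel_box(left, top, w, h, img_w, img_h):
    """Convert a pixel-space box (origin at top-left) to centre-origin coordinates."""
    cx = (left + w / 2) - img_w / 2
    cy = img_h / 2 - (top + h / 2)
    return StationaryObjectPosition(cx, cy, w, h)

# A region like Z1 around the sign O1, expressed for a 640x480 image
z1 = from_pixel_box(left=100, top=40, w=60, h=30, img_w=640, img_h=480)
print(z1)
```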
- The stationary object position specified in step S34 may also indicate the distance and direction from the imaging position of the image to the stationary object.
- When the image data 122 includes depth information, the distance and direction from the imaging position to the stationary object may be calculated using that depth information. They may also be calculated by comparison with other image data 122 captured near the imaging position, or using data acquired from a millimeter-wave radar or LiDAR.
- Once the stationary object position has been specified, the image data 122 may be deleted from the storage unit 120, or may be included in the stationary object information 123 in association with the stationary object position information. After step S34, the process proceeds to step S40.
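As one possible way to derive distance and direction from depth information, the sketch below assumes a pinhole camera model with a known horizontal field of view; the parameter names and the model itself are illustrative assumptions, since the disclosure does not fix a particular geometry.

```python
import math

def distance_and_bearing(px: float, depth_m: float, img_w: int, hfov_deg: float):
    """Estimate range and horizontal bearing from the imaging position to an object.

    `depth_m` is the depth value at the object's pixel column `px`;
    `hfov_deg` is the camera's horizontal field of view. The bearing is
    measured from the optical axis, positive to the right.
    """
    # Focal length in pixels from the field of view (pinhole model)
    f = (img_w / 2) / math.tan(math.radians(hfov_deg / 2))
    bearing = math.degrees(math.atan((px - img_w / 2) / f))
    distance = depth_m / math.cos(math.radians(bearing))  # along the line of sight
    return distance, bearing

# Object at the image centre column, 25 m deep, 640-px-wide image, 90-degree FOV
d, b = distance_and_bearing(px=320, depth_m=25.0, img_w=640, hfov_deg=90)
print(d, b)
```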
- When the identification processing shown in FIG. 4 is executed on image data 122 captured in the daytime, when the illuminance is equal to or higher than a predetermined value, the outlines of structures in the image are easier to grasp and color information about the structures is easier to acquire, so the accuracy of stationary object detection by pattern recognition processing can be improved.
- The identification processing of the stationary object information 123 may also be performed by comparing a plurality of image data 122 captured at the same point or at points close to each other. Further, in step S34, the control unit 110 may identify the type of the stationary object based on the results of steps S31 and/or S32 and include the type information in the stationary object information 123.
- An example of a technique for identifying whether a stationary object is a self-luminous body, and an example of detecting a stationary object by comparing a plurality of image data 122, are described below with reference to FIG. 6.
- FIG. 6 is a schematic diagram showing image capturing timings and respective images 122A to 122D acquired at each capturing timing.
- The visible camera captures the area in front of the vehicle 2 at times T1, T2, T3, and T4, and outputs the image data 122 of images 122A to 122D.
- The intervals F1 to F4 between successive capture times are all equal; that is, images 122A to 122D are captured at regular time intervals.
- The headlights are on at times T1, T3, and T4, and off at time T2. Note that the vehicle 2 travels forward at a predetermined speed between times T1 and T4.
- Whether or not a stationary object appearing in an image corresponding to the image data 122 is a self-luminous body can be identified based on the image data 122 of at least two images captured before and after the timing at which the headlights mounted on the vehicle 2 are switched on or off.
- In the example of FIG. 6, light points LP1 and LP2 are detected in image 122A.
- In image 122B, light point LP1 is detected, but no light point is detected at the position where light point LP2 would be expected (the position indicated by the dotted line).
- In image 122C, light point LP2 is detected again.
- From these results, it can be identified that light point LP2 is detected only because it reflects the light of the headlamps, and is therefore not caused by a self-luminous body. Conversely, light point LP1, which is detected even in image 122B captured with the headlights off, can be identified as caused by a self-luminous body.
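The headlight on/off comparison just described can be sketched as follows: spots detected with the headlights on are matched against spots detected with the headlights off, and those that persist are classified as self-luminous. The matching tolerance and function name are illustrative assumptions.

```python
def classify_self_luminous(spots_on, spots_off, tol=2.0):
    """Split light points detected with the headlights on into two classes.

    A point that is still detected with the headlights off (within `tol`
    pixels) is treated as self-luminous; a point that disappears is treated
    as a mere reflection of the headlight beam.
    """
    def near(p, qs):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol for q in qs)

    self_luminous = [p for p in spots_on if near(p, spots_off)]
    reflective = [p for p in spots_on if not near(p, spots_off)]
    return self_luminous, reflective

# LP1 persists when the headlights are off (e.g. a lit street light);
# LP2 vanishes (e.g. headlight reflection off a sign).
lp_on = [(100, 50), (300, 80)]   # detections in image 122A (headlights on)
lp_off = [(100, 51)]             # detections in image 122B (headlights off)
print(classify_self_luminous(lp_on, lp_off))
```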
- A stationary object can also be detected by comparing a plurality of images, for example images captured at the same position or at positions close to each other.
- In the example of FIG. 6, image 122C' is an image captured at the same position as image 122C, at an earlier time.
- In image 122C, light points LP3 and LP4 are detected in addition to light points LP1 and LP2.
- In image 122C', light points LP1 and LP2 are detected, but light points LP3 and LP4 are not. If light points LP3 and LP4 originated from stationary objects, they would also be detected in image 122C'; in fact, they are not.
- It can therefore be identified that light points LP3 and LP4 are not stationary objects, while light points LP1 and LP2, detected at the same positions in both image 122C and image 122C', can be identified as originating from stationary objects.
- The light points detected in each image may include not only light points caused by stationary objects but also light points caused by moving objects such as other vehicles.
- Therefore, the position of each detected light point is compared across a plurality of images: if a light point's position, or its position relative to the other light points, does not change, it is determined to be caused by a stationary object, whereas a light point whose position changes significantly is determined to belong to a moving object.
- By regarding the unchanged light points as stationary objects, the stationary object information 123 can be identified.
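The position-change test above can be sketched as a simple matcher over two images taken at the same spot on the road; the pixel tolerance and names are illustrative assumptions.

```python
def split_by_position_change(spots_a, spots_b, tol=3.0):
    """Split light points into stationary / moving candidates.

    `spots_a` and `spots_b` are (x, y) positions detected in two images taken
    at the same (or nearly the same) place. A point found at an essentially
    unchanged position in both images is kept as a stationary object
    candidate; a point appearing in only one image, or clearly shifted, is
    treated as belonging to a moving object.
    """
    def match(p, qs):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol for q in qs)

    stationary = [p for p in spots_a if match(p, spots_b)]
    moving = [p for p in spots_a if not match(p, spots_b)]
    return stationary, moving

# Image 122C has LP1..LP4; the earlier image 122C' only has LP1 and LP2.
c = [(100, 50), (200, 60), (320, 90), (340, 95)]
c_prev = [(101, 50), (199, 61)]
print(split_by_position_change(c, c_prev))
```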
- Detection of a stationary object by comparing a plurality of image data 122 may also be performed by comparing the image data 122 along the time series in which the images were captured, based on the amount of movement of each light point between the images and the traveling speed of the vehicle 2. For example, in FIG. 6, the amount of movement of light points LP3 and LP4 between images 122C and 122D is greater than that of light points LP1 and LP2. If the amount of movement of light points LP3 and LP4 is greater than the amount estimated from the traveling speed of the vehicle 2, light points LP3 and LP4 are considered to be moving toward the vehicle 2, and can be identified as originating from moving objects. Further, when the amount of movement of light points LP1 and LP2 is equal to the amount estimated from the traveling speed of the vehicle 2, light points LP1 and LP2 can be identified as caused by stationary objects. If the amount of movement of light points LP1 and LP2 is smaller than the amount estimated from the traveling speed of the vehicle 2, light points LP1 and LP2 are considered to be moving in the same direction as the traveling direction of the vehicle 2, and are identified as originating from moving objects.
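The comparison of observed motion against the motion expected from the vehicle's own speed can be sketched as below. The pixel-per-meter conversion factor is a loud assumption: in reality the apparent shift of a stationary object depends on its distance from the camera, so this is only a schematic of the decision logic.

```python
def classify_by_motion(observed_shift_px: float, vehicle_speed_mps: float,
                       frame_dt_s: float, px_per_m: float,
                       tol_px: float = 5.0) -> str:
    """Classify a light point by its apparent shift between consecutive frames.

    `px_per_m * vehicle_speed_mps * frame_dt_s` is a crude estimate of the
    shift a stationary object would show purely due to the vehicle's own
    motion (px_per_m is an illustrative assumption).
    """
    expected = vehicle_speed_mps * frame_dt_s * px_per_m
    if abs(observed_shift_px - expected) <= tol_px:
        return "stationary"                 # moves as ego-motion predicts
    if observed_shift_px > expected:
        return "oncoming moving object"     # approaching the vehicle
    return "moving object ahead"            # moving with the vehicle

# Ego motion predicts ~20 px of shift (10 m/s, 0.1 s between frames, 20 px/m).
print(classify_by_motion(21, 10.0, 0.1, 20.0))
print(classify_by_motion(60, 10.0, 0.1, 20.0))
```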
- FIG. 7 is a flowchart showing an example of the determination processing of the stationary object information 123.
- In step S131, the control unit 110 acquires reference image data. Specifically, when the vehicle 2 again passes the imaging position of the image corresponding to the image data 122 for which the stationary object information 123 was specified, the control unit 110 acquires, as the reference image data, the image data 122 of an image captured while the illuminance sensor 33 is outputting a signal indicating that the illuminance is equal to or higher than a predetermined value.
- In step S132, the control unit 110 identifies the positions of stationary objects in the reference image indicated by the reference image data.
- The identification in step S132 can be performed, for example, by processes similar to steps S31 to S34.
- In step S133, the control unit 110 determines whether or not the stationary object positions in the reference image match those in the target image, i.e., the image for which the stationary object information 123 has already been specified. If they match (Yes in step S133), the control unit 110 determines that the specified stationary object information 123 is correct and terminates the process.
- If they do not match (No in step S133), the control unit 110 updates the stationary object information 123 in step S134 and then terminates the process.
- In step S134, for example, the positions of stationary objects that match between the reference image and the target image are determined to be correct, the positions of stationary objects that do not match are determined to be erroneous, and the stationary object information 123 is updated accordingly.
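The update rule of step S134 can be sketched as keeping only those target-image positions that are confirmed by the reference image; the tolerance and function name are illustrative assumptions.

```python
def update_stationary_info(target_positions, reference_positions, tol=3.0):
    """Keep only the target-image positions confirmed by the reference image.

    Positions that match between the two images are kept as correct;
    positions present only in the target image (e.g. lamps of other vehicles
    at night) are dropped as erroneous detections.
    """
    def match(p, qs):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol for q in qs)

    return [p for p in target_positions if match(p, reference_positions)]

# Target (night) image: sign O1, street light O2, plus rear lamps of a vehicle.
target = [(50, 40), (120, 30), (200, 150), (220, 150)]
# Reference (daytime) image: only the sign and the street light.
reference = [(51, 40), (119, 31)]
print(update_stationary_info(target, reference))
```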
- FIG. 8 is a schematic diagram showing an example of the reference image 122E used for the determination process shown in FIG.
- FIG. 9 is a schematic diagram showing an example of the target image 122F used for the determination process shown in FIG.
- the reference image 122E is an image captured during the daytime
- the target image 122F is an image captured during the nighttime.
- In the reference image 122E, the sign O1 and the street lights O2 to O4 are identified as stationary objects by the process of step S132.
- In addition, the other vehicle C1, which is the preceding vehicle, has stopped with its hazard lamps flashing, and its rear lamps BL1 and BL2 are lit.
- As a result, it is assumed that the rear lamps BL1 and BL2 are also erroneously detected as stationary objects by the process of step S132.
- Meanwhile, the oncoming vehicle C2 has its headlights HL1 and HL2 turned off because it is daytime, so the headlights HL1 and HL2 are not identified as stationary objects.
- In the target image 122F, the process of step S30 identifies the sign O1 and the street lights O2 to O4 as stationary objects, and their surroundings as stationary object regions Z1 to Z4. Further, since it is nighttime, the other vehicle C1, the preceding vehicle, has its tail lamps on, and its rear lamps BL3 and BL4 are lit. As a result, it is assumed that the rear lamps BL3 and BL4 are also erroneously detected as stationary objects by the process of step S30. Similarly, another vehicle C4, an oncoming vehicle, has its headlights HL3 and HL4 turned on, and these headlights are likewise erroneously detected as stationary objects. Although not shown, the surroundings of the rear lamps BL3 and BL4 and the headlights HL3 and HL4 are also identified as stationary object regions.
- In the determination of step S133, the sign O1 and the street lights O2 to O4 are present at the same positions in both the reference image 122E and the target image 122F, and are therefore determined to be stationary objects.
- On the other hand, the rear lamps BL1 to BL4 and the headlights HL3 and HL4 are present in only one of the two images, and are therefore determined not to be stationary objects.
- In step S134, the stationary object information 123 is therefore updated to reflect that the rear lamps BL3 and BL4 and the headlights HL3 and HL4 are not stationary objects.
- the present invention is not limited to the above-described embodiments, and can be modified, improved, etc. as appropriate.
- the material, shape, size, numerical value, form, number, location, etc. of each component in the above-described embodiment are arbitrary and not limited as long as the present invention can be achieved.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023534833A JPWO2023286806A1 | 2021-07-16 | 2022-07-13 | |
CN202280050074.9A CN117716405A (zh) | 2021-07-16 | 2022-07-13 | 静止物信息获取装置、程序以及静止物信息获取方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-117823 | 2021-07-16 | ||
JP2021117823 | 2021-07-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023286806A1 true WO2023286806A1 (ja) | 2023-01-19 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009129290A (ja) * | 2007-11-27 | 2009-06-11 | Aisin Aw Co Ltd | 信号機検出装置、信号機検出方法及びプログラム |
JP2013045176A (ja) * | 2011-08-22 | 2013-03-04 | Pioneer Electronic Corp | 信号機認識装置、候補点パターン送信装置、候補点パターン受信装置、信号機認識方法、及び候補点パターン受信方法 |
JP2015108604A (ja) * | 2013-12-06 | 2015-06-11 | 日立オートモティブシステムズ株式会社 | 車両位置推定システム,装置,方法、及び、カメラ装置 |
JP2017102918A (ja) * | 2015-11-20 | 2017-06-08 | アサヒリサーチ株式会社 | ドライブレコーダ |
WO2020045317A1 (ja) * | 2018-08-31 | 2020-03-05 | 株式会社デンソー | 地図システム、方法および記憶媒体 |