WO2023286811A1 - Stationary object information collection system, program, and stationary object information storage method

Stationary object information collection system, program, and stationary object information storage method

Info

Publication number
WO2023286811A1
WO2023286811A1 (PCT/JP2022/027594)
Authority
WO
WIPO (PCT)
Prior art keywords
stationary object
object information
image data
image
vehicle
Prior art date
Application number
PCT/JP2022/027594
Other languages
English (en)
Japanese (ja)
Inventor
美紗子 神谷
拓弥 片岡
Original Assignee
株式会社小糸製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Priority to JP2023534838A (JPWO2023286811A1)
Priority to CN202280050076.8A (CN117677996A)
Publication of WO2023286811A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/01: Detecting movement of traffic to be counted or controlled
    • G08G1/09: Arrangements for giving variable traffic instructions

Definitions

  • The present disclosure relates to a stationary object information storage system, a program, and a stationary object information storage method.
  • Patent Literature 1 describes detecting a forward vehicle and controlling forward light distribution.
  • ADB (Adaptive Driving Beam) light distribution control is based on target information sent from the vehicle.
  • Each target is detected by a specific algorithm based on data acquired by sensors such as cameras. With such detection, more targets may be detected than actually exist (excessive detection), or a target may be detected even though it does not exist (false detection).
  • On roads there are stationary objects with high brightness, such as street lights and signs.
  • These stationary objects may be mistakenly recognized as a vehicle ahead; conversely, the headlamps of a vehicle ahead may be erroneously recognized as street lights.
  • If information on stationary objects such as street lights and signs on the road can be collected, it can be used to reduce the possibility of such misrecognition, which is useful.
  • A purpose of the present disclosure is therefore to collect and accumulate stationary object information on stationary objects such as street lights and signs on the road.
  • A stationary object information storage system according to one aspect of the present disclosure comprises a stationary object information acquisition device mounted on a vehicle and a stationary object information storage device arranged outside the vehicle.
  • The stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle; a specifying unit that specifies, based on the image data, stationary object information including at least one of stationary object image data, corresponding to an image or a portion of an image in which one or more stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail are present, and stationary object position information, calculated based on the image data, indicating the position of the stationary object; and a transmitting unit that transmits to the stationary object information storage device, by wireless communication, the stationary object information together with the vehicle position information of the vehicle, obtained from a position information acquisition unit mounted on the vehicle, of when the image corresponding to the image data for which the stationary object information was specified was captured.
  • The stationary object information storage device includes a receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit, and a recording unit that associates the stationary object information with the vehicle position information and records them in a stationary object database.
  • A program according to one aspect of the present disclosure is executed by a computer device that includes a processor and is communicatively connectable to a stationary object information acquisition device mounted on a vehicle.
  • The stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle; a specifying unit that specifies, based on the image data, stationary object information including at least one of stationary object image data, corresponding to an image or a portion of an image in which one or more stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail are present, and stationary object position information, calculated based on the image data, indicating the position of the stationary object; and a transmitting unit that transmits, by wireless communication, the stationary object information together with the vehicle position information of the vehicle, obtained from a position information acquisition unit mounted on the vehicle, of when the image corresponding to the image data for which the stationary object information was specified was captured.
  • The program causes the processor to execute a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit, and a recording step of associating the stationary object information with the vehicle position information and recording them in a stationary object database.
  • A stationary object information storage method according to one aspect of the present disclosure is executed by a computer device that includes a processor and is communicatively connectable to a stationary object information acquisition device mounted on a vehicle.
  • The stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle; a specifying unit that specifies, based on the image data, stationary object information including at least one of stationary object image data, corresponding to an image or a portion of an image in which one or more stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail are present, and stationary object position information, calculated based on the image data, indicating the position of the stationary object; and a transmitting unit that transmits, by wireless communication, the stationary object information together with the vehicle position information of the vehicle, obtained from a position information acquisition unit mounted on the vehicle, of when the image corresponding to the image data for which the stationary object information was specified was captured. The stationary object information storage method includes a receiving step of receiving the stationary object information and the vehicle position information, and a recording step of associating them with each other and recording them in a stationary object database.
  • A stationary object information storage system according to another aspect of the present disclosure comprises a stationary object information acquisition device mounted on a vehicle and a stationary object information storage device communicatively connectable to the stationary object information acquisition device.
  • The stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle; a specifying unit that specifies, based on the image data, stationary object information including stationary object image data corresponding to an image, or a portion of an image, in which one or more kinds of stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail are present; and a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle.
  • The stationary object information storage device includes: a receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit; a recording unit that associates the stationary object information with the vehicle position information of when the image corresponding to the image data for which the stationary object information was specified was captured, and records them in a stationary object database; and a determination unit that determines, using an algorithm different from the algorithm with which the specifying unit specified the stationary object information, whether or not the image corresponding to the stationary object image data includes the stationary object.
  • A program according to another aspect of the present disclosure is executed by a computer device that includes a processor and is communicatively connectable to a stationary object information acquisition device mounted on a vehicle.
  • The stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle; a specifying unit that specifies, based on the image data, stationary object information including stationary object image data corresponding to an image, or a portion of an image, in which one or more kinds of stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail are present; and a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle.
  • The program causes the processor to execute: a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit; a recording step of associating the stationary object information with the vehicle position information of when the image corresponding to the image data for which the stationary object information was specified was captured, and recording them in a stationary object database; and a determining step of determining, using an algorithm different from the algorithm for specifying the stationary object information, whether or not the image corresponding to the stationary object image data includes the stationary object.
  • A stationary object information storage method according to another aspect of the present disclosure is executed by a computer device that includes a processor and is communicatively connectable to a stationary object information acquisition device mounted on a vehicle.
  • The stationary object information acquisition device includes: an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle; a specifying unit that specifies, based on the image data, stationary object information including stationary object image data corresponding to an image, or a portion of an image, in which one or more kinds of stationary objects selected from a self-luminous body, a sign, a delineator, and a guardrail are present; and a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle.
  • The stationary object information storage method comprises: a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit; a recording step of associating the stationary object information with the vehicle position information of when the image corresponding to the image data for which the stationary object information was specified was captured, and recording them in a stationary object database; and a determining step of determining, using an algorithm different from the algorithm with which the specifying unit specified the stationary object information, whether or not the stationary object is included in the image corresponding to the stationary object image data.
  • FIG. 1 is a schematic diagram showing an example of a stationary object information storage system according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing an example of a stationary object information storage system according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart illustrating an example of a stationary object information storage method according to an embodiment of the present disclosure.
  • FIG. 4 is a flowchart showing an example of the process of specifying stationary object information shown in FIG. 3.
  • FIG. 5 is a schematic diagram for explaining stationary object position information.
  • FIG. 6 is an example of a stationary object information database.
  • FIG. 7 is an example of a stationary object information database.
  • FIG. 8 is a schematic diagram showing imaging timings and each image acquired at each imaging timing.
  • FIG. 9 is a flowchart illustrating an example of processing for determining stationary object information.
  • FIG. 10 is a schematic diagram showing an example of a reference image used in the determination processing of stationary object information.
  • FIG. 11 is a schematic diagram showing an example of a target image used in the determination processing of stationary object information.
  • FIG. 12 is a flowchart illustrating another example of determination processing of stationary object information.
  • FIG. 1 is a schematic diagram illustrating a system 1 according to one embodiment of the present disclosure.
  • the system 1 includes a stationary object information storage device 200 and a plurality of vehicles 2 such as vehicles 2A and 2B on which the stationary object information acquisition devices 100 are respectively mounted.
  • the stationary object information storage device 200 and each vehicle 2 can be communicatively connected to each other by wireless communication.
  • the system 1 is an example of a stationary object information storage system according to the present disclosure.
  • The stationary object information acquisition device 100 acquires stationary object information about stationary objects and transmits the stationary object information to the stationary object information storage device 200.
  • The stationary object information storage device 200, for example, accumulates the stationary object information received from each stationary object information acquisition device 100.
  • The stationary object information storage device 200, for example, analyzes the received stationary object information to improve its accuracy, to acquire more detailed information, or to create a light distribution pattern based on the stationary object information.
  • The stationary object information storage device 200 also transmits the accuracy-improved stationary object information to each vehicle 2, for example in response to a request from each vehicle 2.
  • Each vehicle 2, by using the accuracy-improved stationary object information received from the stationary object information storage device 200, can, for example, improve the accuracy and efficiency of target detection and appropriately perform light distribution control of its headlights.
  • the "stationary object” in the present embodiment refers to an object that is fixed to the road and has a high brightness, and specifically includes a self-luminous body (for example, a street light, a traffic signal, etc.), a sign, a delineator, and guardrails. That is, the stationary object information acquiring apparatus 100 in this embodiment acquires stationary object information about various stationary objects given as the above specific examples. Note that, as another embodiment, the stationary object information acquisition apparatus 100 is an object that is not included in the above specific examples, but is fixed to the road, has high brightness, and can affect target detection. It may be configured to be identifiable as an object.
  • FIG. 2 is a block diagram showing system 1 according to one embodiment of the present disclosure.
  • the vehicle 2 includes a vehicle ECU (Electronic Control Unit) 10, a storage section 20, a sensor section 31, a position information acquisition section 32, an illuminance sensor 33, and a stationary object information acquisition device 100.
  • The vehicle 2 can communicate with the stationary object information storage device 200 by wireless communication via the communication network 3.
  • The means of wireless communication is not particularly limited; for example, mobile communication systems such as automotive telematics, cooperation with a smartphone, or in-vehicle Wi-Fi may be used.
  • The vehicle ECU 10 controls various operations of the vehicle 2, such as its running.
  • the vehicle ECU 10 includes, for example, a processor such as an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a general-purpose CPU (Central Processing Unit).
  • the storage unit 20 includes, for example, a ROM (Read Only Memory) storing various vehicle control programs and a RAM (Random Access Memory) temporarily storing various vehicle control data.
  • the processor of the vehicle ECU 10 develops data designated by various vehicle control programs stored in the ROM onto the RAM, and controls various operations of the vehicle 2 in cooperation with the RAM.
  • the sensor unit 31 outputs image data of an image of the exterior of the vehicle 2.
  • the sensor unit 31 includes, for example, one or more sensors of a visible camera, LiDAR, and millimeter wave radar. Image data output by LiDAR and millimeter wave radar can be three-dimensional image data.
  • The position information acquisition unit 32 outputs vehicle position information indicating the current position of the vehicle 2.
  • The position information acquisition unit 32 includes, for example, a GPS (Global Positioning System) sensor.
  • The illuminance sensor 33 detects and outputs the illuminance around the vehicle 2.
  • The stationary object information acquisition device 100 includes a control unit 110 and a storage unit 120.
  • the control unit 110 is configured by, for example, a processor such as a CPU.
  • The control unit 110 can be configured, for example, as part of a lamp ECU that controls the operation of lamps such as the headlights of the vehicle 2.
  • the control unit 110 may be configured as a part of the vehicle ECU 10, for example.
  • the storage unit 120 is configured by, for example, a ROM, a RAM, or the like.
  • the storage unit 120 may be configured as part of the storage unit 20 or a storage device provided for the lamp ECU.
  • By reading the program 121 stored in the storage unit 120, the control unit 110 functions as an image acquisition unit 111, a specifying unit 112, a transmitting/receiving unit 113, and a determination unit 114. Some of these functions may be implemented by the vehicle ECU 10 or the lamp ECU; in such a configuration, the vehicle ECU 10 or the lamp ECU constitutes a part of the stationary object information acquisition device 100. The program 121 may be recorded on a non-transitory computer-readable medium.
  • the image acquisition unit 111 acquires the image data 122 of the image captured by the sensor unit 31.
  • The acquired image data 122 is stored in the storage unit 120.
  • The image acquisition unit 111 acquires, from the position information acquisition unit 32, the vehicle position information 124 of when the image corresponding to the acquired image data 122 was captured (that is, imaging position information indicating the imaging position of the image).
  • the vehicle position information 124 preferably includes information indicating the orientation of the vehicle 2 when the image was captured.
  • the vehicle position information 124 may also include information indicating the position of the vehicle in the vehicle width direction. The position of the vehicle in the vehicle width direction can be calculated, for example, by detecting the driving lane and using that driving lane as a reference.
  • The acquired vehicle position information 124 is stored in the storage unit 120.
  • the vehicle position information 124 is stored in the storage unit 120 in association with the corresponding image data 122, for example.
  • the image acquisition unit 111 may acquire time information indicating the time when the image was captured.
  • the time information may include information indicating the date when the image was captured.
  • the image acquisition unit 111 may also acquire lighting information regarding whether or not the headlights of the vehicle 2 were on when the image was captured.
  • the time information and lighting information are stored in the storage unit 120 in association with the corresponding image data 122, for example.
  • The image acquisition unit 111 can acquire, as reference image data, the image data 122 captured while the illuminance sensor 33 is outputting a signal indicating that the illuminance is equal to or higher than a predetermined value (for example, 1000 lux). A sketch of this gating appears below.
  • An illuminance equal to or greater than the predetermined value is, for example, an illuminance at which it is determined to be daytime. That is, the image acquisition unit 111 can store the image data 122 of an image captured during the day in the storage unit 120 as reference image data.
  • The image acquisition unit 111 may acquire, from the illuminance sensor 33, illuminance information indicating the illuminance around the vehicle 2 when the image was captured, and store the image data 122 and the illuminance information in the storage unit 120 in association with each other.
  • the image data 122 whose illuminance indicated by the associated illuminance information is equal to or greater than a predetermined value can be the reference image data.
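  • The following is a minimal sketch, not part of the disclosure, of how such illuminance-gated acquisition of reference image data could look; the names (CapturedFrame, ILLUMINANCE_THRESHOLD_LUX, ingest) and the Python structure are assumptions made only for illustration.

```python
from dataclasses import dataclass

# The "predetermined value" given as an example in the text (1000 lux).
ILLUMINANCE_THRESHOLD_LUX = 1000.0

@dataclass
class CapturedFrame:
    image_data: bytes        # image data 122
    position: tuple          # vehicle position information 124 (lat, lon)
    illuminance_lux: float   # output of the illuminance sensor 33
    is_reference: bool = False

def ingest(frame: CapturedFrame, storage: list) -> None:
    """Mark frames captured in daytime illuminance as reference image data."""
    frame.is_reference = frame.illuminance_lux >= ILLUMINANCE_THRESHOLD_LUX
    storage.append(frame)

frames: list = []
ingest(CapturedFrame(b"...", (35.68, 139.69), 12000.0), frames)  # daytime -> reference
ingest(CapturedFrame(b"...", (35.68, 139.69), 3.0), frames)      # night -> not reference
```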
  • The specifying unit 112 specifies the stationary object information 123 based on the image data 122.
  • The stationary object information 123 specified by the specifying unit 112 is stored in the storage unit 120.
  • The "stationary object information" means information including at least one of: stationary object image data corresponding to an image, or a part of an image, in which a stationary object exists; and stationary object position information indicating the position of the stationary object calculated based on the image data 122.
  • The specifying unit 112 detects a stationary object in an image by image analysis, and includes the image data 122 of the image in which the stationary object was detected in the stationary object information 123 as stationary object image data. The specifying unit 112 may also, for example, specify a region including the stationary object in the image as a stationary object region, and include the data corresponding to that region, which is a part of the image, in the stationary object information 123 as the stationary object image data. The specifying unit 112 may further calculate the position of the stationary object based on the image in which the stationary object was detected, and include stationary object position information indicating that position in the stationary object information 123.
  • The stationary object position information may be, for example, information indicating the position of the stationary object in the image (for example, the coordinates and size of the stationary object in the image), or information indicating the distance and direction from the imaging position to the stationary object. The specifying unit 112 may also identify the type of the stationary object and include information indicating the type in the stationary object information 123. A sketch of this structure follows.
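  • As an illustration only, the "at least one of" structure of the stationary object information 123 could be modeled as below; BoundingBox and StationaryObjectInfo are hypothetical names, and the patent prescribes no particular representation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BoundingBox:
    x: float      # coordinates of the stationary object region in the image
    y: float
    width: float  # size of the region
    height: float

@dataclass
class StationaryObjectInfo:
    # At least one of object_image_data and object_position must be present.
    object_image_data: Optional[bytes] = None      # image, or part of it, with the object
    object_position: Optional[BoundingBox] = None  # position calculated from image data 122
    object_type: Optional[str] = None              # e.g. "street light", "sign", "delineator"

    def __post_init__(self) -> None:
        if self.object_image_data is None and self.object_position is None:
            raise ValueError("need stationary object image data or position information")
```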
  • The transmitting/receiving unit 113 transmits and receives information to and from the vehicle ECU 10 and the stationary object information storage device 200; that is, it functions as both a transmitting unit and a receiving unit.
  • The transmitting/receiving unit 113 transmits the stationary object information 123, together with the vehicle position information 124 of when the image corresponding to the image data 122 for which the stationary object information 123 was specified was captured, to the stationary object information storage device 200 having the storage unit 220.
  • The transmitting/receiving unit 113 can also transmit reference image data to the stationary object information storage device 200, and can transmit and receive other information to and from the stationary object information storage device 200 as necessary.
  • The determination unit 114 determines whether or not a stationary object exists at the position indicated by the stationary object position information calculated by the specifying unit 112, based on reference image data captured at the same position as the imaging position of the image corresponding to the image data 122 used to calculate that stationary object position.
  • The reference image data used by the determination unit 114 is, for example, image data 122 of an image captured when the vehicle 2 passes the position indicated by the vehicle position information 124 corresponding to the image data 122 for which the stationary object information was specified, while the illuminance sensor 33 is outputting a signal indicating that the illuminance is equal to or greater than the predetermined value; it is acquired by the image acquisition unit 111.
  • The stationary object information storage device 200 includes a control unit 210 and a storage unit 220.
  • the stationary object information storage device 200 is a computer device that aggregates and accumulates information transmitted from a plurality of vehicles 2, and is installed in a data center, for example.
  • the control unit 210 is configured by, for example, a processor such as a CPU.
  • the storage unit 220 is configured by, for example, a ROM, a RAM, or the like.
  • the control unit 210 functions as a transmission/reception unit 211, a recording unit 212, and a determination unit 213 by reading a program 221 stored in the storage unit 220.
  • The program 221 may be recorded on a non-transitory computer-readable medium.
  • The transmitting/receiving unit 211 transmits and receives information to and from the vehicle ECU 10 and the stationary object information acquisition device 100; that is, it functions as both a transmitting unit and a receiving unit. The transmitting/receiving unit 211 receives the stationary object information 123 transmitted from the transmitting/receiving unit 113 and the vehicle position information 124 corresponding to the stationary object information 123. It can also receive reference image data from the stationary object information acquisition device 100, and can transmit and receive other information to and from the vehicle ECU 10 and the stationary object information acquisition device 100 as necessary.
  • The recording unit 212 associates the stationary object information 123 received by the transmitting/receiving unit 211 with the vehicle position information 124 corresponding to that stationary object information 123, and records them in the stationary object database 222.
  • The recording unit 212 can update the stationary object database 222 based on the determination result of the determination unit 213.
  • The vehicle position information 124 and the stationary object information 123 are recorded in the stationary object database 222 in association with each other.
  • In the stationary object database 222, for example, a plurality of pieces of stationary object image data can be recorded for one imaging position indicated by the vehicle position information 124.
  • Stationary object image data and reference image data having the same imaging position can be associated and recorded.
  • Information such as the position and size of the stationary object, the distance and direction from the imaging position, and the type of the stationary object can also be recorded in association with the imaging position. A sketch of such a database follows.
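  • A minimal relational sketch of such a one-position-to-many-records database is shown below, assuming SQLite and invented table and column names; the actual schema is not specified in the disclosure.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE imaging_position (          -- one row per imaging position (info 124)
    id INTEGER PRIMARY KEY,
    latitude REAL, longitude REAL, heading REAL
);
CREATE TABLE stationary_object (         -- many rows may share one position
    id INTEGER PRIMARY KEY,
    position_id INTEGER REFERENCES imaging_position(id),
    object_image BLOB,                   -- stationary object image data (nullable)
    reference_image BLOB,                -- associated reference image data (nullable)
    bbox_x REAL, bbox_y REAL, bbox_w REAL, bbox_h REAL,  -- position/size in image
    distance_m REAL, bearing_deg REAL,   -- distance/direction from imaging position
    object_type TEXT                     -- e.g. 'street light', 'sign'
);
""")
```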
  • The determination unit 213 determines whether or not a stationary object is included in the stationary object information 123 using an algorithm different from the algorithm used by the specifying unit 112 to specify the stationary object information 123. The algorithm used by the determination unit 213 preferably has, for example, higher stationary object detection accuracy than the algorithm used by the specifying unit 112.
  • The determination unit 213 functions, for example, as a first determination unit that uses an image corresponding to the stationary object image data included in the stationary object information 123 to determine whether or not that image contains a stationary object. It also functions, for example, as a second determination unit that determines, based on the reference image data associated with the stationary object image data for which the specifying unit 112 identified a stationary object, whether or not the image corresponding to that stationary object image data includes the stationary object. It further functions as a third determination unit that determines whether or not the stationary object is a self-luminous body based on two or more images captured before and after the switching timing of turning the headlights of the vehicle 2 on and off.
  • The determination unit 213 may use an image corresponding to the stationary object image data to specify detailed information such as the position and size of the stationary object in the image, the distance and direction from the imaging position of the image to the stationary object, and the type of the stationary object.
  • The determination unit 213 may identify, among the plurality of pieces of stationary object image data recorded for one imaging position, the first stationary object image data having the fewest light spots, and determine whether or not there is a stationary object at that position based on the first stationary object image data.
  • The determination unit 213 may determine, for each of the images corresponding to the plurality of pieces of stationary object image data recorded for one imaging position, whether or not a stationary object is included, and make a final determination as to whether or not there is a stationary object at that position based on the ratio of the stationary object image data determined to contain a stationary object to the entirety of the plurality of pieces of stationary object image data recorded for that position. The determination unit 213 may also determine whether or not a stationary object is included by comparing the plurality of images corresponding to the plurality of pieces of stationary object image data in chronological order, based on the time information associated with the stationary object image data.
  • the stationary object information storage device 200 may be mounted on the vehicle 2.
  • control unit 210 and storage unit 220 may be provided separately from vehicle ECU 10 , control unit 110 , storage unit 20 , and storage unit 120 .
  • the control unit 210 may be configured as a part of any one or more of the lamp ECU, the vehicle ECU 10, and the control unit 110, for example.
  • part of the functions of the control unit 210 may be implemented by the vehicle ECU 10 or the lamp ECU.
  • the storage unit 220 may be configured as a part of one or more of the storage unit 20, the storage unit 120, or a storage device provided for the lamp ECU, for example.
  • When the stationary object information storage device 200 is mounted on the vehicle 2, the stationary object information acquisition device 100 and the stationary object information storage device 200 are configured to be connectable by wireless or wired communication.
  • The stationary object information storage method is executed, for example, by the control unit 110 of the stationary object information acquisition device 100 loaded with the program 121 and the control unit 210 of the stationary object information storage device 200 loaded with the program 221.
  • In the following, a case where the stationary object information acquisition device 100 identifies the stationary object information 123 using an image captured by a visible camera is described as an example, but the present disclosure is not limited to this.
  • the stationary object information acquisition device 100 may identify the stationary object information 123 using, for example, an image output by millimeter wave radar or LiDAR.
  • FIG. 3 is a flowchart showing an example of the stationary object information storage method according to this embodiment. The order of the processes constituting each flowchart described in this specification may be changed as long as no contradiction or inconsistency arises in the processing content, and the processes may be executed in parallel.
  • In step S10, the control unit 110 acquires image data and related information. Specifically, the control unit 110 acquires image data of an image captured by the visible camera, and also acquires the vehicle position information 124 corresponding to the image data.
  • In step S10, it is preferable that the control unit 110 also acquires one or more of: time information indicating the time when the image was captured; lighting information regarding whether or not the headlights of the vehicle 2 were on when the image was captured; and illuminance information indicating the illuminance around the vehicle 2 when the image was captured. Acquiring these pieces of information makes it possible to compare images appropriately, and as a result improves the detection accuracy of stationary objects.
  • the visible camera is controlled by the vehicle ECU, for example, so as to capture images of the exterior of the vehicle 2 at predetermined time intervals.
  • The control unit 110 preferably acquires the image data 122 of the images captured at the predetermined time intervals by thinning them out, for example at time intervals longer than the imaging interval (for example, 0.1 to 1 second) or at predetermined distance intervals between imaging positions (for example, 1 to 10 m). Thinning out the image data 122 suppresses an increase in the capacity used in the storage unit 120, and, because it reduces the number of targets of the specifying process in step S30 described later, also reduces the burden on the control unit 110. A sketch of such thinning follows this list.
  • The control unit 110 may instead acquire all the image data 122 of the images captured at the predetermined time intervals, temporarily store them in the storage unit 120, and thin out the image data 122 at a predetermined timing, such as before the specifying process in step S30.
  • The image acquisition unit 111 may also thin out the image data 122 based on whether or not the image was captured in a place where the vehicle 2 usually travels. Specifically, the image acquisition unit 111 may thin out the image data 122 of images captured on roads for which the number of times of travel in a past predetermined period is less than a predetermined specified number (for example, once or less in the past month). This is because identifying a stationary object in a place where the vehicle 2 does not normally travel is not very useful to the user of the vehicle 2. In particular, when the stationary object information storage device 200 is mounted on the vehicle 2, it is preferable to thin out the image data 122 based on the number of times the vehicle has traveled to the imaging position during the predetermined period.
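  • The sketch below illustrates the time/distance thinning under assumed thresholds (chosen within the example ranges of 0.1 to 1 second and 1 to 10 m); the frame format and helper names are invented for illustration.

```python
import math

MIN_INTERVAL_S = 0.5   # within the 0.1-1 s example range
MIN_DISTANCE_M = 5.0   # within the 1-10 m example range

def ground_distance_m(p: tuple, q: tuple) -> float:
    """Approximate distance in metres between two (lat, lon) points."""
    mid_lat = math.radians((p[0] + q[0]) / 2.0)
    dx = math.radians(q[1] - p[1]) * 6_371_000.0 * math.cos(mid_lat)
    dy = math.radians(q[0] - p[0]) * 6_371_000.0
    return math.hypot(dx, dy)

def thin(frames):
    """frames: iterable of (timestamp_s, (lat, lon), image_data) tuples.

    Keep a frame only if enough time has passed or the vehicle has moved
    far enough since the last kept frame.
    """
    kept, last_t, last_p = [], None, None
    for t, pos, img in frames:
        if (last_t is None
                or t - last_t >= MIN_INTERVAL_S
                or ground_distance_m(pos, last_p) >= MIN_DISTANCE_M):
            kept.append((t, pos, img))
            last_t, last_p = t, pos
    return kept
```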
  • In step S20, when the vehicle 2 is in the first state (Yes in step S20), the control unit 110 executes the process of specifying the stationary object information 123 in step S30. If the vehicle 2 is not in the first state (No in step S20), the control unit 110 waits to execute the specifying process of step S30 until the vehicle 2 is in the first state.
  • The "first state" is a state in which the processing load on the vehicle ECU 10 or the lamp ECU is considered to be small.
  • The "first state" includes, for example, a stopped state or a slow-moving state (for example, traveling at a speed of 10 km/h or less).
  • When the control unit 110 is configured as part of the vehicle ECU 10 or the lamp ECU, executing the specifying process of step S30 at a timing when the vehicle 2 is in the first state reduces the burden on the vehicle ECU 10 or the lamp ECU. If the control unit 110 is configured independently of the vehicle ECU 10 and the lamp ECU, the determination in step S20 may be omitted.
  • In step S30, the control unit 110 executes the specifying process of specifying the stationary object information 123 based on the image data 122.
  • In step S31, the control unit 110 detects light spots in the image.
  • A conventionally known technique can be used for the detection of light spots; for example, it can be performed by luminance analysis of the image.
  • In step S32, the control unit 110 performs pattern recognition processing on the image.
  • A conventionally known method can be used for pattern recognition.
  • For example, a machine learning model may be used to detect a stationary object, or a clustering method may be used. A toy sketch of the light spot detection of step S31 follows this list.
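  • A toy version of the luminance-based light spot detection might look as follows; the 8-bit grayscale input, the fixed threshold, and the use of scipy connected-component labelling are illustrative assumptions, not the patent's prescribed method.

```python
import numpy as np
from scipy import ndimage

def detect_light_spots(gray: np.ndarray, threshold: int = 200) -> list:
    """Return (row, col) centroids of bright connected regions in a grayscale image."""
    mask = gray >= threshold                 # simple luminance analysis
    labels, n = ndimage.label(mask)          # group bright pixels into spots
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))

image = np.zeros((100, 100), dtype=np.uint8)
image[10:13, 40:43] = 255                    # a synthetic street-light-like spot
print(detect_light_spots(image))             # -> [(11.0, 41.0)]
```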
  • In step S33, the control unit 110 determines whether or not there is a stationary object in the image based on the results of the processing in steps S31 and/or S32. If it is determined that there is no stationary object in the image (No in step S33), the control unit 110 deletes the image data 122 corresponding to that image from the storage unit 120 in step S35, and the process ends.
  • If it is determined that there is a stationary object (Yes in step S33), the control unit 110 specifies the stationary object position in step S34. The stationary object position is, for example, the position of the stationary object in the image.
  • The stationary object position can be specified, for example, using an arbitrary coordinate system set in the image.
  • The stationary object position may indicate, for example, the center point of the stationary object or the position of its outer edge.
  • The stationary object position preferably includes information about the size of the stationary object specified using the coordinate system.
  • FIG. 5 is a schematic diagram for explaining stationary object position information.
  • In FIG. 5, a sign O1 and street lights O2 to O4 are identified as stationary objects.
  • The stationary object position information can be defined by using coordinates defined by an x-axis and a y-axis to specify the positions of areas Z1 to Z4 that respectively include the sign O1 and the street lights O2 to O4.
  • the method of setting the coordinates is not particularly limited, and for example, the center of the image may be set as the origin.
  • the areas Z1 to Z4 do not include the post portions of the sign O1 and the street lights O2 to O4, but an area including those post portions may be set as the stationary object position.
  • the position of the stationary object specified in step S34 may indicate the distance and direction from the imaging position of the image to the stationary object.
  • The distance and direction from the imaging position to the stationary object may be calculated using depth information, may be calculated by comparison with other image data 122 captured near the imaging position, or may be calculated using data acquired from a millimeter wave radar or LiDAR.
  • When the stationary object position information is specified, the image data 122 may be deleted from the storage unit 120, or may be included in the stationary object information 123 in association with the stationary object position information. After step S34, the process proceeds to step S40 in FIG. 3.
  • When the specifying process shown in FIG. 3 is executed based on image data 122 captured in the daytime, when the illuminance is equal to or higher than the predetermined value, it becomes easier to grasp the outlines of structures in the image and to acquire color information of the structures from the image, so the detection accuracy of stationary objects by pattern recognition processing can be improved.
  • The specifying process for the stationary object information 123 may also be performed by comparing a plurality of pieces of image data 122 captured at the same point or at points close to each other. In step S34, the control unit 110 may specify the type of the stationary object based on the results of steps S31 and/or S32 and include the type information in the stationary object information 123. The specifying process in step S30 is not limited to the above examples; for example, techniques similar to the examples of the determination process in step S80 described later may be used. However, even if the techniques used are the same, the algorithm of the specifying process in step S30 and the algorithm of the determination process in step S80 differ.
  • In step S40, when the vehicle 2 is in the second state (Yes in step S40), the control unit 110, in step S50, transmits the stationary object information 123 and the vehicle position information 124 corresponding to the stationary object information 123 to the stationary object information storage device 200 having the storage unit 220. In step S50, time information, lighting information, illuminance information, and the like may also be transmitted together with these pieces of information. On the other hand, if the vehicle 2 is not in the second state (No in step S40), the control unit 110 waits to execute the transmission process of step S50 until the vehicle 2 is in the second state.
  • The "second state" is a state in which the processing load on the vehicle ECU 10 or the lamp ECU is considered to be small.
  • The "second state" includes, for example, a stopped state or a slow-moving state (for example, traveling at a speed of 10 km/h or less).
  • When the control unit 110 is configured as part of the vehicle ECU 10 or the lamp ECU, executing the transmission process of step S50 at a timing when the vehicle 2 is in the second state reduces the burden on the vehicle ECU 10 or the lamp ECU. If the control unit 110 is configured independently of the vehicle ECU 10 and the lamp ECU, the determination in step S40 may be omitted.
  • The stationary object information 123 transmitted in step S50 may be the stationary object image data of the image in which a stationary object was identified, the stationary object position information calculated from that image, or both.
  • If the stationary object image data is included, it can be further examined in the stationary object information storage device 200 to obtain more accurate information.
  • If the stationary object image data is not included in the transmitted stationary object information 123, the amount of data to be transmitted is advantageously small.
  • In step S60, the control unit 210 receives the various pieces of information transmitted in step S50.
  • In step S70, the control unit 210 associates the stationary object information 123 received in step S60 with the vehicle position information 124 corresponding to the stationary object information 123, and records them in the stationary object database 222.
  • The stationary object database 222 can record a plurality of pieces of stationary object position information in association with one imaging position.
  • The stationary object position information recorded here is information specified by the specifying process of step S30 or by the determination process of step S80 described later. When only one stationary object is identified at a certain imaging position, a single piece of stationary object position information can be associated with that imaging position.
  • FIGS. 6 and 7 show an example of information recorded in the stationary object database 222; some of this information may be omitted, and other information may be included.
  • the stationary object database 222 preferably includes both stationary object image data and stationary object position information from the viewpoint of increasing the accuracy of the determination process in step S80 and increasing the utility value of the stationary object database.
  • The stationary object database 222 may be managed, for example, as an individual database for each vehicle 2. In this case, the information accumulated in one database is based on information transmitted from one vehicle 2. The stationary object database 222 may instead be managed as a single database covering a plurality of vehicles 2; in this case, multiple pieces of information transmitted from multiple vehicles 2 are aggregated in one database.
  • The stationary object database 222 may also be managed as a database for each model of the vehicle 2.
  • In this case, a plurality of pieces of information transmitted from a plurality of vehicles 2 of the same model are aggregated.
  • Because the vehicle height of the model, the position of the sensor unit 31, and the like can then be taken into account in the determination process of step S80 described later, a more accurate determination can be made.
  • For this purpose, the stationary object information storage device 200 may be configured to receive vehicle model information of the vehicle 2 when it receives the stationary object information 123 and the like.
  • In step S80, the control unit 210 executes a determination process of determining, using an algorithm different from that of the specifying process in step S30, whether or not a stationary object is included in the image corresponding to the stationary object image data, and the process ends.
  • The determination process in step S80 is a process for reconfirming whether or not a stationary object exists in the image data 122 for which the stationary object information 123 was specified, for increasing the information accuracy of the stationary object database 222, and/or for identifying the stationary object position information and other details.
  • The light spot detection and pattern recognition techniques described for step S30 may be used in part of this processing.
  • FIG. 8 is a schematic diagram showing imaging timings and the respective images 122A to 122D acquired at those timings.
  • The visible camera captures the view ahead of the vehicle 2 at times T1, T2, T3, and T4, and outputs the image data 122 of the images 122A to 122D.
  • The intervals F1 to F4 between the times are all the same; that is, the images 122A to 122D are images captured at regular time intervals.
  • The headlights are on at times T1, T3, and T4, and off at time T2. The vehicle 2 is traveling forward at a predetermined speed between times T1 and T4.
  • Whether or not a stationary object present in an image corresponding to the image data 122 is a self-luminous body can be determined based on, for example, the image data 122 of at least two images captured before and after the switching timing of turning the headlights mounted on the vehicle 2 on and off.
  • Light points LP1 and LP2 are detected in the image 122A.
  • In the image 122B, the light point LP1 is detected, but no light point is detected at the position where the light point LP2 would be expected (the position indicated by the dotted line).
  • In the image 122C, the light point LP2 is detected again.
  • From this, it can be determined that the light point LP2 appears as a light point only because it reflects light from the headlamps, and is therefore not caused by a self-luminous body, whereas the light point LP1, which is detected even in the image 122B captured while the headlights are off, is caused by a self-luminous body. A sketch of this on/off comparison follows.
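  • The on/off comparison could be sketched as follows; matching spots by pixel distance with a fixed tolerance is an illustrative simplification, and the function name is invented.

```python
def classify_by_headlight_toggle(spots_on, spots_off, tol=5.0):
    """spots_on / spots_off: (x, y) light points detected with headlights on / off.

    A spot still visible with the headlights off is treated as self-luminous;
    a spot that disappears is treated as a reflective (non-self-luminous) object.
    """
    def near(p, candidates):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
                   for q in candidates)
    return {p: ("self-luminous" if near(p, spots_off) else "reflective")
            for p in spots_on}

# FIG. 8 analogue: LP1 persists in image 122B (lights off), LP2 does not.
print(classify_by_headlight_toggle([(10, 20), (50, 60)], [(10, 21)]))
# -> {(10, 20): 'self-luminous', (50, 60): 'reflective'}
```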
  • Determination by comparing a plurality of images can be performed, for example, by comparing images captured at the same position or at positions close to each other.
  • The image 122C' is an image captured at the same position as the image 122C, before the image 122C.
  • In the image 122C, light points LP3 and LP4 are detected in addition to the light points LP1 and LP2.
  • In the image 122C', the light points LP1 and LP2 are detected, but the light points LP3 and LP4 are not. If the light points LP3 and LP4 were stationary objects, they would also be detected in the image 122C'; since they are not actually detected there, it can be determined that the light points LP3 and LP4 are not stationary objects. Conversely, it can be determined that the light points LP1 and LP2, detected at the same positions in both the image 122C and the image 122C', are caused by stationary objects.
  • The light spots detected from each image may include not only light spots caused by stationary objects but also light spots caused by moving objects such as other vehicles.
  • The position of each detected light spot is therefore compared between a plurality of images, and if the position of a light spot does not change, or the relative positions between light spots do not change, it is determined that the light spot is caused by a stationary object.
  • It is also possible to identify stationary objects by determining that a light spot whose position has changed significantly is a light spot of a moving object.
  • For example, when a light spot presumed to be a stationary object is specified based on the image 122C' captured at a certain position, and the same light spot is also identified from the image 122C captured at that position when the vehicle 2 passes through it again after the image 122C' was captured, or when another vehicle 2 passes through it, the light spot can be determined to be a stationary object and recorded as stationary object information 123. A sketch of this same-position confirmation follows.
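  • In the sketch below, light points are matched by a simple pixel tolerance; confirm_stationary is an invented name, and real matching would have to account for slight differences in imaging position and orientation.

```python
def confirm_stationary(spots_first_pass, spots_second_pass, tol=5.0):
    """Keep only light points re-detected at the same image position on both passes."""
    return [p for p in spots_second_pass
            if any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
                   for q in spots_first_pass)]

# LP1 and LP2 appear on both passes; LP3 and LP4 only on the second one.
print(confirm_stationary([(10, 20), (30, 40)],
                         [(10, 20), (30, 41), (70, 80), (90, 95)]))
# -> [(10, 20), (30, 41)]
```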
  • The determination by comparing a plurality of pieces of image data 122 may also be performed by comparing them along the time series in which the images were captured, based on the amount of movement of each light spot between the images and the traveling speed of the vehicle 2. For example, the amount of movement of the light points LP3 and LP4 between the images 122C and 122D in the example of FIG. 8 is greater than that of the light points LP1 and LP2. If the amount of movement of the light points LP3 and LP4 is greater than the amount estimated from the traveling speed of the vehicle 2, the light points LP3 and LP4 are considered to be moving toward the vehicle 2 and can be identified as originating from moving objects.
  • If the amount of movement of the light points LP1 and LP2 is equal to the amount estimated from the traveling speed of the vehicle 2, the light points LP1 and LP2 can be identified as caused by stationary objects. If their amount of movement is smaller than the estimated amount, the light points LP1 and LP2 are considered to be moving in the same direction as the vehicle 2 and are identified as originating from moving objects. A sketch of this movement-amount test follows.
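  • The movement-amount test could be sketched as follows; the linear pixels-per-metre model and the tolerance are crude illustrative assumptions (a real system would use the camera geometry and the spot's image position):

```python
def classify_by_motion(observed_shift_px: float, speed_mps: float,
                       frame_dt_s: float, px_per_m: float = 2.0,
                       rel_tol: float = 0.3) -> str:
    """Compare a light point's image shift with the shift ego-motion would cause."""
    expected = speed_mps * frame_dt_s * px_per_m  # shift of a truly fixed object
    if abs(observed_shift_px - expected) <= rel_tol * expected:
        return "stationary object"                # LP1/LP2-like behaviour
    if observed_shift_px > expected:
        return "moving object (oncoming)"         # LP3/LP4-like behaviour
    return "moving object (same direction)"

for shift in (20.0, 45.0, 5.0):
    print(shift, "->", classify_by_motion(shift, speed_mps=10.0, frame_dt_s=1.0))
```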
  • The determination process in step S80 may also use a statistical method. For example, it is determined, for each of the images corresponding to a plurality of pieces of stationary object image data recorded in association with a certain imaging position, whether or not a stationary object is included, and a final determination as to whether or not there is a stationary object at that imaging position is made based on the ratio of the stationary object image data determined to contain a stationary object to the entirety of the plurality of pieces of stationary object image data. Stationary object position information recorded in association with a certain imaging position can be determined in the same manner. A sketch of this ratio test follows.
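  • In the sketch below, the decision threshold min_ratio is an assumed parameter; the disclosure does not fix a particular ratio.

```python
def final_decision(per_image_results, min_ratio=0.8):
    """per_image_results: one boolean per recorded image at the same position."""
    ratio = sum(per_image_results) / len(per_image_results)
    return ratio >= min_ratio, ratio

print(final_decision([True, True, True, False, True]))  # -> (True, 0.8)
```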
  • FIG. 9 is a flowchart showing an example of determination processing in step S80.
  • The control unit 210 first acquires reference image data of a reference image captured at the same position as the target image to be determined.
  • The reference image is, for example, an image captured by the visible camera of the vehicle 2 when the vehicle 2 passes again through the imaging position of the image corresponding to the image data 122 for which the stationary object information 123 was specified, while the illuminance sensor 33 is outputting a signal indicating that the illuminance is equal to or higher than the predetermined value.
  • The reference image may also be, among other images captured at the same position as the target image, an image whose illuminance indicated by the associated illuminance information is equal to or greater than the predetermined value.
  • In step S182, the control unit 210 identifies the position of the stationary object in the reference image indicated by the reference image data.
  • For this identification, the same techniques as in steps S31 to S34 may be used.
  • In step S183, the control unit 210 determines whether or not the stationary object position in the reference image matches the stationary object position in the target image, which is the image for which the stationary object information 123 has already been specified. If they match (Yes in step S183), the control unit 210 determines that the specified stationary object information 123 is correct, and the process ends.
  • If they do not match (No in step S183), the control unit 210 updates the stationary object information 123 in the stationary object database 222 in step S184, and the process ends.
  • In step S184, for example, the positions of stationary objects that match between the reference image and the target image are determined to be correct, the positions of stationary objects that do not match between the reference image and the target image are determined to be erroneous, and the stationary object database 222 is updated accordingly. A condensed sketch of this matching follows.
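  • The sketch below assumes stationary object positions are boxes matched by intersection-over-union; the matching criterion, the threshold, and the function names are illustrative choices, not taken from the disclosure.

```python
def iou(a, b):
    """a, b: (x, y, w, h) boxes; intersection-over-union."""
    ix = max(0.0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def confirm_against_reference(target_boxes, reference_boxes, iou_min=0.5):
    """Keep target positions also found in the daytime reference image."""
    return [t for t in target_boxes
            if any(iou(t, r) >= iou_min for r in reference_boxes)]

# A street light matches between both images; a tail lamp box does not.
print(confirm_against_reference([(10, 10, 5, 5), (60, 40, 4, 4)], [(10, 11, 5, 5)]))
# -> [(10, 10, 5, 5)]
```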
  • FIG. 10 is a schematic diagram showing an example of the reference image 122E used for the determination process shown in FIG.
  • FIG. 11 is a schematic diagram showing an example of the target image 122F used in the determination process shown in FIG.
  • The reference image 122E is an image captured during the daytime, and the target image 122F is an image captured during the nighttime.
  • In the reference image 122E, the sign O1 and the street lights O2 to O4 are identified as stationary objects by the process of step S182.
  • In the reference image 122E, the other vehicle C1, which is the preceding vehicle, has stopped with its hazard lamps on, so its rear lamps BL1 and BL2 are lit.
  • It is assumed that the rear lamps BL1 and BL2 are consequently also erroneously identified as stationary objects by this process.
  • The oncoming vehicle C2 has its headlights HL1 and HL2 turned off because it is daytime, so the headlights HL1 and HL2 are not identified as stationary objects.
  • In the target image 122F, the process of step S30 identifies the sign O1 and the street lights O2 to O4 as stationary objects, and their surroundings as stationary object regions Z11 to Z14. Since it is nighttime, the preceding vehicle has its tail lamps on, and its rear lamps BL3 and BL4 are lit; as a result, it is assumed that the rear lamps BL3 and BL4 are also erroneously detected as stationary objects by the process of step S30. Similarly, another vehicle C4, which is an oncoming vehicle, has its headlights HL3 and HL4 turned on, and the headlights HL3 and HL4 are erroneously detected as stationary objects. Although not shown, the surroundings of the rear lamps BL3 and BL4 and of the headlights HL3 and HL4 are also identified as stationary object regions.
  • The sign O1 and the street lights O2 to O4 are present at the same positions in both the reference image 122E and the target image 122F; they are therefore determined to be stationary objects.
  • The rear lamps BL1 to BL4 and the headlights HL3 and HL4 are each present in only one of the two images; it is therefore determined that the rear lamps BL1 to BL4 and the headlights HL3 and HL4 are not stationary objects.
  • The stationary object information 123 is accordingly updated so that the rear lamps BL3 and BL4 and the headlights HL3 and HL4 are not recorded as stationary objects.
  • FIG. 12 is a flowchart showing another example of the determination process in step S80.
  • In step S81, the control unit 210 determines an imaging position to be processed and acquires, from the stationary object database 222, a plurality of items of stationary object image data captured at that imaging position.
  • In step S82, the control unit 210 selects the image with the fewest light spots from among the images corresponding to the plurality of items of stationary object image data acquired in step S81.
  • For example, the image 122C' is selected from the images 122C and 122C'.
  • A method similar to that of step S31 may be used to detect the light spots in each image.
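The following minimal sketch illustrates the selection criterion of steps S81 and S82, assuming a light spot can be approximated as a connected region of pixels above a brightness threshold; the threshold value and the use of SciPy's connected-component labelling are assumptions for this example and do not represent the detection method of step S31 itself.

```python
# Sketch of steps S81-S82: among images captured at the same imaging position,
# select the one with the fewest light spots.
import numpy as np
from scipy import ndimage

def count_light_spots(gray_image, threshold=200):
    bright = gray_image > threshold        # binary mask of bright pixels
    _, num_spots = ndimage.label(bright)   # count connected bright regions
    return num_spots

def select_fewest_spots(images):
    return min(images, key=count_light_spots)

# Two synthetic frames of the same place: frame_a has three light spots
# (a street light plus a pair of headlights), frame_b has only the street light.
frame_a = np.zeros((100, 100), dtype=np.uint8)
frame_a[10:13, 10:13] = 255
frame_a[50:53, 40:43] = 255
frame_a[50:53, 60:63] = 255
frame_b = np.zeros((100, 100), dtype=np.uint8)
frame_b[10:13, 10:13] = 255
selected = select_fewest_spots([frame_a, frame_b])
print(count_light_spots(selected))  # 1, i.e., frame_b was selected
```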
  • In step S83, the control unit 210 determines whether or not a stationary object is present in the image selected in step S82.
  • A conventionally known technique such as pattern recognition processing can be used to determine whether or not a stationary object is present.
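As one example of such pattern recognition processing, the sketch below uses template matching on synthetic data; the choice of OpenCV, the score threshold, and the synthetic "sign" are illustrative assumptions only, not the technique prescribed by the embodiment.

```python
# Sketch of the presence check in step S83 using template matching.
import numpy as np
import cv2

def contains_stationary_object(gray_image, template, score_threshold=0.8):
    scores = cv2.matchTemplate(gray_image, template, cv2.TM_CCOEFF_NORMED)
    return float(scores.max()) >= score_threshold

rng = np.random.default_rng(0)
image = rng.integers(0, 20, size=(100, 100)).astype(np.uint8)  # dim background
image[20:30, 40:50] = 255               # bright rectangle standing in for a sign
template = image[18:32, 38:52].copy()   # template of the sign and its surroundings
print(contains_stationary_object(image, template))  # True
```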
  • If no stationary object is present (No in step S83), in step S86 the control unit 210 deletes the image data 122 corresponding to that image from the stationary object database 222 and ends the process. The other stationary object image data acquired in step S81 may also be deleted.
  • If a stationary object is present (Yes in step S83), in step S84 the control unit 210 identifies the stationary object region or the stationary object position in the image.
  • The data size can be reduced by identifying the stationary object region and using only the portion of the image containing that region as the stationary object image data. In this case, it is preferable to also record information indicating the position of the stationary object region within the original image.
  • Alternatively, the stationary object image data may be processed so as to reduce the amount of data in the area outside the stationary object region.
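The sketch below illustrates the two data-reduction options just described, assuming a simple record layout; the field names and region coordinates are hypothetical.

```python
# Sketch of the data-size reduction in step S84: keep only the portion of the
# image containing the stationary object region together with its offset in
# the original frame, or alternatively blank out everything outside the region.
import numpy as np

def crop_stationary_region(image, top, left, height, width):
    """Return the cropped stationary object image data plus the offset
    needed to locate it in the original image later."""
    patch = image[top:top + height, left:left + width].copy()
    return {"data": patch, "top": top, "left": left}

def blank_outside_region(image, top, left, height, width):
    """Keep the full frame but zero the pixels outside the stationary object
    region, so that the frame compresses to a much smaller size."""
    out = np.zeros_like(image)
    out[top:top + height, left:left + width] = \
        image[top:top + height, left:left + width]
    return out

frame = np.random.default_rng(0).integers(0, 255, (720, 1280)).astype(np.uint8)
record = crop_stationary_region(frame, top=40, left=600, height=64, width=64)
print(record["data"].shape, (record["top"], record["left"]))  # (64, 64) (40, 600)
```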
  • In step S85, the control unit 210 updates the stationary object database 222 so as to include the content identified in step S84, and ends the process. Even if the imaging times differ, the position of a light spot caused by a stationary object does not change, whereas a light spot not found in the other images can be presumed to be caused by a moving object. Therefore, with the method described with reference to FIG. 12, the processing load can be reduced compared with identifying the position of a stationary object in each of a plurality of images.
  • The present invention is not limited to the above-described embodiments and can be modified or improved as appropriate.
  • The material, shape, dimensions, numerical values, form, number, and arrangement of each component in the above-described embodiments are arbitrary and are not limited, as long as the present invention can be achieved.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

A stationary object information collection system (1) comprises a stationary object information acquisition device (100) and a stationary object information storage device (200). The stationary object information acquisition device (100) comprises: an image acquisition unit (111) that acquires image data (122); a specifying unit (112) that specifies stationary object information (123) on the basis of the image data (122); and a transmitting unit (113) that transmits, to the stationary object information storage device (200), the stationary object information (123) and the vehicle position information (124) from the time when the image corresponding to the image data (122) was captured. The stationary object information storage device (200) comprises: a receiving unit (211) that receives the stationary object information (123) and the vehicle position information (124); and a recording unit (212) that associates the stationary object information (123) with the vehicle position information (124) from the time when the image corresponding to the image data (122), for which the stationary object information (123) was specified, was captured, and stores them in a stationary object database (222).
PCT/JP2022/027594 2021-07-16 2022-07-13 Système de collecte d'informations d'objet stationnaire, programme et procédé de stockage d'informations d'objet stationnaire WO2023286811A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023534838A JPWO2023286811A1 (fr) 2021-07-16 2022-07-13
CN202280050076.8A CN117677996A (zh) 2021-07-16 2022-07-13 静止物信息积累系统、程序以及静止物信息存储方法

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2021117828 2021-07-16
JP2021-117828 2021-07-16
JP2021-117824 2021-07-16
JP2021117824 2021-07-16

Publications (1)

Publication Number Publication Date
WO2023286811A1 true WO2023286811A1 (fr) 2023-01-19

Family

ID=84920271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/027594 WO2023286811A1 (fr) 2021-07-16 2022-07-13 Système de collecte d'informations d'objet stationnaire, programme et procédé de stockage d'informations d'objet stationnaire

Country Status (2)

Country Link
JP (1) JPWO2023286811A1 (fr)
WO (1) WO2023286811A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015108604A (ja) * 2013-12-06 2015-06-11 日立オートモティブシステムズ株式会社 車両位置推定システム,装置,方法、及び、カメラ装置
WO2020045317A1 (fr) * 2018-08-31 2020-03-05 株式会社デンソー Système de carte, procédé et support d'enregistrement

Also Published As

Publication number Publication date
JPWO2023286811A1 (fr) 2023-01-19

Similar Documents

Publication Publication Date Title
US8103422B2 (en) Method for the anticipated ascertainment of a bend on a portion of road, and associated system
JP6572930B2 (ja) 情報処理装置及び情報処理システム
JP5820843B2 (ja) 周囲環境判定装置
US20170144585A1 (en) Vehicle exterior environment recognition apparatus
US8543254B1 (en) Vehicular imaging system and method for determining roadway width
US8232895B2 (en) Vehicle detection apparatus, vehicle detection program and light control apparatus
US20170144587A1 (en) Vehicle exterior environment recognition apparatus
US9042600B2 (en) Vehicle detection apparatus
US9102265B2 (en) Method and device for the distance-based debouncing of light-characteristic changes
US20050036660A1 (en) Image processing system and vehicle control system
US20170001564A1 (en) Vehicle on board system and method for the detection of objects in an environment surrounding a vehicle
KR102164461B1 (ko) 적응형 하이 빔 제어 장치를 갖는 이미징 시스템
JP2014160419A (ja) 周辺車両識別システム、特徴量送信装置、及び周辺車両識別装置
US10885359B2 (en) Non-transitory storage medium storing image transmission program, image transmission method, in-vehicle device, vehicle, and image processing system
RU2691939C1 (ru) Система управления передними фарами
JP6835149B2 (ja) 情報処理装置及び情報処理システム
KR20080004833A (ko) 주간 및 야간 주행 차량을 조도상황에 따라 검출하는 방법및 장치
JP2021128705A (ja) 物体状態識別装置
WO2023286811A1 (fr) Système de collecte d'informations d'objet stationnaire, programme et procédé de stockage d'informations d'objet stationnaire
JP7255706B2 (ja) 信号機認識方法及び信号機認識装置
JP6947316B2 (ja) 劣化診断装置、劣化診断システム、劣化診断方法、プログラム
WO2023286806A1 (fr) Dispositif d'acquisition d'informations d'objet fixe, programme, et procédé d'acquisition d'informations d'objet fixe
JP6151569B2 (ja) 周囲環境判定装置
WO2023286810A1 (fr) Dispositif d'utilisation d'informations d'objet fixe, programme, procédé d'utilisation d'informations d'objet fixe, système de véhicule et système d'utilisation d'informations d'objet fixe
JP2020181310A (ja) 車両の照明制御システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22842153
    Country of ref document: EP
    Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 2023534838
    Country of ref document: JP
WWE Wipo information: entry into national phase
    Ref document number: 202280050076.8
    Country of ref document: CN
NENP Non-entry into the national phase
    Ref country code: DE