CN117677996A - Stationary object information accumulation system, program, and stationary object information storage method - Google Patents

Stationary object information accumulation system, program, and stationary object information storage method

Info

Publication number: CN117677996A
Application number: CN202280050076.8A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 神谷美纱子, 片冈拓弥
Assignee (current and original): Koito Manufacturing Co Ltd
Application filed by Koito Manufacturing Co Ltd
Priority claimed from PCT/JP2022/027594 (WO2023286811A1)
Legal status: Pending

Abstract

A stationary object information accumulation system (1) includes a stationary object information acquisition device (100) and a stationary object information storage device (200). The stationary object information acquisition device (100) includes: an image acquisition unit (111) that acquires image data (122); a determination unit (112) that determines stationary object information (123) based on the image data (122); and a transmitting unit (113) that transmits, to the stationary object information storage device (200), the stationary object information (123) and the vehicle position information (124) at the time when the image corresponding to the image data (122) was captured. The stationary object information storage device (200) includes: a receiving unit (211) that receives the stationary object information (123) and the vehicle position information (124); and a recording unit (212) that records, in a stationary object database (222), the stationary object information (123) in association with the vehicle position information (124) at the time when the image corresponding to the image data (122) from which the stationary object information (123) was determined was captured.

Description

Stationary object information accumulation system, program, and stationary object information storage method
Technical Field
The present disclosure relates to a stationary object information accumulation system, a program, and a stationary object information storage method.
Background
In recent years, ADB (Adaptive Driving Beam) technology has been proposed, which shields or dims the light directed at the position of a forward vehicle or of a highly reflective object such as a sign, based on the surrounding situation of the vehicle. For example, Patent Document 1 describes detecting a forward vehicle and controlling the forward light distribution.
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Laid-Open Publication No. 2011-246023
Disclosure of Invention
Problems to be solved by the invention
Generally, the light distribution control of the ADB is performed based on target object information transmitted from the vehicle. Each target object is detected by a specific algorithm based on data acquired by a sensor such as a camera, but depending on the accuracy of the data or of the detection algorithm, an object may fail to be detected even though it is present (missed detection), or may be detected even though it is not present (false detection).
Incidentally, if stationary objects with high brightness such as street lamps and signs are present on the road, these stationary objects may be erroneously recognized as forward vehicles. Conversely, the headlight or the like of a forward vehicle may be erroneously recognized as a street lamp or the like. If information on stationary objects such as street lamps and signs on roads could be collected and put to use, the possibility of such erroneous recognition could be reduced, which would be advantageous.
An object of the present disclosure is to collect and accumulate stationary object information on stationary objects, such as street lamps and signs, on a road.
Means for solving the problems
A stationary object information accumulation system according to an embodiment of the present disclosure includes a stationary object information acquisition device mounted on a vehicle and a stationary object information storage device disposed outside the vehicle,
The stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data, which corresponds to an image, or a part of an image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information, which indicates a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits, to the stationary object information storage device by wireless communication, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured,
the stationary object information storage device includes:
a receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording unit that records, in a stationary object database, the stationary object information in association with the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured.
A program according to an embodiment of the present disclosure is a program executed in a computer device that includes a processor and is capable of being communicatively connected to a stationary object information acquisition device mounted on a vehicle,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data, which corresponds to an image, or a part of an image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information, which indicates a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits, to the stationary object information storage device by wireless communication, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured,
the program causes the processor to perform the steps of:
A receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording step of recording, in a stationary object database, the stationary object information in association with the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured.
A stationary object information storage method according to an aspect of the present disclosure is a stationary object information storage method executed in a computer device that includes a processor and is capable of being communicatively connected to a stationary object information acquisition device mounted on a vehicle,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data, which corresponds to an image, or a part of an image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information, which indicates a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits, to the stationary object information storage device by wireless communication, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured,
the stationary object information storage method includes causing the processor to execute the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording step of recording, in a stationary object database, the stationary object information in association with the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured.
A stationary object information accumulation system according to another aspect of the present disclosure includes a stationary object information acquisition device mounted on a vehicle, and a stationary object information storage device capable of being communicatively connected to the stationary object information acquisition device,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including stationary object image data corresponding to an image, or a part of an image, in which one or more of a self-luminous body, a sign, a delineator, and a guardrail is present; and
a transmitting unit that transmits, to the stationary object information storage device, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured,
the stationary object information storage device includes:
a receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit;
a recording unit that records, in a stationary object database, the stationary object information in association with the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured; and
a determination unit that determines whether or not the stationary object is included in the image corresponding to the stationary object image data, using an algorithm different from the algorithm by which the determination unit of the stationary object information acquisition device determines the stationary object information.
A program according to another aspect of the present disclosure is a program executed in a computer device that includes a processor and is capable of being communicatively connected to a stationary object information acquisition device mounted on a vehicle,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including stationary object image data corresponding to an image, or a part of an image, in which one or more of a self-luminous body, a sign, a delineator, and a guardrail is present; and
a transmitting unit that transmits, to the stationary object information storage device, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured,
the program causes the processor to perform the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit;
a recording step of recording, in a stationary object database, the stationary object information in association with the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured; and
a determination step of determining whether or not the stationary object is included in the image corresponding to the stationary object image data, using an algorithm different from the algorithm by which the determination unit of the stationary object information acquisition device determines the stationary object information.
A stationary object information storage method according to another aspect of the present disclosure is a stationary object information storage method executed in a computer device that includes a processor and is capable of being communicatively connected to a stationary object information acquisition device mounted on a vehicle,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including stationary object image data corresponding to an image, or a part of an image, in which one or more of a self-luminous body, a sign, a delineator, and a guardrail is present; and
a transmitting unit that transmits, to the stationary object information storage device, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured,
the stationary object information storage method includes causing the processor to execute the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit;
a recording step of recording, in a stationary object database, the stationary object information in association with the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was determined was captured; and
a determination step of determining whether or not the stationary object is included in the image corresponding to the stationary object image data, using an algorithm different from the algorithm by which the determination unit of the stationary object information acquisition device determines the stationary object information.
Effects of the invention
According to the present disclosure, it is possible to collect and accumulate stationary object information of stationary objects such as street lamps, signs, and the like on a road.
Drawings
Fig. 1 is a schematic diagram showing an example of a stationary object information accumulation system according to an embodiment of the present disclosure.
Fig. 2 is a block diagram showing an example of a stationary object information accumulation system according to an embodiment of the present disclosure.
Fig. 3 is a flowchart showing an example of a stationary object information storage method according to an embodiment of the present disclosure.
Fig. 4 is a flowchart showing an example of the process of determining the stationary object information shown in fig. 3.
Fig. 5 is a schematic diagram for explaining the stationary object position information.
Fig. 6 is an example of a stationary object information database.
Fig. 7 is an example of a stationary object information database.
Fig. 8 is a schematic diagram showing shooting timings and respective images acquired at the respective shooting timings.
Fig. 9 is a flowchart showing an example of the process of determining the stationary object information.
Fig. 10 is a schematic diagram showing an example of a reference image used in the process of determining the stationary object information.
Fig. 11 is a schematic diagram showing an example of an object image used in the determination processing of the stationary object information.
Fig. 12 is a flowchart showing another example of the stationary object information determination process.
Detailed Description
The present invention will be described below based on embodiments with reference to the drawings. The same or equivalent components, members, and processes shown in the drawings are denoted by the same reference numerals, and repeated description thereof will be omitted as appropriate. The embodiments do not limit the invention but are merely examples, and not all of the features and combinations thereof described in the embodiments are necessarily essential to the invention.
(System)
First, a system 1 according to an embodiment of the present disclosure will be described with reference to fig. 1 and fig. 2. Fig. 1 is a schematic diagram illustrating the system 1 according to the embodiment of the present disclosure. As shown in fig. 1, the system 1 includes a stationary object information storage device 200 and a plurality of vehicles 2, such as a vehicle 2A and a vehicle 2B, each having a stationary object information acquisition device 100 mounted thereon. The stationary object information storage device 200 and each vehicle 2 can be communicatively connected to each other by wireless communication. The system 1 is an example of the stationary object information accumulation system according to the present disclosure.
The stationary object information acquisition device 100 acquires stationary object information related to stationary objects and transmits the stationary object information to the stationary object information storage device 200. The stationary object information storage device 200 accumulates, for example, the stationary object information received from each stationary object information acquisition device 100. The stationary object information storage device 200 analyzes the received stationary object information, for example, to improve the accuracy of the stationary object information, to acquire more detailed information, or to create a light distribution pattern based on the stationary object information. The stationary object information storage device 200 transmits the stationary object information and the like with improved accuracy to each vehicle 2, for example, in response to a request from each vehicle 2. In each vehicle 2, by using the stationary object information and the like with improved accuracy received from the stationary object information storage device 200, it is possible, for example, to improve the accuracy and efficiency of target object detection or to appropriately perform light distribution control of the headlights.
Here, the "stationary object" in the present embodiment refers to an object fixed to a road and having high brightness, and specifically, one or more of a self-luminous body (for example, a street lamp, a signal, etc.), a sign, a logo, and a guardrail. That is, the stationary object information acquisition device 100 in the present embodiment acquires stationary object information related to various stationary objects listed as specific examples. In addition, as another embodiment, the stationary object information acquisition device 100 may be configured to be able to identify an object that is not included in the specific example described above, that is, an object that is fixed to a road and has high brightness and that can affect the detection of a target object, as a stationary object.
Fig. 2 is a block diagram illustrating the system 1 according to the embodiment of the present disclosure. The vehicle 2 includes a vehicle ECU (Electronic Control Unit) 10, a storage unit 20, a sensor unit 31, a position information acquisition unit 32, an illuminance sensor 33, and a stationary object information acquisition device 100. The vehicle 2 can be connected to the stationary object information storage device 200 by wireless communication via the communication network 3. The means of wireless communication is not particularly limited, and may be, for example, a vehicle telematics system, cooperation with a smartphone, or use of in-vehicle Wi-Fi.
The vehicle ECU10 controls various operations of the vehicle 2 such as traveling. The vehicle ECU10 includes a processor such as an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a general-purpose CPU (Central Processing Unit), for example. The storage unit 20 includes, for example, a ROM (Read Only Memory) in which various vehicle control programs are stored and a RAM (Random Access Memory) in which various vehicle control data are temporarily stored. The processor of the vehicle ECU10 loads a program designated from among the various vehicle control programs stored in the ROM into the RAM, and controls various operations of the vehicle 2 in cooperation with the RAM.
The sensor unit 31 outputs image data of an image obtained by capturing the outside of the vehicle 2. The sensor unit 31 includes, for example, one or more sensors selected from a visible-light camera, a LiDAR, and a millimeter wave radar. The image data output by the LiDAR and the millimeter wave radar may be data of three-dimensional images. The position information acquisition unit 32 outputs vehicle position information indicating the current position of the vehicle 2. The position information acquisition unit 32 includes, for example, a GPS (Global Positioning System) sensor. The illuminance sensor 33 detects and outputs the illuminance around the vehicle 2.
The stationary object information acquiring apparatus 100 includes a control unit 110 and a storage unit 120. The control unit 110 is configured by a processor such as a CPU, for example. The control unit 110 may be configured as a part of a lamp ECU that controls the operation of a lamp such as a headlight in the vehicle 2. The control unit 110 may be configured as a part of the vehicle ECU10, for example. The storage unit 120 is configured by, for example, ROM, RAM, or the like. The storage unit 120 may be configured as a part of the storage unit 20 or a storage device provided for the lamp ECU.
The control unit 110 functions as an image acquisition unit 111, a determination unit 112, a transmission/reception unit 113, and a determination unit 114 by reading the program 121 stored in the storage unit 120. Some of these functions may be realized by the vehicle ECU10 or the lamp ECU; in such a configuration, the vehicle ECU10 or the lamp ECU constitutes a part of the stationary object information acquisition device 100. The program 121 may also be recorded in a non-transitory computer-readable medium.
The image acquisition unit 111 acquires the image data 122 of the image captured by the sensor unit 31. The acquired image data 122 is stored in the storage unit 120. In addition, the image acquisition unit 111 acquires, from the position information acquisition unit 32, the vehicle position information 124 at the time when the image corresponding to the acquired image data 122 was captured (that is, shooting position information indicating the shooting position of the image). The vehicle position information 124 preferably includes information indicating the orientation of the vehicle 2 at the time the image was captured. The vehicle position information 124 may also include information indicating the position of the vehicle in the vehicle width direction. The position of the vehicle in the vehicle width direction may be calculated based on, for example, a detected traveling lane. The acquired vehicle position information 124 is stored in the storage unit 120, for example in association with the corresponding image data 122.
The image acquisition unit 111 may acquire time information indicating the time when the image was captured. The time information may include information indicating the date and time of the image captured. The image acquisition unit 111 may acquire lighting information on whether or not the headlight of the vehicle 2 is lighted when the image is captured. The time information and the lighting information are stored in the storage unit 120 in association with the corresponding image data 122, for example.
The image acquisition unit 111 can acquire, as reference image data, image data 122 captured when the illuminance sensor 33 outputs a signal indicating illuminance equal to or higher than a predetermined value (for example, 1000 lux). Here, the illuminance equal to or higher than the predetermined value is, for example, an illuminance equal to or higher than a value determined to be the daytime. That is, the image acquisition unit 111 can store the image data 122 of the image captured during the daytime as reference image data in the storage unit 120. The image acquisition unit 111 may acquire illuminance information indicating illuminance around the vehicle 2 when the image is captured from the illuminance sensor 33, and store the image data 122 in the storage unit 120 in association with the illuminance information. In this case, the image data 122 whose illuminance indicated by the associated illuminance information is equal to or greater than a predetermined value can be used as the reference image data.
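As an illustration only, the selection of reference image data described above might be sketched as follows in Python; the FrameRecord structure, its field names, and the threshold constant are assumptions made for this sketch, not data formats defined by the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

DAYTIME_ILLUMINANCE_LX = 1000  # threshold from the example above; assumed to be configurable


@dataclass
class FrameRecord:
    """Hypothetical container associating one captured frame with its metadata."""
    image_data: bytes                          # image data 122
    vehicle_position: Tuple[float, float, float]  # (latitude, longitude, heading) from unit 32
    captured_at: float                         # time information (epoch seconds)
    headlamp_on: bool                          # lighting information
    illuminance_lx: Optional[float]            # output of illuminance sensor 33, if available


def is_reference_candidate(frame: FrameRecord) -> bool:
    """A frame captured under daytime illuminance can serve as reference image data."""
    return (frame.illuminance_lx is not None
            and frame.illuminance_lx >= DAYTIME_ILLUMINANCE_LX)
```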
The determination section 112 determines the stationary object information 123 based on the image data 122. The stationary object information 123 specified by the specifying unit 112 is stored in the storage unit 120. Here, the "stationary object information" is information including at least one of stationary object image data corresponding to an image in which a stationary object exists or a part of the image, and stationary object position information indicating a position of the stationary object calculated based on the image data 122.
The determination unit 112 detects a stationary object in an image by, for example, image analysis, and includes image data 122 of an image in which the stationary object is detected as stationary object image data in the stationary object information 123. The determination unit 112 determines, for example, a region including a stationary object in an image in which the stationary object is detected as a stationary object region, and includes data corresponding to the stationary object region as part of the image as stationary object image data in the stationary object information 123. The determination unit 112 calculates the position of the stationary object based on, for example, the image in which the stationary object is detected, and includes stationary object position information indicating the position of the stationary object in the stationary object information 123. The stationary object position information may be information indicating the position of the stationary object in the image (for example, the coordinates and the size of the position of the stationary object in the image), or information indicating the distance and the direction from the imaging position of the image to the stationary object. The determination unit 112 may determine the type of the stationary object and include information indicating the type in the stationary object information 123.
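For illustration, the content of the stationary object information 123 described above could be modeled roughly as below; all class and field names are hypothetical and chosen only to mirror the description, not an actual data format of the disclosure.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple


class StationaryObjectType(Enum):
    SELF_LUMINOUS = "self-luminous body"   # e.g. street lamp, traffic signal
    SIGN = "sign"
    DELINEATOR = "delineator"
    GUARDRAIL = "guardrail"


@dataclass
class StationaryObjectPosition:
    """Position either within the image or relative to the shooting position."""
    region_xy: Optional[Tuple[int, int, int, int]] = None  # (x, y, width, height) in image coordinates
    distance_m: Optional[float] = None                     # distance from the shooting position
    bearing_deg: Optional[float] = None                    # direction from the shooting position


@dataclass
class StationaryObjectInfo:
    """Stationary object information 123: image data and/or position information."""
    object_image_data: Optional[bytes] = None              # whole image or cropped stationary object region
    positions: List[StationaryObjectPosition] = field(default_factory=list)
    object_type: Optional[StationaryObjectType] = None
```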
The transmitting/receiving unit 113 transmits and receives information to and from the vehicle ECU10 and the stationary object information storage device 200. That is, the transmitting/receiving unit 113 functions as a transmitting unit and a receiving unit. The transmitting/receiving unit 113 transmits the stationary object information 123 and the vehicle position information 124 corresponding to the stationary object information 123 (that is, the vehicle position information at the time when the image corresponding to the image data 122 from which the stationary object information 123 was determined was captured) to the stationary object information storage device 200, which includes the storage unit 220. In addition, the transmitting/receiving unit 113 may transmit the reference image data to the stationary object information storage device 200. The transmitting/receiving unit 113 may also transmit and receive other information to and from the stationary object information storage device 200 as needed.
The determination unit 114 determines whether or not a stationary object is present at the position indicated by the stationary object position information calculated by the determination unit 112, based on reference image data captured at the same position as the shooting position of the image corresponding to the image data 122 used for calculating the stationary object position. The reference image data used for this determination is, for example, image data 122 acquired by the image acquisition unit 111 of an image captured when the vehicle 2 passes the position indicated by the vehicle position information 124 corresponding to the image data 122 from which the stationary object information was determined by the determination unit 112, and when the illuminance sensor 33 outputs a signal indicating illuminance equal to or higher than the predetermined value.
The stationary object information storage device 200 includes a control unit 210 and a storage unit 220. In the present embodiment, the stationary object information storage device 200 is a computer device that collects and accumulates information transmitted from a plurality of vehicles 2, and is provided, for example, in a data center. The control unit 210 is configured by a processor such as a CPU, for example. The storage unit 220 is configured by, for example, a ROM, a RAM, or the like.
The control unit 210 functions as a transmitting/receiving unit 211, a recording unit 212, and a determining unit 213 by reading a program 221 stored in the storage unit 220. Note that the program 221 may also be recorded in a non-transitory computer-readable medium.
The transmitting/receiving unit 211 transmits and receives information to and from the vehicle ECU10 and the stationary object information acquiring apparatus 100. That is, the transmitting/receiving unit 211 functions as a transmitting unit and a receiving unit. The transmitting/receiving unit 211 receives the stationary object information 123 transmitted from the transmitting/receiving unit 113 and the vehicle position information 124 corresponding to the stationary object information 123. In addition, the transmitting and receiving section 211 may receive the reference image data from the stationary object information acquiring apparatus 100. The transmitting/receiving unit 211 may transmit/receive other information to/from the vehicle ECU10 and the stationary object information acquiring apparatus 100 as necessary.
The recording unit 212 records the stationary object information 123 received by the transmitting/receiving unit 211 and the vehicle position information 124 corresponding to the stationary object information 123 in the stationary object database 222 in association with each other. The recording unit 212 may update the stationary object database 222 based on the determination result of the determination unit 213.
In the stationary object database 222, the vehicle position information 124 and the stationary object information 123 are recorded in association with each other. In the stationary object database 222, for example, a plurality of pieces of stationary object image data are recorded for one shooting position indicated by the vehicle position information 124. In addition, in the stationary object database 222, stationary object image data having the same shooting position may be recorded in association with reference image data. In the stationary object database 222, information such as the position and size of a stationary object, the distance and direction from the shooting position, and the type of the stationary object may be recorded in association with the shooting position.
The determination unit 213 determines whether or not a stationary object is included in the image corresponding to the stationary object image data included in the stationary object information 123, using an algorithm different from the algorithm by which the determination unit 112 determines the stationary object information 123. The algorithm used by the determination unit 213 is preferably, for example, an algorithm with higher stationary object detection accuracy than the algorithm used by the determination unit 112.
The determination unit 213 functions as, for example, a first determination unit that determines whether or not a stationary object is included in an image corresponding to the stationary object image data included in the stationary object information 123, using the image. The determination unit 213 functions as, for example, a second determination unit that determines whether or not the stationary object is included in the image corresponding to the stationary object image data based on the reference image data associated with the stationary object image data determined by the determination unit 112 that the stationary object is present. The determination unit 213 functions as a third determination unit that determines whether or not the stationary object is a self-luminous body based on two or more images captured before and after the switching timing of turning on and off the headlight of the vehicle 2. The determination unit 213 may determine detailed information such as the position and size of the stationary object in the image, the distance from the imaging position of the image to the stationary object, the direction, and the type of the stationary object, using the image corresponding to the image data of the stationary object.
The determination unit 213 may, for example, identify, as first stationary object image data, the stationary object image data having the fewest light spots among the plurality of pieces of stationary object image data recorded for one shooting position, and determine whether or not a stationary object is present at that position based on the first stationary object image data. The determination unit 213 may determine whether or not a stationary object is included in each of the images corresponding to the plurality of pieces of stationary object image data recorded for one shooting position, and make a final determination of whether or not a stationary object is present at that position based on the ratio of the stationary object image data determined to contain a stationary object to the whole of the plurality of pieces of stationary object image data recorded for that position. The determination unit 213 may also determine whether or not a stationary object is included in each of a plurality of images by comparing the plurality of images corresponding to the plurality of pieces of stationary object image data in time series, based on the time information associated with the stationary object image data.
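A minimal sketch of the first idea above (selecting, for one shooting position, the stationary object image data with the fewest light spots) is shown below; the luminance threshold, the spot-counting helper, and the use of scipy are assumptions made for illustration.

```python
from typing import List

import numpy as np
from scipy import ndimage


def count_light_spots(gray_image: np.ndarray, luminance_threshold: int = 200) -> int:
    """Rough spot count: number of connected bright regions above a luminance threshold
    (a stand-in for the light spot detection of step S31)."""
    _, num_spots = ndimage.label(gray_image >= luminance_threshold)
    return num_spots


def select_first_stationary_image(images: List[np.ndarray]) -> np.ndarray:
    """Among the images recorded for one shooting position, pick the one containing the
    fewest light spots; spots from passing vehicles are least likely to appear in it."""
    return min(images, key=count_light_spots)
```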
As another example of the system 1, the stationary object information storage device 200 may be mounted on the vehicle 2. In this case, the control unit 210 and the storage unit 220 may be provided separately from the vehicle ECU10, the control unit 110, the storage unit 20, and the storage unit 120. Alternatively, the control unit 210 may be configured as part of any one or more of the lamp ECU, the vehicle ECU10, and the control unit 110, for example. Some of the functions listed as functions of the control unit 210 may be realized by the vehicle ECU10 or the lamp ECU. The storage unit 220 may be configured as part of at least one of the storage unit 20, the storage unit 120, and a storage device provided in the lamp ECU, for example. When the stationary object information storage device 200 is mounted on the vehicle 2, the stationary object information acquisition device 100 and the stationary object information storage device 200 are configured to be connectable by wireless communication or wired communication.
(Stationary object information storage method)
Next, the stationary object information storage method in the system 1 according to the present embodiment will be described. The stationary object information storage method according to the present embodiment is executed by, for example, the control unit 110 of the stationary object information acquisition device 100, which reads the program 121, and the control unit 210 of the stationary object information storage device 200, which reads the program 221. In the following description, the case where the stationary object information acquisition device 100 determines the stationary object information 123 using an image captured by a visible-light camera is taken as an example, but the present disclosure is not limited to this. The stationary object information acquisition device 100 may determine the stationary object information 123 using, for example, an image output by a millimeter wave radar or a LiDAR.
Fig. 3 is a flowchart showing an example of the stationary object information storage method according to the present embodiment. The order of the processes constituting each flowchart described in the present specification may be changed, and the processes may be executed in parallel, insofar as the processing contents do not contradict each other.
First, in step S10, the control unit 110 acquires image data and the like. Specifically, the control unit 110 acquires the image data of an image captured by the visible-light camera. The control unit 110 also acquires the vehicle position information 124 corresponding to the image data.
In step S10, the control unit 110 preferably acquires one or more of time information indicating a time when the image was captured, lighting information regarding whether or not the headlight of the vehicle 2 was lit when the image was captured, and illuminance information indicating illuminance of the surroundings of the vehicle 2 when the image was captured. By acquiring these pieces of information, the images can be appropriately compared, and as a result, the detection accuracy of the stationary object can be improved.
Here, the visible-light camera is controlled by the vehicle ECU, for example, so as to capture images of the outside of the vehicle 2 at predetermined time intervals. The control unit 110 preferably thins out the image data 122 of the images captured at the predetermined time intervals, acquiring them, for example, at time intervals longer than the capture interval (for example, every 0.1 to 1 second) or at predetermined distance intervals between shooting positions (for example, every 1 to 10 m). By thinning out the image data 122 in this way, an increase in the capacity of the storage unit 120 can be suppressed. Furthermore, since the number of determination processes in step S30, described later, can be reduced, the load on the control unit 110 can be reduced. For example, the control unit 110 may acquire all of the image data 122 of the images captured at the predetermined time intervals, temporarily store them in the storage unit 120, and delete part of the image data 122 at a predetermined timing, such as before the determination process in step S30.
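A minimal sketch of the thinning described above is given below; the interval values, the frame tuple layout, and the distance approximation are illustrative assumptions, not part of the disclosure.

```python
import math
from typing import Iterable, Iterator, Tuple

Frame = Tuple[float, Tuple[float, float], bytes]  # (timestamp_s, (latitude, longitude), image_data)


def thin_by_time(frames: Iterable[Frame], min_interval_s: float = 0.5) -> Iterator[Frame]:
    """Keep only frames spaced at least min_interval_s apart (longer than the capture interval)."""
    last_t = None
    for t, pos, image in frames:
        if last_t is None or (t - last_t) >= min_interval_s:
            last_t = t
            yield (t, pos, image)


def thin_by_distance(frames: Iterable[Frame], min_distance_m: float = 5.0) -> Iterator[Frame]:
    """Keep only frames whose shooting positions are at least min_distance_m apart."""
    last_pos = None
    for t, pos, image in frames:
        if last_pos is None or _ground_distance_m(last_pos, pos) >= min_distance_m:
            last_pos = pos
            yield (t, pos, image)


def _ground_distance_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Approximate ground distance between two latitude/longitude points (equirectangular)."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    return 6371000.0 * math.hypot(x, lat2 - lat1)
```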
The image acquisition unit 111 may also discard image data 122 based on whether or not the image was captured at a place where the vehicle 2 usually travels. Specifically, the image acquisition unit 111 may discard image data 122 of images captured on roads that have been traveled less than a predetermined number of times in a past predetermined period (for example, once a month or less). This is because, even if a stationary object is determined at a place where the vehicle 2 does not usually travel, it is of little benefit to the user of the vehicle 2. In particular, when the stationary object information storage device 200 is mounted on the vehicle 2, it is preferable to discard image data 122 based on the number of times the shooting position has been traveled in the predetermined period.
Next, when the vehicle 2 is in the first state (yes in step S20), the control unit 110 executes a process of determining the stationary object information 123 in step S30. On the other hand, when the vehicle 2 is not in the first state (no in step S20), the control unit 110 waits for the determination process of step S30 to be executed until the vehicle 2 is in the first state.
Here, the "first state" is a state that is considered to be a small processing load of the vehicle ECU10 or the lamp ECU. The "first state" includes, for example, a parking state or a jogging state (for example, traveling at a speed of 10Km or less per hour). When the control unit 110 is configured as a part of the vehicle ECU10 or the lamp ECU, the determination processing in step S30 is executed at the timing when the vehicle 2 is in the first state, which contributes to a reduction in the load on the vehicle ECU10 or the lamp ECU. In the case where the control unit 110 is configured independently of the vehicle ECU10 and the lamp ECU, the determination in step S20 may not be performed.
In step S30, the control section 110 performs a determination process of determining the stationary object information 123 based on the image data 122.
The process of determining the stationary object information 123 in step S30 will be described in detail with reference to fig. 4. Fig. 4 is a flowchart showing an example of the determination processing of the stationary object information 123. In step S31, the control unit 110 detects a light spot in an image. The detection of the light spot can be performed by, for example, luminance analysis of an image, using a conventionally known technique.
In step S32, the control unit 110 performs pattern recognition processing on the image. The pattern recognition method may be a conventionally known method, and for example, a machine learning model may be used to detect a stationary object, or a clustering method may be used to detect a stationary object.
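As an illustration of the light spot detection of step S31 (the pattern recognition of step S32, such as a trained model or clustering, is not sketched here), a simple luminance-threshold approach might look like the following; the threshold and minimum area values are assumptions, not values from the disclosure.

```python
from typing import List, Tuple

import numpy as np
from scipy import ndimage


def detect_light_spots(gray_image: np.ndarray,
                       luminance_threshold: int = 200,
                       min_area_px: int = 4) -> List[Tuple[float, float, int]]:
    """Threshold the luminance image and label connected bright regions.

    Returns one (center_x, center_y, area) tuple per detected light spot.
    """
    mask = gray_image >= luminance_threshold
    labels, n = ndimage.label(mask)
    spots = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_area_px:
            spots.append((float(xs.mean()), float(ys.mean()), int(xs.size)))
    return spots
```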
Next, in step S33, the control unit 110 determines whether or not a stationary object is present in the image based on the result of the processing in step S31 and/or step S32. When it is determined that no stationary object is present in the image (no in step S33), the control unit 110 deletes the image data 122 corresponding to the image from the storage unit 120 in step S35 and ends the process.
When it is determined that a stationary object exists in the image (yes in step S33), the control unit 110 determines a stationary object region or a stationary object position in the image in step S34. By determining the stationary object region and using the data of the portion of the image including the stationary object region as the stationary object image data, the data capacity at the time of transmission to the stationary object information storage device 200 can be reduced. In this case, it is preferable that information indicating the position of the stationary object region in the original image is also determined and included in the stationary object information 123. The stationary object image data may also be data obtained by performing processing that reduces the data amount of the region other than the stationary object region.
The stationary object position is, for example, a position of a stationary object in an image. The stationary object position can be determined using, for example, an arbitrary coordinate system set in the image. The stationary object position may be a position indicating a center point of the stationary object, or a position indicating an outer edge position of the stationary object, for example. Further, the stationary object position preferably includes information on the size determined by using the coordinate system.
Fig. 5 is a schematic diagram for explaining the stationary object position information. In the image shown in fig. 5, the sign O1 and the street lamps O2 to O4 are determined to be stationary objects. In this case, for example, the positions of the zones Z1 to Z4 including the sign O1 and the street lamps O2 to O4 may be defined as the stationary object position information by using coordinates defined by the x-axis and the y-axis. The method of setting the coordinates is not particularly limited; for example, the center of the image may be set as the origin. In the example of fig. 5, the pillar portions of the sign O1 and the street lamps O2 to O4 are not included in the zones Z1 to Z4, but zones including these pillar portions may also be set as stationary object positions.
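A small sketch of expressing a zone such as Z1 to Z4 in a coordinate system whose origin is the image center, as suggested above, is given below; the axis orientation (x to the right, y upward) and the pixel bounding-box input format are assumptions made for illustration.

```python
from typing import Tuple

Zone = Tuple[float, float, float, float]  # (center_x, center_y, width, height), origin at image center


def bbox_to_centered_zone(bbox_px: Tuple[int, int, int, int],
                          image_size_px: Tuple[int, int]) -> Zone:
    """Convert a pixel bounding box (left, top, width, height) into zone coordinates
    whose origin is the image center, with y increasing upward."""
    left, top, w, h = bbox_px
    img_w, img_h = image_size_px
    center_x = (left + w / 2.0) - img_w / 2.0
    center_y = img_h / 2.0 - (top + h / 2.0)
    return (center_x, center_y, float(w), float(h))
```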
The stationary object position determined in step S34 may be a position indicating a distance or direction from the image capturing position to the stationary object. For example, when the image data 122 includes depth information, the distance and direction from the imaging position to the stationary object may be calculated using the depth information. The calculation may be performed by comparing the image data with other image data 122 captured in the vicinity of the imaging position, or may be performed using data acquired from a millimeter wave radar or LiDAR.
When the stationary object position is determined, the image data 122 may be deleted from the storage unit 120, or may be included in the stationary object information 123 in association with the stationary object position information. After step S34, the process advances to step S40 in fig. 3.
When the determination processing shown in fig. 3 is executed based on the image data 122 captured during the daytime when the illuminance is equal to or higher than the predetermined value, the outline of the structure in the image is easily grasped, and the color information of the structure is easily acquired from the image, so that the detection accuracy of the stationary object by the pattern recognition processing can be improved.
The process of determining the stationary object information 123 may be performed by comparing a plurality of image data 122 captured at the same location or at locations close to each other. In step S34, the control unit 110 may determine the type of the stationary object based on the result of step S31 and/or step S32, and include the type information in the stationary object information 123. The determination processing in step S30 is not limited to the above example, and for example, the same method as each example of the determination processing in step S80 described later may be used. However, even if the same method is used, the algorithm for the determination processing in step S30 is different from the algorithm for the determination processing in step S80.
The description returns to fig. 3. When the vehicle 2 is in the second state (yes in step S40), the control unit 110 transmits the stationary object information 123 and the vehicle position information 124 corresponding to the stationary object information 123 to the stationary object information storage device 200 including the storage unit 220 in step S50. In step S50, time information, lighting information, illuminance information, and the like may be transmitted together. On the other hand, if the vehicle 2 is not in the second state (no in step S40), the control unit 110 waits for the transmission processing of step S50 to be executed until the vehicle 2 is in the second state.
Here, the "second state" refers to a state in which the processing load of the vehicle ECU10 or the lamp ECU is considered to be small. The "second state" includes, for example, a parking state or a jogging state (for example, traveling at a speed of 10Km or less). When the control unit 110 is configured as a part of the vehicle ECU10 or the lamp ECU, the transmission process of step S50 is executed at the timing when the vehicle 2 is in the second state, thereby contributing to a reduction in the load on the vehicle ECU10 or the lamp ECU. In the case where the control unit 110 is configured independently of the vehicle ECU10 and the lamp ECU, the determination in step S40 may not be performed.
The stationary object information 123 transmitted in step S50 may be the stationary object image data of the image determined to contain a stationary object, the stationary object position information calculated from the image, or both. When the stationary object image data is included in the transmitted stationary object information 123, the stationary object image data can be further examined in the stationary object information storage device 200 to obtain information with higher accuracy. On the other hand, when the stationary object image data is not included in the transmitted stationary object information 123, there is an advantage in that the volume of the transmitted data becomes small.
Next, in step S60, the control unit 210 receives the various information transmitted in step S50. Next, in step S70, the control unit 210 records the stationary object information 123 received in step S60 and the vehicle position information 124 corresponding to the stationary object information 123 in the stationary object database 222 in association with each other.
Figs. 6 and 7 are examples of the stationary object database 222. In the example of fig. 6, a plurality of pieces of stationary object image data and reference image data are recorded in the stationary object database 222 in association with the shooting position. As the shooting position, the latitude and longitude of the shooting position and the orientation of the vehicle 2 at the time of shooting (the orientation of the visible-light camera) are recorded. For each piece of stationary object image data, an ID for identifying the stationary object image data, time information, illuminance information, and lighting information are recorded. An ID for identification is recorded for the reference image data. The reference image data may also contain the same information as the stationary object image data. Stationary object image data whose illuminance indicated by the illuminance information is equal to or greater than the predetermined value may be treated as reference image data.
In the example of fig. 7, a plurality of pieces of stationary object position information are recorded in the stationary object database 222 in association with the shooting position. As the stationary object position information, the position, size, and height of the stationary object, the distance and direction from the shooting position, and the type of the stationary object are recorded. The stationary object position information is information determined by the determination process in step S30 or the determination process in step S80 described later. When only one stationary object has been determined at a certain shooting position, a single piece of stationary object position information may be associated with that shooting position.
Figs. 6 and 7 show examples of the information recorded in the stationary object database 222; some of this information may be omitted, or other information may be included. From the viewpoint of improving the accuracy of the determination process in step S80 and the usefulness of the stationary object database, it is preferable that the stationary object database 222 contains both the stationary object image data and the stationary object position information.
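For illustration, records along the lines of figs. 6 and 7 could be represented as follows; every class and field name here is a hypothetical schema chosen to mirror the figures, not the actual database layout.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ShootingPosition:
    latitude: float
    longitude: float
    heading_deg: float               # orientation of the vehicle (camera) at the time of capture


@dataclass
class StoredImageRecord:
    """One row in the fig. 6 style table: image data identified by an ID plus capture metadata."""
    image_id: str
    captured_at: str                 # time information
    illuminance_lx: Optional[float]  # illuminance information
    headlamp_on: Optional[bool]      # lighting information
    is_reference: bool = False       # daytime images may double as reference image data


@dataclass
class StoredObjectRecord:
    """One row in the fig. 7 style table: one determined stationary object at this position."""
    position_in_image: str           # e.g. zone coordinates
    size: str
    height: str
    distance_m: Optional[float]      # distance from the shooting position
    bearing_deg: Optional[float]     # direction from the shooting position
    object_type: Optional[str]       # e.g. "street lamp", "sign"


@dataclass
class StationaryDatabaseEntry:
    """All information recorded for one shooting position."""
    shooting_position: ShootingPosition
    images: List[StoredImageRecord] = field(default_factory=list)
    objects: List[StoredObjectRecord] = field(default_factory=list)
```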
The stationary object database 222 may be managed as a separate database for each vehicle 2, for example. In this case, the information accumulated in one database is based on the information transmitted from one vehicle 2. The stationary object database 222 may be managed as a database of the entire plurality of vehicles 2, for example. In this case, a plurality of pieces of information transmitted from a plurality of vehicles 2 are collected in one database.
The stationary object database 222 may be managed as a database for each model of the vehicle 2, for example. In this case, a plurality of pieces of information transmitted from a plurality of vehicles 2 of the same vehicle type are collected in one database. When managed as a database for each vehicle type, in the determination process of step S80 described later, determination with higher accuracy can be performed in consideration of the vehicle height of the vehicle type, the position of the sensor unit 31, and the like. In addition, when providing a service using the stationary object database 222, it is easy to provide a service that is optimized for the vehicle type. The stationary object information storage device 200 may be configured to receive the model information of the vehicle 2 when receiving the stationary object information 123 and the like, and to record the model information in association with the stationary object image data and the like in the stationary object database 222.
The description returns to fig. 3. In step S80, the control unit 210 executes a determination process of determining whether or not the stationary object is included in the image corresponding to the stationary object image data by an algorithm different from the determination process in step S30, and ends. The determination processing in step S80 is processing for confirming again whether or not a stationary object is present in the image data 122 for which the stationary object information 123 is determined, to improve the information accuracy of the stationary object database 222, and/or for determining the stationary object position information and other detailed information. In step S80, a method of detecting a light spot and performing pattern recognition processing as described in the description of step S30 may be used as a part of the processing.
Hereinafter, a method of comparing a plurality of images will be described as a specific example of the determination process in step S80. Fig. 8 is a schematic diagram showing the image-capturing timings and the images 122A to 122D acquired at the respective timings. In the example of fig. 8, the visible-light camera captures images of the area in front of the vehicle 2 at times T1, T2, T3, and T4, and outputs the image data 122 of the images 122A to 122D. The intervals F1 to F4 between the capture timings are equal; that is, the images 122A to 122D are images captured at regular time intervals. The headlamp is turned on at times T1, T3, and T4, and turned off at time T2. Between time T1 and time T4, the vehicle 2 travels forward at a predetermined speed.
Whether or not a stationary object present in the image corresponding to the image data 122 is a self-luminous body can be determined based on, for example, the image data 122 of at least two images captured before and after a switching timing at which the headlamp mounted on the vehicle 2 is turned on or off. In fig. 8, the light spots LP1 and LP2 are detected in the image 122A. On the other hand, in the image 122B, the light spot LP1 is detected, but no light spot is detected at the position where the light spot LP2 would be expected (the position indicated by the broken line). In the image 122C, the light spot LP2 is detected again. It can therefore be determined that the light spot LP2 is detected only because it reflects light from the headlamp and is not a light spot formed by a self-luminous body. Conversely, the light spot LP1, which is also detected in the image 122B captured while the headlamp is off, can be determined to be a light spot formed by a self-luminous body.
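A minimal sketch of the on/off comparison described above is given below; the spot representation, the position tolerance, and the omission of motion compensation between frames are simplifying assumptions.

```python
from typing import Sequence, Tuple

Spot = Tuple[float, float]  # (x, y) spot center in image coordinates


def _has_spot_near(spots: Sequence[Spot], where: Spot, tolerance_px: float = 10.0) -> bool:
    """True if any detected spot lies within tolerance_px of the given position."""
    return any(abs(x - where[0]) <= tolerance_px and abs(y - where[1]) <= tolerance_px
               for x, y in spots)


def is_self_luminous(spot: Spot,
                     spots_with_headlamp_on: Sequence[Spot],
                     spots_with_headlamp_off: Sequence[Spot]) -> bool:
    """A spot seen both with the headlamp on and off (LP1 in fig. 8) emits its own light;
    a spot that disappears when the headlamp is off (LP2) is only a reflection.

    The expected shift of the spot between the two frames due to vehicle motion is
    ignored here; a real implementation would compensate for it first.
    """
    return (_has_spot_near(spots_with_headlamp_on, spot)
            and _has_spot_near(spots_with_headlamp_off, spot))
```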
The determination process based on the comparison of a plurality of images can also be performed by comparing images captured at the same position or at positions close to each other. In fig. 8, the image 122C' is an image captured at the same position as the image 122C, before the image 122C. In the image 122C, the light spots LP3 and LP4 are detected in addition to the light spots LP1 and LP2. On the other hand, in the image 122C', the light spots LP1 and LP2 are detected, but the light spots LP3 and LP4 are not. If the light spots LP3 and LP4 were stationary objects, they should also be detected in the image 122C', but they are not. It can therefore be determined that the light spots LP3 and LP4 are not stationary objects, while the light spots LP1 and LP2, which are detected at the same positions in both the image 122C and the image 122C', can be determined to be light spots generated by stationary objects.
When a plurality of images captured at the same point are available, the light spots detected in each image may include not only light spots generated by stationary objects but also light spots generated by moving bodies such as vehicles. For example, the positions of the detected light spots are compared among the plurality of images; light spots whose positions, or whose relative positions with respect to other light spots, are unchanged are determined to be generated by stationary objects, while light spots whose positions change greatly are determined to be generated by moving bodies. In this way, stationary objects can also be identified.
In other words, a light spot estimated to be a stationary object is first determined from the image 122C' captured at a certain position. When the same light spot is also detected in the image 122C, which is captured at the same position when the vehicle 2 passes through that position again after the image 122C' was captured, or when another vehicle 2 passes through that position, the light spot can be regarded as a stationary object and the stationary object information 123 can be determined accordingly.
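The same-position comparison of the images 122C and 122C' can be sketched in a similar illustrative manner; the coordinates, the tolerance tol, and the helper name stationary_candidates are assumptions of this example and not part of the embodiment.

```python
# Illustrative sketch only: keeps the light spots that appear at roughly the
# same position in two images captured at the same point (e.g. 122C and 122C'),
# and discards spots present in only one of them (e.g. LP3, LP4 from vehicles).

from math import hypot

def stationary_candidates(spots_first_pass, spots_second_pass, tol=5.0):
    matched = []
    for spot in spots_second_pass:
        if any(hypot(spot[0] - p[0], spot[1] - p[1]) <= tol for p in spots_first_pass):
            matched.append(spot)   # seen both times at the same place -> stationary
    return matched

spots_122c_prime = [(120, 80), (340, 95)]                    # LP1, LP2
spots_122c = [(120, 80), (340, 95), (200, 150), (230, 150)]  # LP1..LP4
print(stationary_candidates(spots_122c_prime, spots_122c))
# -> [(120, 80), (340, 95)]  (LP3 and LP4 are rejected)
```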
Since other vehicles traveling at night have their lamps turned on, light spots generated by the lamps of other vehicles are easily picked up in images captured at night. Therefore, from the viewpoint of improving the detection accuracy for stationary objects, an image captured at night is preferably compared with another image captured at the same position, and more preferably with another image captured at the same position in the daytime.
The determination process based on the comparison of a plurality of image data 122 may also be performed by comparing the plurality of image data 122 along the time series of the captured images, based on the movement amount of each light spot between the images and the traveling speed of the vehicle 2. For example, in fig. 8, the movement amounts of the light spots LP3 and LP4 between the images 122C and 122D are larger than those of the light spots LP1 and LP2. When the movement amounts of the light spots LP3 and LP4 are larger than the movement amount estimated from the traveling speed of the vehicle 2, the light spots LP3 and LP4 are considered to be moving toward the vehicle 2, and it can be determined that they are light spots generated by moving bodies. When the movement amounts of the light spots LP1 and LP2 are equal to the movement amount estimated from the traveling speed of the vehicle 2, it can be determined that they are light spots generated by stationary objects. If, conversely, the movement amount of a light spot is smaller than the movement amount estimated from the traveling speed of the vehicle 2, the light spot is considered to be moving in the same direction as the vehicle 2, and it can be determined to be a light spot generated by a moving body.
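A minimal sketch of this motion-based classification is given below. The conversion from vehicle speed and frame interval to an expected pixel displacement depends on camera geometry and target distance, so it is reduced here to an assumed expected value; the relative tolerance and the function name classify_by_motion are likewise assumptions.

```python
# Illustrative sketch only. Compares the observed image-plane displacement of a
# tracked light spot between consecutive frames with the displacement expected
# for a stationary object given the vehicle speed.

def classify_by_motion(observed_px, expected_px, rel_tol=0.3):
    """Classify one spot track: 'stationary', 'oncoming', or 'same_direction'."""
    if abs(observed_px - expected_px) <= rel_tol * expected_px:
        return "stationary"          # matches the motion induced by the ego vehicle
    if observed_px > expected_px:
        return "oncoming"            # e.g. LP3, LP4: apparent motion too large
    return "same_direction"          # e.g. a preceding vehicle moving with the ego vehicle

# Frame interval and vehicle speed give some expected shift (value assumed):
expected_shift_px = 12.0
for name, shift in [("LP1", 11.0), ("LP2", 13.0), ("LP3", 40.0), ("LP4", 38.0)]:
    print(name, classify_by_motion(shift, expected_shift_px))
```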
The determination process in step S80 may also be performed using a statistical method. For example, whether or not a stationary object is included may be determined for each of the images corresponding to a plurality of stationary object image data recorded in association with a certain imaging position, and a final determination of whether or not a stationary object is present at that imaging position may be made based on the ratio of the stationary object image data determined to include a stationary object to the entire plurality of stationary object image data. The stationary object position information recorded in association with a certain imaging position can likewise be determined by the same statistical method as for the stationary object image data.
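For illustration only, the statistical decision can be reduced to a ratio test over the per-image determinations; the 0.8 threshold and the function name finalize_stationary used below are assumed values for this sketch, not values taken from the text.

```python
# Illustrative sketch only: final decision for one imaging position based on the
# fraction of recorded stationary object image data in which a stationary object
# was (re)detected.

def finalize_stationary(decisions, threshold=0.8):
    """decisions: list of booleans, one per stored image for this position."""
    if not decisions:
        return False
    ratio = sum(decisions) / len(decisions)
    return ratio >= threshold

print(finalize_stationary([True, True, True, False, True]))  # -> True (4/5 = 0.8)
```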
Next, as an example of the determination process in step S80, a case where a reference image is used will be described. Fig. 9 is a flowchart showing an example of the determination process in step S80. In step S181, the control unit 210 acquires reference image data of a reference image captured at the same position as the target image to be determined. The reference image is, for example, an image captured by the visual camera of the vehicle 2 when the vehicle 2 again passes through the capturing position of the image corresponding to the image data 122 for which the stationary object information 123 has been determined, at a time when the illuminance sensor 33 outputs a signal indicating an illuminance equal to or higher than a predetermined value. Alternatively, the reference image may be another image captured at the same position as the target image whose illuminance, as indicated by its illuminance information, is equal to or higher than the predetermined value.
Next, in step S182, the control unit 210 determines the position of the stationary object in the reference image indicated by the reference image data. The stationary object position determination in step S182 may be performed by the same method as in steps S31 to S34, for example.
Next, in step S183, the control unit 210 determines whether or not the stationary object position in the reference image matches the stationary object position in the target image for which the stationary object information 123 was determined. If the positions match (yes in step S183), the control unit 210 determines that the determined stationary object information 123 is correct and ends the process.
On the other hand, if the positions do not match (no in step S183), the control unit 210 updates the stationary object information 123 in the stationary object database 222 in step S184 and then ends the process. In step S184, for example, a stationary object position that coincides between the reference image and the target image is determined to be correct, while a stationary object position that does not coincide between the two images is regarded as erroneous, and the stationary object database 222 is updated accordingly.
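The flow of steps S181 to S184 can be summarized by the following illustrative sketch, in which stationary object positions found in the daytime reference image are used to confirm or reject the positions recorded for the target image. The matching tolerance, the coordinate values, and the helper names close and verify_with_reference are assumptions of this example.

```python
# Illustrative sketch only of the flow of fig. 9 (steps S181 to S184): positions
# found in both the reference image and the target image are kept, positions
# found in only one image are treated as erroneous and removed.

from math import hypot

def close(a, b, tol=5.0):
    return hypot(a[0] - b[0], a[1] - b[1]) <= tol

def verify_with_reference(target_positions, reference_positions, tol=5.0):
    confirmed = [p for p in target_positions
                 if any(close(p, r, tol) for r in reference_positions)]
    rejected = [p for p in target_positions if p not in confirmed]
    return confirmed, rejected

# Positions corresponding roughly to fig. 10 / fig. 11: the sign O1 and street
# lamps O2-O4 appear in both images; BL3/BL4 and HL3/HL4 only in the night image.
reference = [(50, 60), (150, 40), (250, 40), (350, 40)]
target = reference + [(200, 120), (220, 120), (300, 130), (320, 130)]
confirmed, rejected = verify_with_reference(target, reference)
print(len(confirmed), "confirmed,", len(rejected), "rejected")  # 4 confirmed, 4 rejected
```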
Fig. 10 is a schematic diagram showing an example of the reference image 122E used in the determination process shown in fig. 9. Fig. 11 is a schematic diagram showing an example of the target image 122F used in the determination process shown in fig. 9. In this example, the reference image 122E is an image captured during the daytime, and the target image 122F is an image captured during the nighttime.
In the reference image 122E, the sign O1 and the street lamps O2 to O4 are determined to be stationary objects by the processing of step S182. The other vehicle C1, which is a preceding vehicle, is stopped with its hazard lamps on, and the rear lamps BL1 and BL2 are lit. As a result, the rear lamps BL1 and BL2 are also erroneously detected as stationary objects by the process of step S30. In addition, since it is daytime, the other vehicle C2, which is an oncoming vehicle, has its head lamps HL1 and HL2 turned off. Therefore, the head lamps HL1 and HL2 are not determined to be stationary objects.
In the target image 122F, the sign O1 and the street lamps O2 to O4 are specified as stationary objects, and their respective surroundings are specified as stationary object regions Z11 to Z14, by the processing of step S30. Since it is nighttime, the other vehicle C1, which is a preceding vehicle, has its tail lamps on, and the rear lamps BL3 and BL4 are lit. As a result, the rear lamps BL3 and BL4 are also erroneously detected as stationary objects by the process of step S30. Similarly, the other vehicle C4, which is an oncoming vehicle, has its head lamps HL3 and HL4 on, and the head lamps HL3 and HL4 are erroneously detected as stationary objects. Although not shown, the surroundings of the rear lamps BL3 and BL4 and of the head lamps HL3 and HL4 are also specified as stationary object regions.
The sign O1 and the street lamps O2 to O4 are present at the same positions in both the reference image 122E and the target image 122F. Therefore, the sign O1 and the street lamps O2 to O4 are determined to be stationary objects. On the other hand, the rear lamps BL1 to BL4 and the head lamps HL3 and HL4 are each present in only one of the images. Therefore, it is determined that the rear lamps BL1 to BL4 and the head lamps HL3 and HL4 are not stationary objects. As a result, in step S184, the stationary object information 123 is updated on the basis that the rear lamps BL3 and BL4 and the head lamps HL3 and HL4 are not stationary objects.
(Another example of the determination process)
Next, as another example of the determination process in step S80, a case will be described in which the image having the fewest light spots among a plurality of images captured at the same position is used. Fig. 12 is a flowchart showing another example of the determination process in step S80. In step S81, the control unit 210 determines an imaging position to be processed and acquires, from the stationary object database 222, a plurality of stationary object image data captured at that imaging position.
Next, in step S82, the control unit 210 selects the image having the fewest light spots among the images corresponding to the plurality of stationary object image data acquired in step S81. In the example of fig. 8, the image 122C' is selected from among the image 122C and the image 122C'. The detection of light spots in each image may be performed by the same method as in step S31.
Next, in step S83, the control unit 210 determines whether or not a stationary object is present in the image selected in step S82. For example, conventionally known methods such as pattern recognition processing can be used for determining whether or not a stationary object is present.
When it is determined that no stationary object is present in the image (no in step S83), the control unit 210 deletes, in step S86, the image data 122 corresponding to the image from the stationary object database 222 and then ends the process. The other stationary object image data acquired in step S81 may also be deleted.
When it is determined that a stationary object is present in the image (yes in step S83), the control unit 210 determines a stationary object region or a stationary object position in the image in step S84. By determining the stationary object region, the data of the portion of the image including the stationary object region can be used as the stationary object image data, which reduces the data size. In this case, information indicating the position of the stationary object region within the original image is preferably also determined. The stationary object image data may alternatively be data obtained by reducing the data amount of the region other than the stationary object region. The stationary object position can be determined in the same manner as described for step S34, and a detailed description thereof is omitted.
Next, in step S85, the control unit 210 updates the stationary object database 222 so as to include the content determined in step S84 and then ends the process. The position of a light spot generated by a stationary object does not change even if the imaging times differ, and a light spot that is absent from the other images can be presumed to be generated by a moving body. Therefore, compared with determining the stationary object position from each of the plurality of images as in the method described with reference to fig. 9, this method can reduce the processing load.
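An illustrative sketch of the flow of fig. 12 (steps S81 to S86) is given below. The helpers detect_spots() and contains_stationary_object() stand in for the light-spot detection of step S31 and the pattern recognition of step S83, and, like the data layout, are assumptions of this example rather than part of the embodiment.

```python
# Illustrative sketch only of the flow of fig. 12: among the images stored for
# one imaging position, the one with the fewest detected light spots is selected
# and the stationary object determination is run on that single image.

def detect_spots(image):
    return image["spots"]                        # placeholder detector (step S31)

def contains_stationary_object(image):
    return bool(image.get("stationary_region"))  # placeholder recognizer (step S83)

def process_position(images_for_position, database, position):
    # Step S82: pick the image with the fewest light spots (e.g. 122C').
    best = min(images_for_position, key=lambda img: len(detect_spots(img)))
    if not contains_stationary_object(best):     # Step S83 -> no
        database.pop(position, None)             # Step S86: delete the entry
        return None
    region = best["stationary_region"]           # Step S84: keep only this region
    database[position] = region                  # Step S85: update the database
    return region

db = {"pos_A": None}
imgs = [
    {"spots": [1, 2, 3, 4], "stationary_region": (10, 10, 60, 60)},  # like 122C
    {"spots": [1, 2], "stationary_region": (10, 10, 60, 60)},        # like 122C'
]
print(process_position(imgs, db, "pos_A"))
```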
The present invention is not limited to the above-described embodiments, and can be modified or improved as appropriate. In addition, the materials, shapes, sizes, numerical values, forms, numbers, arrangement locations, and the like of the respective constituent elements in the above-described embodiments are arbitrary as long as the present invention can be realized, and are not limited thereto.
The present application is based on Japanese patent application No. 2021-117824 filed on July 16, 2021 and Japanese patent application No. 2021-117828 filed on July 16, 2021, the contents of which are incorporated herein by reference.

Claims (17)

1. A stationary object information accumulation system is provided with a stationary object information acquisition device mounted on a vehicle and a stationary object information storage device disposed outside the vehicle,
The stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device by wireless communication,
the stationary object information storage device includes:
a receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording unit that records the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other.
2. A stationary object information accumulation system is provided with a stationary object information acquisition device mounted on a vehicle and a stationary object information storage device mounted on the vehicle,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device by wireless communication or wired communication,
the stationary object information storage device includes:
A receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording unit that records the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other.
3. The stationary object information accumulation system according to claim 1 or 2, wherein,
the transmitting unit is capable of transmitting the stationary object image data to the stationary object information storage device,
the stationary object information storage device further includes a first determination unit that determines whether or not the stationary object is included in an image corresponding to the stationary object image data by an algorithm different from an algorithm by which the determination unit determines the stationary object information.
4. The stationary object information accumulation system according to claim 3, wherein,
the first determination unit deletes the stationary object image data and the vehicle position information associated with the stationary object image data from the stationary object database when it determines that the image corresponding to the stationary object image data does not include the stationary object.
5. The stationary object information accumulation system according to claim 3 or 4, wherein,
the first determination unit is further capable of calculating stationary object position information indicating a position of the stationary object based on the stationary object image data,
the recording unit updates the stationary object database based on the stationary object position information calculated by the first determination unit when the stationary object position information is not recorded in the stationary object database or when the stationary object position information recorded in the stationary object database is different from the stationary object position information calculated by the first determination unit.
6. The stationary object information accumulation system according to any one of claims 1 to 5, wherein,
when the vehicle is at the position indicated by the vehicle position information at which the image corresponding to the image data for which the stationary object information is specified was captured, and when an illuminance sensor mounted on the vehicle for detecting illuminance around the vehicle outputs a signal indicating an illuminance equal to or higher than a predetermined value, the transmitting unit transmits the image data of the image captured by the sensor unit as reference image data to the stationary object information storage device,
the recording unit records the reference image data and the stationary object image data captured at the same position as the reference image data in the stationary object database in association with each other.
7. The stationary object information accumulation system according to claim 6, wherein,
the stationary object information storage device further includes a second determination unit that determines whether or not the stationary object is included in an image corresponding to the stationary object image data based on the reference image data associated with the stationary object image data.
8. The stationary object information accumulation system according to any one of claims 1 to 7, wherein,
the transmitting unit may transmit image data of at least two images captured by the sensor unit before and after a switching timing of turning on and off a headlight mounted on the vehicle to the stationary object information storage device,
the stationary object information storage device further includes a third determination unit that determines whether or not a stationary object present in an image corresponding to the image data is a self-luminous body, based on the image data of each of the at least two images.
9. A program to be executed in a computer device having a processor and capable of being communicatively connected to a stationary object information acquisition device mounted on a vehicle,
The stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device by wireless communication,
the program causes the processor to perform the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording step of recording the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other.
10. A stationary object information storage method executed in a computer device having a processor and capable of communication connection with a stationary object information acquisition device mounted on a vehicle, wherein,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on the vehicle;
a determination unit that determines, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device by wireless communication,
The stationary object information storage method includes causing the processor to execute the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit; and
and a recording step of recording the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other.
11. A stationary object information accumulation system is provided with a stationary object information acquisition device mounted on a vehicle, and a stationary object information storage device capable of being connected in communication with the stationary object information acquisition device,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present; and
A transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device,
the stationary object information storage device includes:
a receiving unit that receives the stationary object information and the vehicle position information transmitted from the transmitting unit;
a recording unit that records the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other; and
and a determination unit that determines whether or not the stationary object is included in the image corresponding to the stationary object image data by an algorithm different from the algorithm by which the determination unit determines the stationary object information.
12. The stationary object information accumulation system according to claim 11, wherein,
in the stationary object database, a plurality of stationary object image data can be recorded for one position indicated by the vehicle position information,
the determination unit determines, as first stationary object image data, the stationary object image data having the fewest light spots among the plurality of stationary object image data recorded for the one position, and determines whether or not a stationary object is present at the one position based on the first stationary object image data.
13. The stationary object information accumulation system according to claim 11, wherein,
in the stationary object database, a plurality of stationary object image data can be recorded for one position indicated by the vehicle position information,
the determination unit determines whether or not the stationary object is included in each of the images corresponding to the plurality of stationary object image data recorded for the one position, and determines whether or not the stationary object is present at the one position based on a ratio of the stationary object image data determined to include the stationary object to the entire plurality of stationary object image data recorded for the one position.
14. The stationary object information accumulation system according to any one of claims 11 to 13, wherein,
the stationary object information includes time information on a time at which an image corresponding to the stationary object image data was captured,
the determination unit compares a plurality of images corresponding to a plurality of the stationary object image data along a time series based on the time information, and determines whether or not each of the plurality of images includes the stationary object.
15. The stationary object information accumulation system according to any one of claims 11 to 14, wherein,
the recording unit updates the stationary object database based on the determination result of the determining unit.
16. A program to be executed in a computer device having a processor and capable of being communicatively connected to a stationary object information acquisition device mounted on a vehicle,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present; and
a transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device,
The program causes the processor to perform the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit;
a recording step of recording the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other; and
and a determination step of determining whether or not the stationary object is included in the image corresponding to the stationary object image data by an algorithm different from the algorithm by which the determination unit determines the stationary object information.
17. A stationary object information storage method executed in a computer device having a processor and capable of communication connection with a stationary object information acquisition device mounted on a vehicle, wherein,
the stationary object information acquisition device includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a determination unit that determines, based on the image data, stationary object information including stationary object image data corresponding to an image, or a part of the image, in which a stationary object that is one or more of a self-luminous body, a sign, a delineator, and a guardrail is present; and
A transmitting unit that transmits the stationary object information and the vehicle position information of the vehicle acquired from a position information acquiring unit mounted on the vehicle, that is, the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured, to the stationary object information storage device,
the stationary object information storage method includes causing the processor to execute the steps of:
a receiving step of receiving the stationary object information and the vehicle position information transmitted from the transmitting unit;
a recording step of recording the stationary object information and the vehicle position information when an image corresponding to the image data for which the stationary object information is specified is captured in a stationary object database in association with each other; and
and a determination step of determining whether or not the stationary object is included in the image corresponding to the stationary object image data by an algorithm different from the algorithm by which the determination unit determines the stationary object information.

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2021117828 2021-07-16
JP2021-117824 2021-07-16
JP2021-117828 2021-07-16
PCT/JP2022/027594 WO2023286811A1 (en) 2021-07-16 2022-07-13 Stationary object information collection system, program, and stationary object information storage method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination