CN117716405A - Stationary object information acquisition device, program, and stationary object information acquisition method - Google Patents

Stationary object information acquisition device, program, and stationary object information acquisition method

Info

Publication number: CN117716405A
Application number: CN202280050074.9A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: stationary object, vehicle, image, image data, information
Inventors: 神谷美纱子, 片冈拓弥
Current Assignee / Original Assignee: Koito Manufacturing Co Ltd
Application filed by Koito Manufacturing Co Ltd
Legal status: Pending

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00: Image analysis
                    • G06T 7/70: Determining position or orientation of objects or cameras
        • G08: SIGNALLING
            • G08G: TRAFFIC CONTROL SYSTEMS
                • G08G 1/00: Traffic control systems for road vehicles
                    • G08G 1/01: Detecting movement of traffic to be counted or controlled
                        • G08G 1/04: Detecting movement of traffic using optical or ultrasonic detectors
                    • G08G 1/16: Anti-collision systems
        • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16Y: INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
                • G16Y 10/00: Economic sectors
                    • G16Y 10/40: Transportation
                • G16Y 40/00: IoT characterised by the purpose of the information processing
                    • G16Y 40/10: Detection; Monitoring

Abstract

A stationary object information acquisition device (100) mounted on a vehicle (2) includes: an image acquisition unit (111) that acquires image data (122) of an image captured by a sensor unit (31) mounted on the vehicle (2); a specifying unit (112) that specifies, based on the image data (122), stationary object information (123) including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object is present, the stationary object being one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating the position of the stationary object calculated based on the image data (122); and a transmitting unit (113) that transmits, to a storage unit (220), the stationary object information (123) and vehicle position information (124) of the vehicle (2) acquired from a position information acquisition unit (32) mounted on the vehicle (2), namely the vehicle position information (124) at the time when the image corresponding to the image data (122) from which the stationary object information (123) was specified was captured.

Description

Stationary object information acquisition device, program, and stationary object information acquisition method
Technical Field
The present disclosure relates to a stationary object information acquisition device, a program, and a stationary object information acquisition method.
Background
In recent years, ADB (Adaptive Driving Beam) technology has been proposed that, based on the situation around a vehicle, shades or dims the light directed at the position of a preceding vehicle or at the position of a highly reflective object such as a sign. For example, Patent Document 1 describes detecting a preceding vehicle and controlling the light distribution of the headlamps.
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Laid-Open Publication No. 2011-246023
Disclosure of Invention
Problems to be solved by the invention
Generally, ADB light distribution control is performed based on target object information sent from the vehicle. Each target object is detected by a specific algorithm from data acquired by a sensor such as a camera; however, depending on the accuracy of the data or of the detection algorithm, an object that is present may go undetected (missed detection), or an object that is not present may be detected (false detection).
Incidentally, if high-brightness stationary objects such as street lamps and signs are present along the road, these stationary objects may be erroneously recognized as preceding vehicles. Conversely, the headlights of a vehicle ahead may be erroneously recognized as a street lamp or the like. If information on stationary objects such as street lamps and signs on roads can be collected, it can be used to reduce the possibility of such erroneous recognition, which is advantageous.
An object of the present disclosure is to collect stationary object information on stationary objects such as street lamps and signs on roads.
Means for solving the problems
A stationary object information acquisition device according to an aspect of the present disclosure is mounted on a vehicle, and includes:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a specifying unit that specifies, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object is present, the stationary object being one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits, to a storage unit, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was specified was captured.
A program according to an embodiment of the present disclosure is a program executed in a computer device provided with a processor and mounted on a vehicle,
The program causes the processor to perform the steps of:
an image acquisition step of acquiring image data of an image captured by a sensor unit mounted on the vehicle;
a specifying step of specifying, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object is present, the stationary object being one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmission step of transmitting, to a storage unit, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was specified was captured.
A stationary object information acquisition method according to an embodiment of the present disclosure is a stationary object information acquisition method executed in a computer device equipped with a processor and mounted on a vehicle,
The stationary object information acquisition method includes causing the processor to execute the steps of:
an image acquisition step of acquiring image data of an image captured by a sensor unit mounted on the vehicle;
a specifying step of specifying, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object is present, the stationary object being one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmission step of transmitting, to a storage unit, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was specified was captured.
Effects of the invention
According to the present disclosure, stationary object information of stationary objects such as street lamps, signs, and the like on a road can be collected.
Drawings
Fig. 1 is a schematic diagram showing an example of a system including a stationary object information acquisition device according to an embodiment of the present disclosure.
Fig. 2 is a block diagram showing an example of a system including a stationary object information acquisition device according to an embodiment of the present disclosure.
Fig. 3 is a flowchart showing an example of a stationary object information acquisition method according to an embodiment of the present disclosure.
Fig. 4 is a flowchart showing an example of the process of determining the stationary object information shown in fig. 3.
Fig. 5 is a schematic diagram for explaining the stationary object position information.
Fig. 6 is a schematic diagram showing the imaging timings and the image data acquired at each imaging timing.
Fig. 7 is a flowchart showing an example of the process of determining the stationary object information.
Fig. 8 is a schematic diagram showing an example of reference image data used in the process of determining the stationary object information.
Fig. 9 is a schematic diagram showing an example of image data used in the determination processing of the stationary object information.
Detailed Description
The present invention will be described below based on embodiments with reference to the drawings. Identical or equivalent components, members, and processes shown in the drawings are denoted by the same reference numerals, and redundant description is omitted as appropriate. The embodiments do not limit the invention; not all of the features described in the embodiments, or their combinations, are necessarily essential to the invention.
(System)
First, a system 1 including a stationary object information acquisition device 100 according to an embodiment of the present disclosure will be described with reference to figs. 1 and 2. Fig. 1 is a schematic diagram illustrating the system 1 according to an embodiment of the present disclosure. As shown in fig. 1, the system 1 includes a stationary object information storage device 200 and a plurality of vehicles 2, such as a vehicle 2A and a vehicle 2B, each equipped with a stationary object information acquisition device 100. The stationary object information storage device 200 and each vehicle 2 can be communicatively connected to each other by wireless communication.
The stationary object information acquisition device 100 acquires stationary object information related to stationary objects and transmits it to the stationary object information storage device 200. The stationary object information storage device 200 accumulates, for example, the stationary object information received from each stationary object information acquisition device 100. The stationary object information storage device 200 analyzes the received stationary object information, for example, to improve its accuracy, to obtain more detailed information, or to create a light distribution pattern based on the stationary object information. The stationary object information storage device 200 transmits the improved stationary object information and the like to each vehicle 2, for example in response to a request from the vehicle. In each vehicle 2, using the improved stationary object information and the like received from the stationary object information storage device 200 makes it possible, for example, to detect target objects more accurately and efficiently, or to perform headlight light distribution control appropriately.
Here, a "stationary object" in the present embodiment refers to an object that is fixed to the road and has high brightness, specifically one or more of a self-luminous body (for example, a street lamp, a traffic signal, or the like), a sign, a contour marker, and a guardrail. That is, the stationary object information acquisition device 100 in the present embodiment acquires stationary object information related to the various stationary objects listed as specific examples. As another embodiment, the stationary object information acquisition device 100 may be configured to be able to identify, as a stationary object, an object not included in the above specific examples, that is, another object that is fixed to the road, has high brightness, and can affect the detection of target objects.
Fig. 2 is a block diagram illustrating the system 1 according to an embodiment of the present disclosure. The vehicle 2 includes a vehicle ECU (Electronic Control Unit) 10, a storage unit 20, a sensor unit 31, a position information acquisition unit 32, an illuminance sensor 33, and the stationary object information acquisition device 100. The vehicle 2 can be connected to the stationary object information storage device 200 by wireless communication via the communication network 3. The means of wireless communication is not particularly limited, and may be, for example, a mobile communication system such as vehicle telematics, cooperation with a smartphone, or use of in-vehicle Wi-Fi.
The vehicle ECU 10 controls various operations of the vehicle 2 such as running. The vehicle ECU 10 includes a processor such as an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a general-purpose CPU (Central Processing Unit). The storage unit 20 includes, for example, a ROM (Read Only Memory) in which various vehicle control programs are stored and a RAM (Random Access Memory) in which various vehicle control data are temporarily stored. The processor of the vehicle ECU 10 loads a program designated from among the various vehicle control programs stored in the ROM into the RAM and, in cooperation with the RAM, controls the various operations of the vehicle 2.
The sensor unit 31 outputs image data of images obtained by capturing the outside of the vehicle 2. The sensor unit 31 includes, for example, one or more sensors selected from a camera, a LiDAR, and a millimeter-wave radar. The image data output by the LiDAR or the millimeter-wave radar may be three-dimensional image data. The position information acquisition unit 32 outputs vehicle position information indicating the current position of the vehicle 2. The position information acquisition unit 32 includes, for example, a GPS (Global Positioning System) sensor. The illuminance sensor 33 detects and outputs the illuminance around the vehicle 2.
The stationary object information acquisition device 100 includes a control unit 110 and a storage unit 120. The control unit 110 is configured by a processor such as a CPU, for example. The control unit 110 may be configured as part of a lamp ECU that controls the operation of lamps such as the headlights of the vehicle 2, or as part of the vehicle ECU 10, for example. The storage unit 120 is configured by, for example, ROM, RAM, or the like. The storage unit 120 may be configured as part of the storage unit 20 or of a storage device provided for the lamp ECU.
The control unit 110 functions as an image acquisition unit 111, a specifying unit 112, a transmitting/receiving unit 113, and a determination unit 114 by reading the program 121 stored in the storage unit 120. Some of these functions may be realized by the vehicle ECU 10 or the lamp ECU; in such a configuration, the vehicle ECU 10 or the lamp ECU constitutes part of the stationary object information acquisition device 100. The program 121 may also be recorded on a non-transitory computer-readable medium.
The image acquisition unit 111 acquires the image data 122 of images captured by the sensor unit 31. The acquired image data 122 is stored in the storage unit 120. The image acquisition unit 111 also acquires, from the position information acquisition unit 32, the vehicle position information 124 at the time when the image corresponding to the acquired image data 122 was captured (i.e., imaging position information indicating the imaging position of the image). The vehicle position information 124 preferably includes information indicating the orientation of the vehicle 2 at the time the image was captured, and may include information indicating the position of the vehicle in the vehicle width direction, which may be calculated based on, for example, a detected travel lane. The acquired vehicle position information 124 is stored in the storage unit 120, for example in association with the corresponding image data 122.
The image acquisition unit 111 may acquire time information indicating the time at which an image was captured; the time information may include the date and time of capture. The image acquisition unit 111 may also acquire lighting information on whether the headlights of the vehicle 2 were lit when the image was captured. The time information and the lighting information are stored in the storage unit 120, for example in association with the corresponding image data 122.
The image acquisition unit 111 can acquire, as reference image data, image data 122 captured when the illuminance sensor 33 outputs a signal indicating illuminance equal to or higher than a predetermined value (for example, 1000 lux). Here, the illuminance equal to or higher than the predetermined value is, for example, an illuminance equal to or higher than a value determined to be the daytime. That is, the image acquisition unit 111 can store the image data 122 of the image captured during the daytime as reference image data in the storage unit 120. The image acquisition unit 111 may acquire illuminance information indicating illuminance around the vehicle 2 when the image is captured from the illuminance sensor 33, and store the image data 122 in the storage unit 120 in association with the illuminance information. In this case, the image data 122 whose illuminance indicated by the associated illuminance information is equal to or greater than a predetermined value can be used as the reference image data.
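For illustration only (the patent itself specifies no data layout), the association of image data 122 with position, time, lighting, and illuminance metadata, and the selection of daytime reference image data, might look like the following Python sketch. All names are hypothetical; only the 1000 lux example threshold comes from the text above.

```python
from dataclasses import dataclass
from typing import Optional

DAYTIME_LUX = 1000.0  # example daytime threshold mentioned in the text


@dataclass
class CapturedFrame:
    """Hypothetical record tying image data 122 to its capture metadata."""
    image: bytes                      # image data from the sensor unit 31
    latitude: float                   # vehicle position information 124
    longitude: float
    heading_deg: Optional[float]      # orientation of the vehicle 2, if known
    timestamp: float                  # capture time, epoch seconds
    headlight_on: Optional[bool]      # lighting information, if known
    illuminance_lux: Optional[float]  # output of the illuminance sensor 33


def is_reference_candidate(frame: CapturedFrame) -> bool:
    """A frame qualifies as reference image data when captured in daytime illuminance."""
    return frame.illuminance_lux is not None and frame.illuminance_lux >= DAYTIME_LUX
```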
The specifying unit 112 determines the stationary object information 123 based on the image data 122. The stationary object information 123 specified by the specifying unit 112 is stored in the storage unit 120. Here, the "stationary object information" is information including at least one of stationary object image data, corresponding to an image in which a stationary object exists or to a part of such an image, and stationary object position information indicating the position of the stationary object calculated based on the image data 122.
The specifying unit 112 detects a stationary object in an image by, for example, image analysis, and includes the image data 122 of an image in which a stationary object was detected in the stationary object information 123 as stationary object image data. Alternatively, the specifying unit 112 determines, as a stationary object region, a region containing the stationary object in the image in which it was detected, and includes the data corresponding to the stationary object region, that is, a part of the image, in the stationary object information 123 as stationary object image data. The specifying unit 112 also calculates the position of the stationary object based on, for example, the image in which it was detected, and includes stationary object position information indicating that position in the stationary object information 123. The stationary object position information may be information indicating the position of the stationary object within the image (for example, the coordinates and size of the stationary object in the image) or information indicating the distance and direction from the imaging position of the image to the stationary object. The specifying unit 112 may further determine the type of the stationary object and include information indicating the type in the stationary object information 123.
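As a concrete but hypothetical illustration of what the stationary object information 123 described above could contain, one might define a record such as the following; none of these field names come from the patent, and per the text at least one of the image data and the position fields would be populated.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class StationaryObjectInfo:
    """Hypothetical structure for stationary object information 123."""
    object_type: Optional[str]                        # e.g. "street_lamp", "sign", "guardrail"
    region_xywh: Optional[Tuple[int, int, int, int]]  # stationary object region in the image
    image_crop: Optional[bytes]                       # stationary object image data (the region's pixels)
    distance_m: Optional[float]                       # distance from the imaging position
    bearing_deg: Optional[float]                      # direction from the imaging position
```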
The transmitting/receiving unit 113 transmits and receives information to and from the vehicle ECU 10 and the stationary object information storage device 200; that is, it functions as both a transmitting unit and a receiving unit. The transmitting/receiving unit 113 transmits the stationary object information 123, together with the vehicle position information 124 corresponding to it (the vehicle position information at the time when the image corresponding to the image data 122 from which the stationary object information 123 was specified was captured), to the stationary object information storage device 200 including the storage unit 220. The transmitting/receiving unit 113 may also transmit the reference image data to the stationary object information storage device 200, and may exchange other information with it as needed.
The determination unit 114 determines, based on reference image data captured at the same position as the imaging position of the image corresponding to the image data 122 used to calculate a stationary object position, whether a stationary object is actually present at the position indicated by the stationary object position information calculated by the specifying unit 112. The reference image data used for this determination is, for example, image data 122 acquired by the image acquisition unit 111 of an image captured when the vehicle 2 passes the position indicated by the vehicle position information 124 corresponding to the image data 122 from which the stationary object information was specified, while the illuminance sensor 33 is outputting a signal indicating illuminance equal to or higher than the predetermined value.
The stationary object information storage device 200 includes a control unit 210 and a storage unit 220. In the present embodiment, the stationary object information storage device 200 is a computer device that collects and accumulates information transmitted from the plurality of vehicles 2, and is installed, for example, in a data center. The control unit 210 is configured by a processor such as a CPU, for example. The storage unit 220 is configured by, for example, ROM, RAM, or the like.
The control unit 210 functions as a transmitting/receiving unit 211, a recording unit 212, and a determining unit 213 by reading a program 221 stored in the storage unit 220. Note that the program 221 may also be recorded in a non-transitory computer-readable medium.
The transmitting/receiving unit 211 transmits and receives information to and from the vehicle ECU10 and the stationary object information acquiring apparatus 100. The transmitting/receiving unit 211 receives the stationary object information 123 transmitted from the transmitting/receiving unit 113 and the vehicle position information 124 corresponding to the stationary object information 123. In addition, the transmitting and receiving section 211 may receive the reference image data from the stationary object information acquiring apparatus 100. The transmitting/receiving unit 211 may transmit/receive other information to/from the vehicle ECU10 and the stationary object information acquiring apparatus 100 as necessary.
The recording unit 212 records the stationary object information 123 received by the transmitting/receiving unit 211 and the corresponding vehicle position information 124 in the stationary object database 222 in association with each other. The recording unit 212 may also update the stationary object database 222 based on the determination results of the determination unit 213.
In the stationary object database 222, the vehicle position information 124 and the stationary object information 123 are recorded in association. For one imaging position indicated by the vehicle position information 124, for example, a plurality of pieces of stationary object image data may be recorded. Stationary object image data sharing the same imaging position may also be recorded in association with reference image data. Further, information such as the position and size of a stationary object, its distance and direction from the imaging position, and its type may be recorded in association with the imaging position.
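A minimal sketch of what the stationary object database 222 could look like, assuming SQLite and invented column names (the patent does not disclose a schema):

```python
import sqlite3

# Hypothetical schema for the stationary object database 222: one row per
# determined stationary object, keyed loosely by the imaging position.
conn = sqlite3.connect("stationary_objects.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS stationary_object (
    id INTEGER PRIMARY KEY,
    capture_lat REAL NOT NULL,          -- vehicle position information 124
    capture_lon REAL NOT NULL,
    object_type TEXT,                   -- e.g. street lamp, sign, guardrail
    region_x INTEGER, region_y INTEGER, -- stationary object region in the image
    region_w INTEGER, region_h INTEGER,
    distance_m REAL,                    -- distance from the imaging position
    bearing_deg REAL,                   -- direction from the imaging position
    image_crop BLOB,                    -- stationary object image data
    reference_image BLOB                -- daytime reference image, if any
)
""")
conn.commit()
```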
The determination unit 213 determines whether the stationary object information 123 includes a stationary object, using an algorithm different from the one by which the specifying unit 112 determined the stationary object information 123. The algorithm used by the determination unit 213 is preferably one with higher stationary object detection accuracy than the algorithm used by the specifying unit 112.
The determination unit 213 determines whether an image includes a stationary object using, for example, the image corresponding to the stationary object image data included in the stationary object information 123. Using that image, the determination unit 213 may also determine detailed information such as the position and size of the stationary object in the image, the distance and direction from the imaging position to the stationary object, and the type of the stationary object.
As another example of the system 1, the stationary object information storage device 200 may be mounted on the vehicle 2. In this case, the control unit 210 and the storage unit 220 may be provided separately from the vehicle ECU 10, the control unit 110, the storage unit 20, and the storage unit 120. Alternatively, the control unit 210 may be configured as part of one or more of the lamp ECU, the vehicle ECU 10, and the control unit 110, and some of the functions listed for the control unit 210 may be realized by the vehicle ECU 10 or the lamp ECU. The storage unit 220 may be configured as part of at least one of the storage unit 20, the storage unit 120, and a storage device provided for the lamp ECU. When the stationary object information storage device 200 is mounted on the vehicle 2, the stationary object information acquisition device 100 and the stationary object information storage device 200 are configured so as to be connectable by wireless or wired communication.
(stationary object information acquiring method)
Next, a stationary object information acquisition method of the stationary object information acquisition device 100 according to the present embodiment will be described. The stationary object information acquisition method according to the present embodiment is executed, for example, by the control unit 110 of the stationary object information acquisition device 100 reading the program 121. The following description takes as an example the case where the stationary object information acquisition device 100 determines the stationary object information 123 using images captured by a camera, but the present disclosure is not limited thereto. The stationary object information acquisition device 100 may determine the stationary object information 123 using, for example, images output by a millimeter-wave radar or a LiDAR.
Fig. 3 is a flowchart showing an example of the stationary object information acquisition method according to the present embodiment. The order of the processes constituting each flowchart described in this specification may be changed, and processes may be executed in parallel, insofar as no contradiction arises in the processing content.
First, in step S10, the control unit 110 acquires image data and the like. Specifically, the control unit 110 acquires image data of images captured by the camera, together with the vehicle position information 124 corresponding to the image data.
In step S10, the control unit 110 preferably acquires one or more of time information indicating a time when the image was captured, lighting information regarding whether or not the headlight of the vehicle 2 was lit when the image was captured, and illuminance information indicating illuminance of the surroundings of the vehicle 2 when the image was captured. By acquiring these pieces of information, the images can be appropriately compared, and as a result, the detection accuracy of the stationary object can be improved.
Here, the camera is controlled, for example by the vehicle ECU, so as to capture images of the outside of the vehicle 2 at predetermined time intervals. The control unit 110 preferably thins out the image data 122 of the images captured at the predetermined time intervals, retaining it only at time intervals longer than the capture interval (for example, every 0.1 to 1 second) or at predetermined distance intervals between imaging positions (for example, every 1 to 10 m). Thinning out the image data 122 keeps the capacity consumed in the storage unit 120 down, and also reduces the number of determination processing operations in step S30 described later, thereby reducing the load on the control unit 110. For example, the control unit 110 may acquire all the image data 122 captured at the predetermined time intervals, store it temporarily in the storage unit 120, and delete part of the image data 122 at a predetermined timing such as before the determination processing in step S30.
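One possible realization of this thinning, reusing the hypothetical CapturedFrame record from the sketch above; the 0.5 s and 5 m thresholds are illustrative values inside the ranges the text gives.

```python
import math


def thin_frames(frames, min_dt=0.5, min_dist_m=5.0):
    """Keep only frames spaced at least min_dt seconds or min_dist_m metres apart.

    `frames` is an iterable of CapturedFrame sorted by timestamp.
    """
    kept = []
    for f in frames:
        if not kept:
            kept.append(f)
            continue
        last = kept[-1]
        dt = f.timestamp - last.timestamp
        # crude equirectangular distance approximation, adequate at metre scale
        dy = (f.latitude - last.latitude) * 111_320.0
        dx = (f.longitude - last.longitude) * 111_320.0 * math.cos(math.radians(f.latitude))
        if dt >= min_dt or math.hypot(dx, dy) >= min_dist_m:
            kept.append(f)
    return kept
```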
The image acquisition unit 111 may also discard image data 122 from the viewpoint of whether the image was captured at a place where the vehicle 2 normally travels. Specifically, the image acquisition unit 111 may discard image data 122 of images captured on roads on which the number of past trips within a predetermined period falls below a predetermined number (for example, once a month or less). This is because determining stationary objects at places where the vehicle 2 does not usually travel is of little benefit to the user of the vehicle 2. In particular, when the stationary object information storage device 200 is mounted on the vehicle 2, it is preferable to discard image data 122 based on the number of trips past the imaging position within the predetermined period.
Next, when the vehicle 2 is in a first state (yes in step S20), the control unit 110 executes the process of determining the stationary object information 123 in step S30. When the vehicle 2 is not in the first state (no in step S20), the control unit 110 waits to execute the determination process of step S30 until the vehicle 2 enters the first state.
Here, the "first state" is a state in which the processing load on the vehicle ECU 10 or the lamp ECU is considered to be small, and includes, for example, a stopped state or a slow-traveling state (for example, traveling at 10 km/h or less). When the control unit 110 is configured as part of the vehicle ECU 10 or the lamp ECU, executing the determination processing of step S30 at a timing when the vehicle 2 is in the first state helps reduce the load on the vehicle ECU 10 or the lamp ECU. When the control unit 110 is configured independently of the vehicle ECU 10 and the lamp ECU, the determination of step S20 may be omitted.
In step S30, the control section 110 performs a determination process of determining the stationary object information 123 based on the image data 122. Details of the determination process will be described later using fig. 4 and the like.
Next, when the vehicle 2 is in a second state (yes in step S40), the control unit 110, in step S50, transmits the stationary object information 123 and the vehicle position information 124 corresponding to it to the stationary object information storage device 200 including the storage unit 220, and ends the processing. In step S50, the time information, lighting information, illuminance information, and the like may be transmitted together. When the vehicle 2 is not in the second state (no in step S40), the control unit 110 waits to execute the transmission process of step S50 until the vehicle 2 enters the second state.
Here, the "second state" is likewise a state in which the processing load on the vehicle ECU 10 or the lamp ECU is considered to be small, and includes, for example, a stopped state or a slow-traveling state (for example, traveling at 10 km/h or less). When the control unit 110 is configured as part of the vehicle ECU 10 or the lamp ECU, executing the transmission process of step S50 at a timing when the vehicle 2 is in the second state helps reduce the load on the vehicle ECU 10 or the lamp ECU. When the control unit 110 is configured independently of the vehicle ECU 10 and the lamp ECU, the determination of step S40 may be omitted.
The stationary object information 123 transmitted in step S50 may be the stationary object image data of an image in which a stationary object was determined, the stationary object position information calculated from the image, or both. When the transmitted stationary object information 123 includes the stationary object image data, the image data can be examined further in the stationary object information storage device 200, so that more accurate information can be obtained. When the stationary object image data is not included in the transmitted stationary object information 123, the amount of transmitted data is advantageously small.
The process of determining the stationary object information 123 in step S30 will now be described in detail. Fig. 4 is a flowchart showing an example of the determination processing of the stationary object information 123. In step S31, the control unit 110 detects light spots in the image. Light spot detection can be performed by a conventionally known technique, for example luminance analysis of the image.
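As one concrete possibility (not prescribed by the patent), the luminance analysis of step S31 could be a simple threshold-and-connected-components pass; this sketch uses OpenCV with illustrative threshold and area values.

```python
import cv2
import numpy as np


def detect_light_spots(gray: np.ndarray, thresh: int = 220, min_area: int = 4):
    """Detect bright light spots in an 8-bit grayscale image (step S31 sketch).

    Returns a list of (cx, cy, area) centroids of bright connected components.
    """
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    spots = []
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            cx, cy = centroids[i]
            spots.append((float(cx), float(cy), int(stats[i, cv2.CC_STAT_AREA])))
    return spots
```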
In step S32, the control unit 110 performs pattern recognition processing on the image. The pattern recognition method may be a conventionally known one; for example, a stationary object may be detected using a machine learning model or a clustering method.
Next, in step S33, the control unit 110 determines whether a stationary object is present in the image based on the results of step S31 and/or step S32. When it determines that no stationary object is present in the image (no in step S33), the control unit 110, in step S35, deletes the image data 122 corresponding to the image from the storage unit 120 and ends the processing.
When it determines that a stationary object is present in the image (yes in step S33), the control unit 110 determines the stationary object region or the stationary object position in the image in step S34. By determining the stationary object region and using the data of the portion of the image containing the stationary object region as the stationary object image data, the data volume at the time of transmission to the stationary object information storage device 200 can be reduced. In this case, information indicating the position of the stationary object region within the original image is preferably also determined and included in the stationary object information 123. The stationary object image data may also be data in which the data amount of the regions other than the stationary object region has been reduced.
The stationary object position is, for example, the position of the stationary object in the image, and can be determined using an arbitrary coordinate system set in the image. It may be, for example, a position indicating the center point of the stationary object or a position indicating the outer edge of the stationary object. Further, the stationary object position preferably includes size information determined using the coordinate system.
Fig. 5 is a schematic diagram for explaining the stationary object position information. In the image shown in fig. 5, the sign O1 and the street lamps O2 to O4 are determined to be stationary objects. In this case, for example, the positions of the regions Z1 to Z4 containing the sign O1 and the street lamps O2 to O4 may be defined as the stationary object position information, using coordinates defined by the x-axis and the y-axis. The way the coordinates are set is not particularly limited; for example, the center of the image may be taken as the origin. In the example of fig. 5, the pillar portions of the sign O1 and the street lamps O2 to O4 are not included in the regions Z1 to Z4, but regions including these pillar portions may also be set as the stationary object positions.
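As a tiny illustration of the coordinate choice just mentioned (image centre as origin), pixel coordinates could be converted as follows; the function name and the upward y-axis convention are assumptions, not part of the disclosure.

```python
def to_centered_coords(x: float, y: float, img_w: int, img_h: int):
    """Convert pixel coordinates (origin at top-left, y downward) to a system
    with the image centre as origin, one option the text mentions for
    expressing stationary object positions."""
    return x - img_w / 2.0, img_h / 2.0 - y  # y grows upward in this choice
```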
The stationary object position determined in step S34 may instead be expressed as the distance and direction from the imaging position of the image to the stationary object. For example, when the image data 122 includes depth information, the distance and direction from the imaging position to the stationary object may be calculated using the depth information. The calculation may also be performed by comparing the image data with other image data 122 captured near the imaging position, or by using data acquired from a millimeter-wave radar or LiDAR.
When the stationary object position is determined, the image data 122 may be deleted from the storage unit 120 or may be included in the stationary object information 123 in association with the stationary object position information. After step S34, the process advances to step S40 in fig. 3.
When the determination processing shown in fig. 4 is executed based on the image data 122 captured during the daytime when the illuminance is equal to or higher than the predetermined value, the outline of the structure in the image is easily grasped, and the color information of the structure is easily acquired from the image, so that the detection accuracy of the stationary object by the pattern recognition processing can be improved.
The process of determining the stationary object information 123 may be performed by comparing a plurality of image data 122 captured at the same location or at locations close to each other. In step S34, the control unit 110 may determine the type of the stationary object based on the result of step S31 and/or step S32, and include the type information in the stationary object information 123. An example of a method of determining whether or not a stationary object is a self-luminous body and an example of detection of a stationary object based on comparison of a plurality of image data 122 will be described below with reference to fig. 6.
Fig. 6 is a schematic diagram showing the imaging timings and the images 122A to 122D acquired at the respective timings. In the example of fig. 6, the camera captures images of the area in front of the vehicle 2 at times T1, T2, T3, and T4 and outputs the image data 122 of the images 122A to 122D. The intervals F1 to F4 are all equal; that is, the images 122A to 122D are captured at regular time intervals. The headlamps are on at times T1, T3, and T4 and off at time T2. Between time T1 and time T4, the vehicle 2 travels forward at a predetermined speed.
Whether a stationary object present in an image corresponding to the image data 122 is a self-luminous body can be determined, for example, based on the image data 122 of at least two images captured before and after the switching timing at which the headlamps mounted on the vehicle 2 are turned on or off. In fig. 6, light spots LP1 and LP2 are detected in the image 122A. In the image 122B, the light spot LP1 is detected, but no light spot is detected at the position where the light spot LP2 would be expected (indicated by a broken line). In the image 122C, the light spot LP2 is detected again. From this it can be determined that the light spot LP2 is detected only through reflection of light from the headlamps and is therefore not a light spot formed by a self-luminous body, while the light spot LP1, detected in the image 122B captured with the headlamps off, is a light spot formed by a self-luminous body.
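A hedged sketch of this headlamp on/off comparison, using the (cx, cy, area) spot tuples from the detection sketch above; the pixel tolerance is illustrative, and a real system would also compensate for vehicle motion between the two frames.

```python
def classify_self_luminous(spots_on, spots_off, tol=10.0):
    """Split spots using frames captured before/after headlamp switching.

    A spot seen both with headlamps on (spots_on) and off (spots_off) is taken
    to be self-luminous; a spot seen only with headlamps on is taken to be a
    reflection of the headlamp light.
    """
    def near(p, qs):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol * tol for q in qs)

    self_luminous = [p for p in spots_on if near(p, spots_off)]
    reflections = [p for p in spots_on if not near(p, spots_off)]
    return self_luminous, reflections
```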
Detection of stationary objects based on comparison of a plurality of images can be performed by comparing images captured at the same position or at positions close to each other. In fig. 6, the image 122C' is an image captured at the same position as the image 122C, but before it. In the image 122C, light spots LP3 and LP4 are detected in addition to the light spots LP1 and LP2. In the image 122C', the light spots LP1 and LP2 are detected, but the light spots LP3 and LP4 are not. If the light spots LP3 and LP4 were stationary objects, they should also have been detected in the image 122C'; since they were not, it can be determined that they are not stationary objects. Conversely, the light spots LP1 and LP2, detected at the same positions in both the image 122C and the image 122C', can be determined to be light spots generated by stationary objects.
When a plurality of images captured at the same point exist in this way, the light spots detected in each image may include not only light spots generated by stationary objects but also light spots generated by moving objects such as vehicles. For example, the positions of the detected light spots are compared among the plurality of images: light spots whose positions, or whose positions relative to one another, are unchanged are determined to be light spots generated by stationary objects, while light spots whose positions change greatly are determined to be light spots of moving objects. Stationary objects can be determined in this way.
In other words, a light spot estimated to be a stationary object is identified based on the image 122C' captured at a certain position; when the vehicle 2 passes that position again after the image 122C' was captured and the same light spot is also identified in the image 122C captured at the same position, the light spot can be regarded as a stationary object and the stationary object information 123 determined accordingly.
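Assuming spots have already been matched across several images captured at the same location (as in the passage above), one crude way to judge positional constancy is the variance of each matched spot's position; the function and threshold below are illustrative, not from the patent.

```python
import statistics


def is_stationary_track(track, max_std_px=3.0):
    """Judge a light spot tracked across images captured at the same spot.

    `track` is a list of (x, y) positions of one matched spot over several
    passes. Near-constant position suggests a stationary object; large
    variation suggests a moving object such as another vehicle.
    """
    if len(track) < 2:
        return False  # a spot seen only once cannot be confirmed stationary
    xs = [p[0] for p in track]
    ys = [p[1] for p in track]
    return statistics.pstdev(xs) <= max_std_px and statistics.pstdev(ys) <= max_std_px
```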
Since other vehicles traveling at night have their lamps on, light spots generated by the lamps of other vehicles are easily picked up in images captured at night. Therefore, from the viewpoint of improving the detection accuracy of stationary objects, an image captured at night is preferably compared with another image captured at the same position, and more preferably with another image captured at the same position in the daytime.
Detection of stationary objects based on comparison of a plurality of pieces of image data 122 may also be performed by comparing the images along the time series in which they were captured, based on the amount of movement of each light spot between the images and the traveling speed of the vehicle 2. For example, in fig. 6, the movement of the light spots LP3 and LP4 across the images 122C to 122D is larger than that of the light spots LP1 and LP2. When the movement of the light spots LP3 and LP4 is larger than the movement estimated from the traveling speed of the vehicle 2, the light spots LP3 and LP4 can be considered to be moving toward the vehicle 2, and can therefore be determined to be light spots generated by a moving object. When the movement of the light spots LP1 and LP2 equals the movement estimated from the traveling speed of the vehicle 2, the light spots LP1 and LP2 can be determined to be light spots generated by stationary objects. If, instead, their movement were smaller than the movement estimated from the traveling speed of the vehicle 2, the light spots would be considered to be moving in the same direction as the vehicle 2 and could be determined to be light spots generated by a moving object.
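The Fig. 6 reasoning reduces to comparing observed inter-frame spot movement with the movement expected for a stationary object given the distance the vehicle travelled; the following sketch assumes that expected movement has been estimated elsewhere, and its labels and tolerance are invented for illustration.

```python
def classify_by_motion(disp_px: float, expected_px: float, rel_tol: float = 0.3) -> str:
    """Classify a light spot from its inter-frame movement (Fig. 6 sketch).

    `disp_px` is the observed movement of the spot between two frames;
    `expected_px` is the movement a stationary object at the spot's estimated
    distance would show for the vehicle's travel between those frames.
    """
    if disp_px > expected_px * (1 + rel_tol):
        return "moving_toward"  # e.g. an oncoming vehicle's headlamps
    if disp_px < expected_px * (1 - rel_tol):
        return "moving_with"    # e.g. a preceding vehicle's tail lamps
    return "stationary"
```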
A determination process that re-examines, for image data 122 for which the stationary object information 123 has already been determined, whether a stationary object is present, thereby improving the accuracy of the stationary object information 123, will now be described. Fig. 7 is a flowchart showing an example of this determination processing.
In step S131, the control unit 110 acquires reference image data. Specifically, the control unit 110 acquires, as reference image data, image data 122 of an image captured when the vehicle 2 again passes the imaging position of the image corresponding to the image data 122 for which the stationary object information 123 was determined, while the illuminance sensor 33 outputs a signal indicating illuminance equal to or greater than the predetermined value.
Next, in step S132, the control unit 110 determines the position of the stationary object in the reference image indicated by the reference image data. The determination of the stationary object position in step S132 can be performed by the same processing as in steps S31 to S34, for example.
Next, in step S133, the control unit 110 determines whether the stationary object positions in the reference image match the stationary object positions in the image for which the stationary object information 123 was determined. If they match (yes in step S133), the control unit 110 determines that the determined stationary object information 123 is correct and ends the processing.
If they do not match (no in step S133), the control unit 110, in step S134, updates the stationary object information 123 and ends the processing. In step S134, for example, stationary object positions that coincide between the reference image and the target image are judged to be correct, stationary object positions that do not coincide are judged to be erroneous, and the stationary object information 123 is updated accordingly.
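Steps S133 and S134 amount to a set-matching update. The following hedged sketch, with invented names and a simple pixel-distance tolerance, illustrates one way to confirm or reject determined positions against the daytime reference image.

```python
def update_stationary_info(target_positions, reference_positions, tol=10.0):
    """Re-examine determined stationary object positions against a daytime
    reference image captured at the same place (steps S133/S134 sketch).

    Positions found in both images are kept as confirmed stationary objects;
    positions present in only one image (e.g. lamps of other vehicles) are
    rejected.
    """
    def near(p, qs):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol * tol for q in qs)

    confirmed = [p for p in target_positions if near(p, reference_positions)]
    rejected = [p for p in target_positions if not near(p, reference_positions)]
    return confirmed, rejected
```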
Fig. 8 is a schematic diagram showing an example of the reference image 122E used in the determination process shown in fig. 7. Fig. 9 is a schematic diagram showing an example of the target image 122F used in the determination process shown in fig. 7. In this example, the reference image 122E is an image captured during the daytime, and the target image 122F is an image captured during the nighttime.
In the reference image 122E, the sign O1 and the street lamps O2 to O4 are determined to be stationary objects by the processing of step S132. Another vehicle C1, a preceding vehicle, has stopped with its hazard lamps on, so its rear lamps BL1 and BL2 are lit. As a result, the rear lamps BL1 and BL2 are also erroneously detected as stationary objects by the processing of step S132. Because it is daytime, another vehicle C2, an oncoming vehicle, has its headlamps HL1 and HL2 off, so the headlamps HL1 and HL2 are not determined to be stationary objects.
In the target image 122F, the sign O1 and the street lamps O2 to O4 are determined to be stationary objects, and their surroundings are determined to be the stationary object regions Z1 to Z4, by the processing of step S30. Because it is nighttime, the other vehicle C1, a preceding vehicle, has its tail lamps lit, so its rear lamps BL3 and BL4 are lit. As a result, the rear lamps BL3 and BL4 are also erroneously detected as stationary objects by the processing of step S30. Similarly, another vehicle C4, an oncoming vehicle, has its headlamps HL3 and HL4 lit, and the headlamps HL3 and HL4 are erroneously detected as stationary objects. Although not shown, the surroundings of the rear lamps BL3 and BL4 and of the headlamps HL3 and HL4 are also determined to be stationary object regions.
The sign O1 and the street lamps O2 to O4 are present at the same positions in both the reference image 122E and the target image 122F, and are therefore determined to be stationary objects. The rear lamps BL1 to BL4 and the headlamps HL3 and HL4, by contrast, are each present in only one of the images, and are therefore determined not to be stationary objects. As a result, in step S134, the stationary object information 123 is updated on the basis that the rear lamps BL3 and BL4 and the headlamps HL3 and HL4 are not stationary objects.
The present invention is not limited to the above-described embodiment and can be modified or improved as appropriate. The materials, shapes, dimensions, numerical values, forms, numbers, arrangement locations, and the like of the constituent elements in the above embodiment are arbitrary and not limited, insofar as the present invention can be achieved.
The present application is based on Japanese Patent Application No. 2021-117823 filed on July 16, 2021, the contents of which are incorporated herein by reference.

Claims (11)

1. A stationary object information acquisition device mounted on a vehicle,
the stationary object information acquisition device comprising:
an image acquisition unit that acquires image data of an image captured by a sensor unit mounted on a vehicle;
a specifying unit that specifies, based on the image data, stationary object information including at least one of stationary object image data corresponding to an image, or a part of the image, in which a stationary object is present, the stationary object being one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating a position of the stationary object calculated based on the image data; and
a transmitting unit that transmits, to a storage unit, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely the vehicle position information at the time when the image corresponding to the image data from which the stationary object information was specified was captured.
2. The stationary object information acquisition device according to claim 1, wherein
the specifying unit determines the stationary object information when the vehicle is in a first state, and
the first state includes the vehicle being in a stopped state or a slow-traveling state.
3. The stationary object information acquisition device according to claim 1 or 2, wherein
the transmitting unit transmits the vehicle position information and the stationary object information to the storage unit when the vehicle is in a second state, and
the second state includes the vehicle being in a stopped state or a slow-traveling state.
4. The stationary object information acquisition device according to any one of claims 1 to 3, wherein
the specifying unit is capable of determining a stationary object region, which is a region of the image containing the stationary object, and
the transmitting unit transmits data corresponding to the portion of the image containing the stationary object region as the stationary object image data.
5. The stationary object information acquisition device according to any one of claims 1 to 4, wherein
the specifying unit is capable of determining stationary object position information indicating the position of the stationary object calculated based on the image data, and
the stationary object position information is included in the stationary object information transmitted by the transmitting unit.
6. The stationary object information acquisition device according to any one of claims 1 to 5, wherein
the transmitting unit transmits, to the storage unit as reference image data, image data of an image captured by the sensor unit when the vehicle is at the position indicated by the vehicle position information at the time when the image corresponding to the image data of the stationary object information was captured, and when an illuminance sensor mounted on the vehicle for detecting illuminance around the vehicle outputs a signal indicating illuminance equal to or higher than a predetermined value.
7. The stationary object information acquisition device according to any one of claims 1 to 6, wherein
the image acquisition unit acquires, as reference image data, image data of an image captured by the sensor unit when the vehicle is at the position indicated by the vehicle position information at the time when the image corresponding to the image data of the stationary object information was captured, and when an illuminance sensor mounted on the vehicle for detecting illuminance around the vehicle outputs a signal indicating illuminance equal to or higher than a predetermined value, and
the stationary object information acquisition device further includes a determination unit that determines, based on the reference image data, whether the stationary object is present at the position indicated by the stationary object position information calculated based on the image data corresponding to the reference image data.
8. The stationary object information acquisition device according to any one of claims 1 to 7, wherein
the specifying unit is capable of determining whether a stationary object present in an image corresponding to the image data of at least two images, captured by the sensor unit before and after a switching timing at which a headlight mounted on the vehicle is turned on or off, is a self-luminous body, and
the stationary object information transmitted by the transmitting unit includes information on whether the stationary object is a self-luminous body.
9. The stationary object information acquiring apparatus according to any one of claims 1 to 8, wherein
the specifying unit detects a light spot presumed to be a stationary object in a first image corresponding to first image data captured at a first position, and specifies the stationary object information by regarding the light spot as the stationary object when the light spot is also present in a second image corresponding to second image data captured at the first position when the vehicle passes the first position again after the first image was captured.
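Claim 9 promotes a light spot to a stationary object only when it recurs at the same place on a later pass through the same position, which would presumably filter out transient light sources. A sketch with assumed spot records and an assumed matching tolerance:

    import math

    def confirm_light_spots(first_pass_spots, second_pass_spots, tol_m=1.0):
        """Claim 9 sketch: keep only light spots detected on both passes.

        Each argument is an iterable of (x, y) world positions of light
        spots seen from the same vehicle position on two traversals; a
        recurring spot is regarded as a stationary object.
        """
        return [p for p in first_pass_spots
                if any(math.dist(p, q) <= tol_m for q in second_pass_spots)]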
10. A program to be executed by a computer device that includes a processor and is mounted on a vehicle, wherein
the program causes the processor to execute the following steps:
an image acquisition step of acquiring image data of an image captured by a sensor unit mounted on the vehicle;
a specifying step of specifying, based on the image data, stationary object information including at least one of stationary object image data, which corresponds to an image or a part of an image of one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating the position of the stationary object calculated based on the image data; and
a transmission step of transmitting, to a storage unit, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely, the vehicle position information at the time when the image corresponding to the image data for which the stationary object information was specified was captured.
11. A stationary object information acquisition method executed in a computer device that includes a processor and is mounted on a vehicle, wherein
the stationary object information acquisition method causes the processor to execute the following steps:
an image acquisition step of acquiring image data of an image captured by a sensor unit mounted on the vehicle;
a specifying step of specifying, based on the image data, stationary object information including at least one of stationary object image data, which corresponds to an image or a part of an image of one or more of a self-luminous body, a sign, a contour marker, and a guardrail, and stationary object position information indicating the position of the stationary object calculated based on the image data; and
a transmission step of transmitting, to a storage unit, the stationary object information and vehicle position information of the vehicle acquired from a position information acquisition unit mounted on the vehicle, namely, the vehicle position information at the time when the image corresponding to the image data for which the stationary object information was specified was captured.
CN202280050074.9A (priority 2021-07-16, filed 2022-07-13) Stationary object information acquisition device, program, and stationary object information acquisition method. Status: pending. Published as CN117716405A.

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
JP2021-117823 | 2021-07-16 | |
JP2021117823 | 2021-07-16 | |
PCT/JP2022/027575 | | 2022-07-13 | Stationary object information acquisition device, program, and stationary object information acquisition method (WO2023286806A1)

Publications (1)

Publication Number | Publication Date
CN117716405A (en) | 2024-03-15

Family

ID=84919504

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202280050074.9A (pending, published as CN117716405A) | Stationary object information acquisition device, program, and stationary object information acquisition method | 2021-07-16 | 2022-07-13

Country Status (3)

Country | Link
JP (1) | JPWO2023286806A1 (en)
CN (1) | CN117716405A (en)
WO (1) | WO2023286806A1 (en)


Also Published As

Publication Number | Publication Date
WO2023286806A1 (en) | 2023-01-19
JPWO2023286806A1 (en) | 2023-01-19


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination