WO2019107353A1 - Data structure of map data - Google Patents

Data structure of map data

Info

Publication number
WO2019107353A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
deterioration
data
feature
data structure
Prior art date
Application number
PCT/JP2018/043571
Other languages
English (en)
Japanese (ja)
Inventor
岩井 智昭
加藤 正浩
多史 藤谷
良司 野口
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation (パイオニア株式会社)
Priority to JP2019557235A (JPWO2019107353A1)
Publication of WO2019107353A1
Priority to JP2022104284A (JP2022121579A)
Priority to JP2023201299A (JP2024009306A)

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 - Structures of map data
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G01C21/3807 - Creation or updating of map data characterised by the type of data
    • G01C21/3826 - Terrain data
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G01C21/3833 - Creation or updating of map data characterised by the source of data
    • G01C21/3837 - Data obtained from a single source
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram

Definitions

  • the present application relates to a data structure of map data including deterioration information on the deterioration state of a feature.
  • In automated driving, it is necessary to estimate the vehicle position by matching, with high accuracy, the object positions measured by a sensor such as LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) against the object positions described in the map data for automated driving.
  • Patent Document 1 discloses a technique for extracting the data points that measure the road surface from point cloud data that also contains many points other than the road surface, such as buildings and roadside trees. The technique can be used as preprocessing for extracting white lines drawn on roads.
  • Although the road area could be determined from point cloud data obtained by measuring the ground surface using the technique of Patent Document 1, the deterioration state of a feature could not be determined.
  • Processing that uses feature information is highly important in self-position estimation, lane keeping, recognition of the travelable area, and so on, and the degradation state of features strongly affects such processing. It is therefore important to grasp the actual degradation state of each feature and to reflect degraded features in the map data.
  • One example of the problems addressed by the present invention is to provide a data structure of map data with which it can be determined whether a feature at a certain position has deteriorated.
  • The invention according to claim 1 is a data structure of map data that includes position information indicating the position of a feature and deterioration information on the deterioration state of the feature, the data structure being used in a process in which an information processing apparatus determines, based on the deterioration information, whether the feature specified by the position information has deteriorated.
  • FIG. 1 is a block diagram showing the configuration of a degraded-feature identification system according to an embodiment. FIG. 2 is a block diagram showing the schematic structure of a map data management system according to an example.
  • FIG. 3(A) is a diagram showing the lidar measuring the reflection intensity of light on a white line (without deterioration) according to the example, and FIG. 3(B) is a diagram showing an example of the graph obtained when those reflection intensities are statistically processed.
  • FIG. 4(A) is a diagram showing the lidar measuring the reflection intensity of light on a white line (with deterioration) according to the example, and FIG. 4(B) is a diagram showing an example of the graph obtained when those reflection intensities are statistically processed.
  • (A) is a diagram showing an example of the reflection intensities measured along the traveling direction of the vehicle for a broken-line white line according to a third modification, and (B) is a diagram showing the valid region and the invalid region into which those reflection intensities are divided for the deterioration determination based on their distribution. Further figures show an example of the data structure of the map data according to a fifth modification and an example of the data structure of the transmission data according to a sixth modification.
  • FIG. 1 is a block diagram showing a schematic configuration of a data structure of map data according to the present embodiment.
  • The data structure 1 of the map data includes position information 1A indicating the position of a feature and deterioration information 1B related to the deterioration state of the feature, held in association with each other.
  • The map data having this structure is used in a process in which an information processing apparatus determines, based on the deterioration information, whether the feature specified by the position information has deteriorated.
  • Accordingly, the information processing apparatus can determine whether the feature at a certain position has deteriorated, based on the deterioration information associated with the position information indicating that position.
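As a concrete illustration of this arrangement, the record structure and the determination of claim 1 might be sketched as follows in Python; the field names, types, and dictionary lookup keyed by feature ID are assumptions for illustration, not part of the claimed data structure:

```python
from dataclasses import dataclass

@dataclass
class MapFeatureRecord:
    """One record of the map data: position information 1A held in
    association with deterioration information 1B for one feature."""
    feature_id: str    # identifies the feature (one kind of position information)
    position: tuple    # position information 1A (illustrative coordinates)
    deteriorated: bool # deterioration information 1B

def is_feature_deteriorated(map_data: dict, feature_id: str) -> bool:
    """Determine, based on the deterioration information, whether the
    feature specified by the position information has deteriorated."""
    return map_data[feature_id].deteriorated

# Hypothetical records for the white lines W1 and W2
map_data = {
    "W1": MapFeatureRecord("W1", (35.6, 139.7), False),
    "W2": MapFeatureRecord("W2", (35.6, 139.8), True),
}
print(is_feature_deteriorated(map_data, "W2"))  # True
```

An information processing apparatus holding such records can answer the deterioration query for any position it can resolve to a feature ID.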
  • The map data management system S of the present embodiment includes a server device 100 that manages the map data and an on-board terminal 200 mounted on each of a plurality of vehicles; the server device 100 and each on-board terminal 200 are connected via a network NW. Although only one on-board terminal 200 is shown in FIG. 2, the map data management system S may include a plurality of on-board terminals 200. The server device 100 may also be composed of a plurality of devices.
  • In the vehicle V on which the lidar 205 is mounted, the on-board terminal 200 transmits to the server device 100 reflection intensity data D indicating the reflection intensities measured by receiving the light L emitted by the lidar 205 and reflected by the white line W1 and the white line W2 (each an example of the "object").
  • The bars shown as the reflection intensity data D in FIGS. 3A and 4A represent, by their length, the magnitude of the reflection intensity at each point (the longer the bar, the higher the reflection intensity).
  • The reflection intensity data D includes the reflection intensity at each point irradiated with the light L emitted by the lidar 205.
  • FIGS. 3A and 4A show the reflection intensities measured at five points on each of the white lines W1 and W2.
  • the server device 100 identifies the degraded feature based on the plurality of pieces of reflection intensity data D received from each of the plurality of on-vehicle terminals 200.
  • the degraded feature is identified by statistically processing the plurality of reflection intensity data D.
  • For a white line that is not degraded, the reflection intensity at each point in the white line is high and uniform, as shown in FIG. 3B. Consequently, the average value μ calculated from the plurality of reflection intensities is large and the standard deviation σ is small.
  • For a deteriorated white line, by contrast, the reflection intensity at each point in the white line is low or uneven, as shown in FIG. 4B. Consequently, the average value μ becomes small or the standard deviation σ becomes large.
  • The server device 100 determines whether a feature has degraded by comparing the average value μ and the standard deviation σ calculated from the plurality of reflection intensities against thresholds, and thereby identifies the degraded feature. The server device 100 then updates the map data corresponding to the degraded feature.
  • Alternatively, the map data may be updated by a device that has received an instruction from the server device 100.
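The statistical processing described above can be sketched briefly; the intensity values and both threshold values below are illustrative assumptions, not values from the application:

```python
import statistics

def judge_deteriorated(intensities, first_threshold=0.5, second_threshold=0.2):
    """A white line is judged deteriorated when the average reflection
    intensity is low (<= first threshold) or the intensities are
    uneven (standard deviation >= second threshold)."""
    mu = statistics.mean(intensities)
    sigma = statistics.stdev(intensities)
    return mu <= first_threshold or sigma >= second_threshold

# High, uniform intensities (white line without deterioration, cf. FIG. 3B)
print(judge_deteriorated([0.90, 0.88, 0.91, 0.90, 0.89]))  # False
# Low or uneven intensities (deteriorated white line, cf. FIG. 4B)
print(judge_deteriorated([0.90, 0.30, 0.80, 0.20, 0.70]))  # True
```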
  • the on-vehicle terminal 200 is roughly divided into a control unit 201, a storage unit 202, a communication unit 203, and an interface unit 204.
  • The storage unit 202 is configured by, for example, a hard disk drive (HDD) or a solid state drive (SSD), and stores an operating system (OS), a reflection intensity data processing program, map data, reflection intensity data D, and various other data.
  • The map data describes position information indicating the position of each feature (a white line in the present embodiment) subject to the deterioration determination, and a feature ID for identifying the feature and distinguishing it from other features. (Since the position information and the feature ID are both linked to one feature, the feature ID can be regarded as one kind of position information indicating the position of that feature.)
  • different feature IDs are assigned to the white line W1 and the white line W2.
  • The map data is thus data in which position information and a feature ID are described for each feature.
  • The storage unit 202 may store, for example, map data for the whole country, or it may store map data corresponding to a certain area including the current position of the vehicle, received in advance from the server device 100 or the like.
  • the communication unit 203 controls the communication state between the in-vehicle terminal 200 and the server device 100.
  • The interface unit 204 implements the interface function used when data is exchanged between the on-board terminal 200 and external devices such as the lidar 205 or the internal sensor 206.
  • The lidar 205 is attached to the roof of the vehicle, and detects features by repeating the following process so as to sweep a circle around the vehicle: it emits infrared laser light (irradiated downward from the roof at a fixed angle), receives the light reflected from points on the surfaces of the features around the vehicle, and generates reflection intensity data D indicating the reflection intensity at each point. Since the reflection intensity data D indicates the intensity of laser light emitted horizontally and reflected by the ground or by features, it includes portions with low reflection intensity (ground portions without features) and portions with high reflection intensity (portions where features exist).
  • Alternatively, a plurality of lidars 205 may be attached to the front or rear of the vehicle, and the reflection intensity data of the field of view acquired by each may be combined to generate the reflection intensity data D around the vehicle.
  • When the reflection intensities have been measured, the lidar 205 immediately transmits the reflection intensity data D (including the portions with low reflection intensity and the portions with high reflection intensity) to the on-board terminal 200 via the interface unit 204.
  • When the control unit 201 receives the reflection intensity data D from the lidar 205, it stores in the storage unit 202, in association with one another, the reflection intensity data D, irradiation intensity data indicating the intensity of the infrared laser light emitted to measure the reflection intensity data D, measurement position information indicating the position of the vehicle (lidar 205) at the time of reception, and measurement date-and-time information indicating the date and time at which the reflection intensity data D was received.
  • The control unit 201 transmits to the server device 100 the reflection intensity data D, the irradiation intensity data, the measurement position information, and the measurement date-and-time information stored in the storage unit 202; items for which a predetermined time has elapsed since measurement may be deleted from the storage unit 202.
  • The internal sensor 206 is a generic term for sensors mounted on the vehicle, such as a satellite positioning sensor (a GNSS (Global Navigation Satellite System) receiver), a gyro sensor, and a vehicle speed sensor.
  • the camera 207 is mounted on a vehicle, and transmits a photographed image obtained by photographing the surroundings of the vehicle to the on-vehicle terminal 200 via the interface unit 204.
  • The control unit 201 includes a central processing unit (CPU) that controls the entire control unit 201, a read-only memory (ROM) in which a control program for controlling the control unit 201 and the like is stored in advance, and a random access memory (RAM) that temporarily stores various data.
  • the CPU reads and executes various programs stored in the ROM and the storage unit 202 to realize various functions.
  • the control unit 201 acquires estimated own vehicle position information.
  • the estimated own vehicle position information may be generated by a device outside the on-board terminal 200, or may be generated by the control unit 201.
  • The estimated own-vehicle position information can be generated, for example, by matching the feature positions measured by the lidar 205 against the feature positions in the map data for automated driving, or based on the information detected by the internal sensor 206 together with the map data, or by a combination of these.
  • The control unit 201 predicts the actual white line position as seen from the own vehicle (lidar 205), based on the estimated own-vehicle position information and the white line position information indicated by the map data. At this time, the control unit 201 calculates and sets a white line prediction range that includes the white line, with some margin.
  • Map coordinate system: Xm, Ym. Vehicle coordinate system: Xv, Yv. White line map position in the map coordinate system: (mxm, mym). White line predicted position in the vehicle coordinate system: (lxv, lyv). Estimated own-vehicle position in the map coordinate system: (xm, ym). Estimated own-vehicle azimuth in the map coordinate system: ψm.
  • The vehicle coordinate system is a coordinate system whose reference (origin) is the position of the vehicle.
  • The control unit 201 calculates the white line prediction range from the white line map position indicated by the position information of the white line in the traveling direction of the vehicle (for example, 10 m ahead), based on the estimated own-vehicle position indicated by the estimated own-vehicle position information. As shown in FIG. 5, when the lane in which the vehicle V travels is delimited by the white line 1 on the left and the white line 2 on the right, a white line prediction range is calculated for each of the white lines 1 and 2.
  • the control unit 201 calculates the white line 1 predicted position 301 (white line predicted position for the white line 1) based on the white line 1 map position and the estimated own vehicle position.
  • the white line predicted position is obtained by the following equation (1).
  • The control unit 201 sets a white line 1 prediction range 311 based on the white line 1 predicted position 301. Specifically, a certain range including the white line 1 predicted position 301 is set as the white line 1 prediction range 311. The control unit 201 then extracts white line 1 reflection intensity data 321, indicating the reflection intensities within the white line 1 prediction range 311, from the reflection intensity data D containing the reflection intensities at a plurality of points.
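Equation (1) is not reproduced in this excerpt. Assuming it is the standard rigid transform from the map coordinate system into the vehicle coordinate system using the estimated own-vehicle position and azimuth, the white line predicted position and prediction range might be sketched as follows (the function names, the margin value, and the transform itself are assumptions):

```python
import math

def predict_white_line_position(mxm, mym, xm, ym, psi_m):
    """Map the white line map position (mxm, mym) into the vehicle
    coordinate system, given the estimated own-vehicle position
    (xm, ym) and azimuth psi_m: translate, then rotate. This rigid
    transform is an assumed reading of equation (1)."""
    dx, dy = mxm - xm, mym - ym
    lxv = math.cos(psi_m) * dx + math.sin(psi_m) * dy
    lyv = -math.sin(psi_m) * dx + math.cos(psi_m) * dy
    return lxv, lyv

def white_line_prediction_range(lxv, lyv, margin=0.5):
    """Set a certain range including the predicted position, with some
    margin (the margin value is assumed)."""
    return (lxv - margin, lxv + margin), (lyv - margin, lyv + margin)

# Vehicle at (100, 50) with azimuth 0: a white line 10 m ahead on the map
print(predict_white_line_position(110.0, 50.0, 100.0, 50.0, 0.0))  # (10.0, 0.0)
```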
  • The control unit 201 transmits the reflection intensity data 321 and 322 thus extracted to the server device 100, in association with the feature IDs corresponding respectively to the white line 1 map position and the white line 2 map position, and with the irradiation intensity data and the measurement date-and-time information corresponding to the reflection intensity data D from which the data 321 and 322 were extracted.
  • Hereinafter, the reflection intensity data D measured by the lidar 205 and stored in the storage unit 202 by the on-board terminal 200 (which may be raw data measured by the lidar 205 or data obtained by processing such raw data) is referred to as pre-extraction reflection intensity data D, and the data indicating the reflection intensities extracted within the white line prediction range is referred to as post-extraction reflection intensity data D.
  • the transmission data 500 is configured to include a basic information unit 510, a recognition object information unit 520, and a unique information unit 530.
  • the basic information unit 510 includes a header 511, vehicle metadata 512, and a vehicle position 513.
  • The header 511 stores the version of the data format of the transmission data and a time stamp (the transmission time of the transmission data 500).
  • the vehicle metadata 512 stores a vehicle ID identifying a vehicle on which the on-board terminal 200 is mounted, information indicating a vehicle size, and information indicating a type of sensor mounted on the vehicle (Lidar 205 or camera 207).
  • the vehicle position 513 stores information indicating the vehicle position when the on-board terminal 200 recognizes a feature.
  • the recognition object information unit 520 is configured to include the feature ID 521.
  • the feature ID 521 stores an ID (for example, feature IDs respectively corresponding to the white line 1 map position and the white line 2 map position) specifying the feature which is the target of the deterioration information.
  • the server apparatus 100 or the on-board terminal 200 specifies a feature based on the feature ID.
  • The unique information unit 530 includes an acquisition date 531, deterioration information 532, weather information 533, and a sensor type 534.
  • the deterioration information 532 stores the post-extraction reflection intensity data D and the irradiation intensity data.
  • The acquisition date 531 stores measurement date information indicating the date on which the control unit 201 received the reflection intensity data D from the lidar 205.
  • The weather information 533 stores information indicating the weather (fine, rain, snow, fog, etc.) when the control unit 201 received the reflection intensity data D from the lidar 205.
  • The weather information may be generated by the control unit 201 based on, for example, an image captured by the camera 207, or may be received from a weather information providing server.
  • The unique information unit 530 may include road surface information indicating the state of the road surface (wet, snowy, etc.) instead of or in addition to the weather information 533.
  • The sensor type 534 stores information indicating the type of sensor with which the data stored in the deterioration information 532 was measured. In the present embodiment, since the post-extraction reflection intensity data D and the irradiation intensity data are stored in the deterioration information 532, information indicating the lidar 205 is stored in the sensor type 534.
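The layout of the transmission data 500 described above might be modeled as follows; all field names and types are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BasicInfo:                      # basic information unit 510
    version: str                      # header 511: data format version
    timestamp: str                    # header 511: transmission time
    vehicle_id: str                   # vehicle metadata 512
    vehicle_size: str                 # vehicle metadata 512
    sensors: List[str]                # vehicle metadata 512 (lidar 205, camera 207)
    vehicle_position: Tuple[float, float]  # vehicle position 513

@dataclass
class RecognitionObjectInfo:          # recognition object information unit 520
    feature_id: str                   # feature ID 521

@dataclass
class UniqueInfo:                     # unique information unit 530
    acquisition_date: str             # acquisition date 531
    reflection_intensities: List[float]  # deterioration information 532
    irradiation_intensity: float      # deterioration information 532
    weather: str                      # weather information 533
    sensor_type: str                  # sensor type 534

@dataclass
class TransmissionData500:
    basic: BasicInfo
    recognition: RecognitionObjectInfo
    unique: UniqueInfo
```

Constructing an instance of `TransmissionData500` mirrors the generation of the transmission data in step S105.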
  • the server apparatus 100 is roughly divided into a control unit 101, a storage unit 102, a communication unit 103, a display unit 104, and an operation unit 105.
  • the storage unit 102 is configured by, for example, an HDD or an SSD, and stores an OS, a white line deterioration determination program, map data, transmission data 500 received from the on-vehicle terminal 200, and various other data.
  • the communication unit 103 controls the communication state with the on-vehicle terminal 200.
  • the display unit 104 is configured of, for example, a liquid crystal display or the like, and displays information such as characters and images.
  • the operation unit 105 includes, for example, a keyboard, a mouse, and the like, receives an operation instruction from an operator, and outputs the content of the instruction to the control unit 101 as an instruction signal.
  • the control unit 101 includes a CPU that controls the entire control unit 101, a ROM in which a control program that controls the control unit 101 and the like is stored in advance, and a RAM that temporarily stores various data.
  • the CPU reads and executes various programs stored in the ROM and the storage unit 102 to realize various functions.
  • the control unit 101 determines the deterioration state of the white line based on the plurality of pieces of reflection intensity data D received from each of the one or more on-vehicle terminals 200. Then, the control unit 101 updates the map data corresponding to the feature so that the degraded feature can be identified as being degraded.
  • While the on-board terminal 200 executes the processing of steps S101 to S106, the server device 100 executes the processing of steps S201 to S202.
  • control unit 201 of the on-vehicle terminal 200 acquires estimated own vehicle position information (step S101).
  • control unit 201 acquires white line position information from the map data corresponding to the estimated own vehicle position indicated by the estimated own vehicle position information acquired in the process of step S101 (step S102). At this time, as described above, the control unit 201 acquires the position information of the white line in the traveling direction of the vehicle and the feature ID.
  • control unit 201 calculates and sets a white line prediction range from the estimated own vehicle position indicated by the estimated own vehicle position information and the white line map position indicated by the white line position information (step S103).
  • The control unit 201 extracts a portion of the pre-extraction reflection intensity data D measured by the lidar 205 to obtain the post-extraction reflection intensity data D (step S104). Specifically, based on the measurement position in the measurement position information stored in association with the pre-extraction reflection intensity data D, and on the irradiation angle at which the lidar 205 emitted the laser light, the control unit 201 first identifies the pre-extraction reflection intensity data D measured by irradiating the laser light onto a range that includes the white line prediction range. Next, the control unit 201 extracts the portion within the white line prediction range from the identified pre-extraction reflection intensity data D to obtain the post-extraction reflection intensity data D. For example, the control unit 201 extracts the portion corresponding to the azimuths (θ1, θ2 (see FIG. 5)) of the white line prediction range relative to the vehicle heading.
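The extraction in step S104 of the portion of the data lying within the azimuths (θ1, θ2) of the white line prediction range could be sketched as follows, assuming the pre-extraction data is held as (azimuth, intensity) pairs relative to the vehicle heading (an assumed layout):

```python
def extract_in_prediction_range(points, theta1, theta2):
    """From pre-extraction reflection intensity data, held as
    (azimuth, intensity) pairs, keep the intensities whose azimuth
    lies within the white line prediction range [theta1, theta2]."""
    return [intensity for azimuth, intensity in points
            if theta1 <= azimuth <= theta2]

# Five measured points; only those inside the azimuth range survive
points = [(-0.8, 0.10), (-0.2, 0.90), (0.0, 0.88), (0.3, 0.20), (0.9, 0.15)]
print(extract_in_prediction_range(points, -0.3, 0.1))  # [0.9, 0.88]
```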
  • control unit 201 generates transmission data 500 (step S105).
  • the post-extraction reflection intensity data D extracted in the process of step S104 is stored in the deterioration information 532.
  • control unit 201 transmits the transmission data 500 generated in the process of step S105 to the server apparatus 100 (step S106), and ends the process of reflection intensity data transmission.
  • When the control unit 101 of the server device 100 receives the transmission data 500 from the on-board terminal 200 (step S201), it stores the transmission data 500 in the storage unit 102 (step S202) and ends the process. As a result, the post-extraction reflection intensity data D transmitted from each of the plurality of on-board terminals 200 accumulates in the storage unit 102 of the server device 100.
  • the control unit 101 of the server device 100 acquires the specified feature ID (step S211).
  • The control unit 101 acquires from the storage unit 102 the deterioration information (post-extraction reflection intensity data D) of the transmission data 500 that includes that feature ID and whose measurement date and time falls within a predetermined period (for example, the last three months) (step S212).
  • The acquisition target is limited to reflection intensity data D whose measurement date and time is within the predetermined period because reflection intensity data D that is too old is not suitable for determining the current deterioration state.
  • The control unit 101 corrects the post-extraction reflection intensity data D acquired in step S212 (step S213).
  • The reflection intensity data D is corrected using at least one of the measurement date-and-time information and the weather information (road surface information).
  • The reflection intensity measured by the lidar 205 differs depending on the measurement date and time and on the weather (road surface condition) at the time of measurement, because it is affected by sunlight and by the condition of the road surface. For example, even for the same white line, the reflection intensity at dawn or dusk differs from that in the daytime, and the reflection intensity in fine weather differs from that in cloudy weather, rain, snow, and so on. By correcting for these factors, the differences between reflection intensity data due to the time of measurement or the weather (road surface condition) can be offset, so that the deterioration determination can be performed appropriately.
  • The control unit 101 calculates the average of the post-extraction reflection intensity data D corrected in step S213 (step S214), and then calculates the standard deviation (step S215).
  • That is, for each feature ID (for each of the white lines W1 and W2 in the example of FIGS. 3A and 4A), the control unit 101 calculates the average and the standard deviation of the reflection intensities at the respective points.
  • Next, the control unit 101 determines whether the average calculated in step S214 is equal to or less than a first threshold (step S216).
  • When the control unit 101 determines that the average is equal to or less than the first threshold (step S216: YES), it determines that the designated white line is "deteriorated" (step S218), updates the corresponding map data by adding information indicating "deteriorated" (step S219), and ends the deterioration determination process.
  • The control unit 101 determining in step S218 that the designated white line is "deteriorated" is an example of identifying the deteriorated white line.
  • When the control unit 101 determines that the average is not equal to or less than the first threshold (step S216: NO), it determines whether the standard deviation calculated in step S215 is equal to or greater than a second threshold (step S217).
  • When the control unit 101 determines that the standard deviation is equal to or greater than the second threshold (step S217: YES), it likewise determines that the designated white line is "deteriorated" (step S218), updates the corresponding map data by adding information indicating "deteriorated" (step S219), and ends the deterioration determination process.
  • When the control unit 101 determines that the standard deviation is not equal to or greater than the second threshold (step S217: NO), it determines that the designated white line is "not deteriorated" (step S220) and ends the deterioration determination process.
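Steps S211 to S220 can be summarized in a short sketch; the period, thresholds, and record layout are assumptions, and the correction of step S213 is omitted:

```python
import statistics
from datetime import datetime, timedelta

def deterioration_determination(stored_data, feature_id, now,
                                period_days=90,
                                first_threshold=0.5, second_threshold=0.2):
    """Sketch of steps S211-S220 on the server side."""
    # S212: deterioration information measured within the predetermined period
    intensities = [v for rec in stored_data
                   if rec["feature_id"] == feature_id
                   and now - rec["measured"] <= timedelta(days=period_days)
                   for v in rec["intensities"]]
    mu = statistics.mean(intensities)       # S214: average
    sigma = statistics.stdev(intensities)   # S215: standard deviation
    # S216-S220: threshold comparison
    return mu <= first_threshold or sigma >= second_threshold

now = datetime(2018, 11, 1)
stored_data = [
    {"feature_id": "W1", "measured": datetime(2018, 10, 1),
     "intensities": [0.90, 0.88, 0.91, 0.90]},
    {"feature_id": "W1", "measured": datetime(2017, 1, 1),  # too old: excluded in S212
     "intensities": [0.10, 0.20, 0.10, 0.20]},
]
print(deterioration_determination(stored_data, "W1", now))  # False
```

A result of `True` would correspond to step S219, where the map data for the feature is updated with information indicating "deteriorated".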
  • As described above, the control unit 101 of the server device 100 acquires the reflection intensity data D measured by receiving the light emitted by the lidar 205 provided in the vehicle (an example of the "moving object") and reflected by a white line (an example of the "feature"), and identifies the deteriorated white line based on the acquired reflection intensity data D.
  • the deteriorated white line can be identified by using the reflection intensity data D acquired from the vehicle.
  • When white line deterioration is determined based on an image captured by a camera mounted on a vehicle, nighttime conditions, changes in brightness such as backlighting, camera resolution, and the like make it difficult to carry out the deterioration determination appropriately; the deterioration determination using the reflection intensity data D as in the present embodiment is therefore superior.
  • The control unit 101 further acquires measurement date-and-time information indicating the measurement date and time of the reflection intensity data D, selects the reflection intensity data D measured within a predetermined period based on that information, and identifies the deteriorated white line based on the selected reflection intensity data D. Therefore, by setting the predetermined period appropriately (for example, the last few months), the deteriorated white line can be appropriately identified based on reflection intensity data that excludes data unsuitable for the white line deterioration determination.
  • The control unit 101 further acquires a feature ID specifying the position of the white line that reflected the light, and identifies the deteriorated white line based on the reflection intensity data D and the feature ID. The position of the deteriorated white line can thereby also be specified.
  • The control unit 101 identifies the deteriorated white line based on the post-extraction reflection intensity data D measured within the white line prediction range set based on the position information. The deterioration determination can therefore be performed while excluding reflection intensity data D from light reflected by objects other than the white line, so the white line deterioration determination becomes more accurate.
  • The control unit 101 updates the map data corresponding to the white line determined to be "deteriorated" in step S218 of FIG. 8 (performs an update that adds information indicating "deteriorated"). The deterioration information can thereby be reflected in the map data representing the deteriorated white line.
  • The control unit 101 acquires the reflection intensity data D measured in each of one or more vehicles, each measuring the reflected light produced when the white line reflects the light irradiated by the Lidar of that vehicle, and identifies the deteriorated white line based on the plurality of acquired pieces of reflection intensity data D. Therefore, according to the map data management system S of the present embodiment, the deteriorated white line can be identified with high accuracy using the reflection intensity data D acquired from one or more vehicles.
  • In the embodiment, the feature that is the target of the deterioration determination has been described as a white line, but any feature whose deterioration can be determined based on reflection intensity can be a target of the deterioration determination.
  • Although the white line that is the target of the deterioration determination has been described as a solid line, it may be a broken line. White lines are used not only to separate lanes but also as channelizing strips, characters, and pedestrian crossings.
  • The target of the deterioration determination may include not only white lines but also features such as traffic signs and signboards. That is, the features subject to the deterioration determination can be classified into various types. Therefore, feature type information indicating the type of feature may be further linked to the feature ID, and the threshold value used in the deterioration determination based on the reflection intensity data may be changed according to the type of feature.
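A per-feature-type threshold table could look like the following minimal sketch; the type names and threshold values are hypothetical, chosen only to illustrate switching the threshold by feature type:

```python
# Hypothetical thresholds: mean reflection intensity at or below the
# threshold is judged as "deteriorated". Values are illustrative only.
THRESHOLDS = {
    "white_line": 0.50,
    "pedestrian_crossing": 0.45,
    "traffic_sign": 0.60,
}

def is_deteriorated(feature_type, mean_intensity):
    # Fall back to the white-line threshold for unknown feature types.
    threshold = THRESHOLDS.get(feature_type, THRESHOLDS["white_line"])
    return mean_intensity <= threshold

print(is_deteriorated("traffic_sign", 0.55))  # True: 0.55 <= 0.60
print(is_deteriorated("white_line", 0.55))    # False: 0.55 > 0.50
```

The same measured intensity can thus yield different judgments depending on the feature type linked to the feature ID.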
  • The white line prediction range may be set for each type of white line (for example, the position information of the four corner points may be described in the map data, and the control unit 201 of the in-vehicle device 200 may set the white line prediction range based thereon); that is, the white line prediction range may be set according to the type of white line subject to the deterioration determination.
  • Although the control unit 201 of the on-vehicle terminal 200 periodically transmits the reflection intensity data D to the server device 100, a condition may be added so that the on-vehicle terminal 200 (vehicle) transmits only the reflection intensity data D measured in a predetermined area (for example, a measurement area designated by the server apparatus 100). In this way, the server apparatus 100 does not need to receive reflection intensity data D measured in areas where the white line deterioration determination is unnecessary.
  • The control unit 101 of the server apparatus 100 may also refer to the position information stored together with the reflection intensity data D in the storage unit 102 and identify the deteriorated white line based on the reflection intensity data measured in a designated area (for example, an area designated by an operator or the like as requiring white line deterioration determination). By designating an area that needs the deterioration determination in this way, only the white lines in that area are subjected to the determination, and the processing load can be reduced compared with also determining areas that do not need it.
  • Although the lidar 205 is attached to the roof portion of the vehicle or the like and irradiates one infrared laser beam L downward at a constant angle so as to draw a circle around the vehicle, a plurality of infrared laser beams L (five in FIG. 9A) may instead be irradiated at different downward irradiation angles so that the lidar 205 draws circles around the vehicle, as shown in FIG. 9A. As a result, as shown in FIG. 9B, the reflection intensity data D can be measured at one time along the traveling direction of the vehicle.
  • Alternatively, the measurement of the reflection intensity data D by irradiating a single infrared laser beam L may be performed every time the vehicle V moves a predetermined distance; by combining the results, reflection intensity data D along the traveling direction of the vehicle can be obtained, similar to the example shown in FIG. 9B.
  • The control unit 101 of the server device 100 may also, based on the reflection intensity data D measured along the traveling direction of the vehicle V as described above, divide the data into an area where the white line is painted (paint area) as an effective area and a non-painted part (non-paint area) as an invalid area, and perform the deterioration determination based only on the reflection intensity corresponding to the effective area. Specifically, as shown in FIGS., the reflection intensity indicated by the reflection intensity data D measured along the traveling direction of the vehicle V for a broken white line takes a high value in the paint area and a low value in the non-paint area; therefore, a threshold value is set, areas where the reflection intensity is equal to or greater than the threshold are separated as effective areas, and the other areas are classified as invalid areas.
  • In this way, even when the feature ID is set for each predetermined distance, as with a solid white line, rather than for each paint area of a broken white line, deterioration determination on the non-painted areas can be prevented.
  • This method of separating the paint area (effective area) from the non-paint area (invalid area) may be adopted not only for broken white lines but also for the deterioration determination of channelizing strips, characters, pedestrian crossings, and the like, which are composed of paint areas and non-paint areas.
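The effective/invalid separation above can be sketched as a threshold segmentation over intensity samples ordered along the traveling direction; the sample values and the 0.5 threshold are assumptions for illustration:

```python
def split_effective_areas(intensities, threshold):
    # Classify each sample along the traveling direction as paint
    # (effective, >= threshold) or non-paint (invalid), and return the
    # effective samples as contiguous runs.
    runs, current = [], []
    for v in intensities:
        if v >= threshold:
            current.append(v)
        elif current:
            runs.append(current)
            current = []
    if current:
        runs.append(current)
    return runs

# Broken line: high intensity over paint, low over the gaps.
samples = [0.8, 0.9, 0.1, 0.2, 0.85, 0.7, 0.15]
print(split_effective_areas(samples, threshold=0.5))
# → [[0.8, 0.9], [0.85, 0.7]]
```

Only the returned runs (the paint areas) would then feed the deterioration determination; the gaps between them are discarded.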
  • In the embodiment, the white line prediction range is calculated in advance, and the reflection intensity data D included in it is processed as light reflected by the white line. That is, it is guaranteed that the post-extraction reflection intensity data D transmitted by the on-board terminal 200 to the server device 100 is data indicating the reflection intensity at the white line.
  • Alternatively, the on-vehicle terminal 200 may transmit the pre-extraction reflection intensity data D received from the lidar 205 to the server device 100 in association with the measurement position information and the measurement date and time information. In this case, the control unit 101 of the server device 100 specifies the reflection intensity attributable to reflection by the white line, from the distribution of the reflection intensity indicated by the pre-extraction reflection intensity data D and the positional relationship of the white lines that divide the lane in which the vehicle travels, and performs the deterioration determination based on the specified reflection intensity.
  • Alternatively, the position of the white line may be specified by combining the measurement position information corresponding to the reflection intensity data D with information indicating the irradiation angle at which the lidar 205 irradiates the laser light L.
  • The reflectance (the ratio of the post-extraction reflection intensity data D to the irradiation intensity data) may be stored instead of the post-extraction reflection intensity data D and the irradiation intensity data. In this case, the measurement date and time of the post-extraction reflection intensity data D on which the reflectance is based is stored in the acquisition date and time 531.
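Storing the reflectance instead of the two raw values amounts to the following ratio; the function name and guard are illustrative:

```python
def reflectance(post_extraction_intensity, irradiation_intensity):
    # Reflectance = post-extraction reflection intensity / irradiation
    # intensity. Storing the ratio normalizes away differences in laser
    # output between sensors and vehicles.
    if irradiation_intensity <= 0:
        raise ValueError("irradiation intensity must be positive")
    return post_extraction_intensity / irradiation_intensity

print(round(reflectance(0.42, 0.60), 3))  # 0.7
```

A single stored ratio is then comparable across vehicles whose lidars emit at different intensities.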
  • The control unit 101 of the server device 100 may determine, based on the post-extraction reflection intensity data D and the irradiation intensity data, or the reflectance, received from the on-vehicle terminal 200 (hereinafter collectively referred to as "the reflection intensity data D and the like"), the degradation level of the feature (for example, on a 10-level scale) or the degradation type (for example, partial peeling of paint, blurring of paint, or adhesion of dirt).
  • Furthermore, the control unit 101 may receive, from the on-vehicle terminal 200, the photographed image captured by the camera 207 and determine the deterioration level or the deterioration type of the feature based on at least one of the photographed image and the reflection intensity data D. For example, the control unit 101 may perform statistical processing on the reflection intensity data D and the like to make the determination, analyze the captured image to make the determination, or combine both.
  • In this case, the deterioration information 532 of the transmission data 500 may include the photographed image, or a separate area for the photographed image may be provided. The acquisition date and time 531 stores the shooting date and time of the captured image, and the usage sensor type 534 stores information indicating the camera 207.
  • the control unit 101 reflects the determined degradation level and degradation type of the feature on the map data.
  • Next, the data structure of the map data stored in the storage unit 102 (in particular, a data structure of map data that holds deterioration information related to white lines, which are features sometimes referred to as "division lines") will be explained using FIG. 11.
  • The map data 600 includes a feature ID (division line ID) 601, a position 602, a line width 603, a belonging link 604, a type of line 605, a deterioration information acquisition date (hour) 606, deterioration information 607, and a degradation type 608.
  • the map data 600 is used for processing in which the server apparatus 100, the in-vehicle terminal 200, or another information processing apparatus determines, based on the deterioration information 607, whether the feature specified by the position 602 is deteriorated.
  • the feature ID (section line ID) 601 stores an ID for identifying a section line, and is used when the server apparatus 100 or the on-vehicle terminal 200 specifies a feature (section line).
  • The position 602 stores latitude and longitude information indicating the position of the dividing line. Latitude and longitude information indicating the position may also be stored for each of a plurality of points constituting a longitudinal-direction line passing along the center of the dividing line.
  • the line width 603 stores length information in the short direction of the dividing line.
  • the type of line 605 stores information indicating the type of dividing line (such as a broken line or a solid line).
  • The deterioration information acquisition date (hour) 606 stores the date when the deterioration level stored in the deterioration information 607 was acquired (determined).
  • the deterioration information 607 stores the deterioration level of the dividing line.
  • the degradation type 608 stores information indicating the type of degradation.
  • Alternatively, the deterioration information acquisition date (hour) 606 may store the measurement date and time of the reflection intensity data D and the like used to determine the deterioration level, or a date or date and time derived from it. When a large amount of reflection intensity data D and the like measured during a predetermined period is used, a date or date and time representing that predetermined period may be stored. For example, the predetermined period in step S212 may be one day, such as the day before the processing day, in which case the date of the day before the processing day is stored. The predetermined period may also be, for example, from 10 a.m. to 11 a.m. on a particular date, in which case 10 a.m. (or 11 a.m.) on that date may be stored in the deterioration information acquisition date (hour) 606.
  • Although the above has been shown as an example of the data structure of the map data 600, when the target is a white line, information indicating whether a retroreflective material has been applied and whether a puddle is likely to form during rainfall may be added in addition to the above.
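One way to picture an entry of the map data 600 (fields 601 through 608) is the record below; the Python types and sample values are assumptions for illustration, not the actual storage format:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DivisionLineRecord:
    # Field numbers follow the map data 600 described in the text.
    feature_id: int                       # 601: division line ID
    position: Tuple[float, float]         # 602: latitude, longitude
    line_width: float                     # 603: short-direction length
    belonging_link: int                   # 604: link the line belongs to
    line_type: str                        # 605: "solid" or "broken"
    degradation_acquired: Optional[str]   # 606: acquisition date (hour)
    degradation_level: Optional[int]      # 607: e.g. a 10-level scale
    degradation_type: Optional[str]       # 608: e.g. "peeling", "dirt"

rec = DivisionLineRecord(101, (35.6812, 139.7671), 0.15, 2001,
                         "broken", "2018-11-27", 3, "peeling")
print(rec.degradation_level)  # 3
```

The three degradation fields are optional because a newly surveyed dividing line may not yet have any deterioration determination attached.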
  • the deterioration information acquisition date (hour) 606 storing the information, the deterioration information 607 storing the deterioration level, and the deterioration type 608 storing the information indicating the deterioration type may be held in the map data 600 as history information.
  • the history information can be used to predict the progress of the deterioration of the feature.
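As a sketch of how the history information could drive such a prediction, a linear extrapolation over the stored (acquisition date, deterioration level) pairs might look like this (the data shapes and the linear model are assumptions, with a higher level meaning more deterioration):

```python
from datetime import date

def predict_level(history, target_day):
    # Extrapolate the degradation level linearly from the first and last
    # entries of the stored (acquisition_date, level) history.
    (d0, l0), (d1, l1) = history[0], history[-1]
    span = (d1 - d0).days
    if span == 0:
        return l1
    rate = (l1 - l0) / span          # levels gained per day
    return l1 + rate * (target_day - d1).days

history = [(date(2018, 1, 1), 2), (date(2018, 7, 1), 4)]
# +2 levels over ~6 months suggests roughly +4 more over the next year.
print(round(predict_level(history, date(2019, 7, 1))))  # 8
```

Such a prediction could, for example, flag dividing lines expected to cross a repainting threshold within a maintenance planning horizon.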
  • Although the control unit 101 of the server device 100 determines the degradation level or the degradation type of the feature based on the reflection intensity data D and the like or the photographed image received from the in-vehicle terminal 200, the control unit 201 of the in-vehicle terminal 200 may instead determine the degradation level or the degradation type of the feature based on the reflection intensity data D and the like, and transmit the determination result to the server device 100 by including it in the transmission data 500.
  • In the transmission data 500 of the sixth modification, the date and time when the deterioration level was determined is stored in the acquisition date and time 531 (or a date or date and time specifying the measurement timing of the reflection intensity data D and the like used for determining the deterioration level may be stored).
  • The deterioration level determined by the control unit 201 is stored in the deterioration information 532 (in this case as well, it is preferable to include the reflection intensity data D and the like in the transmission data 500). Furthermore, as shown in FIG. 12, a degradation type 535 is provided in the specific information portion 530 of the transmission data 500, and information indicating the degradation type is stored there. The weather information 533 may store information specifying the weather or road surface condition at the time the reflection intensity data D used to determine the deterioration level was measured. When the degradation level or the degradation type of the feature is determined based on both the reflection intensity data D measured by the Lidar 205 and the like and the captured image captured by the camera 207, information indicating both the Lidar 205 and the camera 207 is stored in the use sensor type 534 of the transmission data 500.
  • When sensors other than these are used, the on-vehicle terminal 200 stores information indicating those sensors in the use sensor type 534 of the transmission data 500.
  • The control unit 101 of the server device 100 stores the information stored in the acquisition date and time 531 of the transmission data 500 in the deterioration information acquisition date (hour) 606 of the map data 600, stores the degradation level stored in the degradation information 532 of the transmission data 500 in the degradation information 607 of the map data 600, and stores the information stored in the degradation type 535 of the transmission data 500 in the degradation type 608 of the map data 600. Each time the control unit 101 receives transmission data 500 including the same feature ID (division line ID), the deterioration information acquisition date (hour) 606 storing the information stored in the acquisition date and time 531, the deterioration information 607 storing the deterioration level stored in the deterioration information 532, and the deterioration type 608 storing the information stored in the deterioration type 535 may be held in the map data 600 as history information, in association with the feature ID.
  • the history information can be used to predict the progress of the deterioration of the feature.
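The server-side bookkeeping of the sixth modification — copying fields 531/532/535 of the transmission data 500 into fields 606/607/608 of the map data 600 while appending to a per-feature history — might be sketched as follows (the dictionary keys are hypothetical names, not the actual record layout):

```python
def apply_transmission(map_record, tx):
    # 531 -> 606, 532 -> 607, 535 -> 608, plus a history entry so the
    # progress of deterioration for this feature ID can be tracked.
    map_record["degradation_acquired"] = tx["acquisition_datetime"]
    map_record["degradation_level"] = tx["degradation_info"]
    map_record["degradation_type"] = tx["degradation_type"]
    map_record.setdefault("history", []).append(
        (tx["acquisition_datetime"], tx["degradation_info"],
         tx["degradation_type"]))
    return map_record

rec = {"feature_id": 101}
apply_transmission(rec, {"acquisition_datetime": "2018-11-27T10:00",
                         "degradation_info": 3,
                         "degradation_type": "peeling"})
print(rec["degradation_level"], len(rec["history"]))  # 3 1
```

Repeated calls for the same feature ID keep overwriting the current fields while the history list grows, which matches holding the past determinations as history information.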
  • the seventh modification is a modification of the fifth modification.
  • In the seventh modification, the reflection intensity data D and the like stored in the deterioration information 532 of the transmission data 500 may be stored in the deterioration information 607 of the map data 600. That is, the control unit 101 of the server device 100 stores, without determining the deterioration level and the deterioration type, the reflection intensity data D and the like stored in the deterioration information 532 of the transmission data 500 received from the on-vehicle terminal 200 in the deterioration information 607 of the map data 600, and may store the measurement date and time stored in the acquisition date and time 531 of the transmission data 500 in the deterioration information acquisition date (hour) 606 of the map data 600. Also in this case, each time transmission data 500 including the same feature ID (division line ID) is received, the deterioration information acquisition date (hour) 606 storing the information stored in the acquisition date and time 531 and the deterioration information 607 storing the reflection intensity data D and the like stored in the deterioration information 532 may be held in the map data 600 as history information, in association with the feature ID.
  • the history information can be used to predict the progress of the deterioration of the feature.
  • When the deterioration of the feature recovers without the feature having been restored (repainted), it can be determined that the deterioration type is dirt adhesion.
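The dirt-adhesion inference just described — deterioration that recovers without any restoration work in between — can be sketched like this (the ISO date strings, history shape, and the restoration-record list are illustrative assumptions, with a higher level meaning more deterioration):

```python
def infer_degradation_type(history, restoration_dates):
    # If the level improves between two acquisitions and no restoration
    # (repainting) happened in between, classify it as dirt adhesion:
    # the dirt washed or wore off rather than the paint being renewed.
    for (d0, l0), (d1, l1) in zip(history, history[1:]):
        recovered = l1 < l0
        restored = any(d0 <= r <= d1 for r in restoration_dates)
        if recovered and not restored:
            return "dirt_adhesion"
    return "unknown"

history = [("2018-05-01", 5), ("2018-06-01", 2)]  # level improved
print(infer_degradation_type(history, restoration_dates=[]))  # dirt_adhesion
```

ISO-formatted date strings compare correctly in lexicographic order, which is why plain string comparison suffices for the in-between check here.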


Abstract

Provided is a data structure for map data with which it is possible to determine whether or not a feature at a given position has deteriorated. The map data has a data structure in which position information indicating the position of a feature and deterioration information concerning the deterioration state of the feature are included in association with each other. The map data is used in a process in which an information processing device determines, on the basis of the deterioration information, whether or not the feature specified by the position information has deteriorated.
PCT/JP2018/043571 2017-11-30 2018-11-27 Data structure of map data WO2019107353A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019557235A JPWO2019107353A1 (ja) 2017-11-30 2018-11-27 Data structure of map data
JP2022104284A JP2022121579A (ja) 2017-11-30 2022-06-29 Data structure of map data
JP2023201299A JP2024009306A (ja) 2017-11-30 2023-11-29 Data structure of map data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-229995 2017-11-30
JP2017229995 2017-11-30

Publications (1)

Publication Number Publication Date
WO2019107353A1 true WO2019107353A1 (fr) 2019-06-06

Family

ID=66664035

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/043571 WO2019107353A1 (fr) 2017-11-30 2018-11-27 Data structure of map data

Country Status (2)

Country Link
JP (3) JPWO2019107353A1 (fr)
WO (1) WO2019107353A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150285688A1 (en) * 2014-04-03 2015-10-08 General Electric Company Thermographic route examination system and method
JP2017020303A (ja) * 2015-07-14 2017-01-26 株式会社キクテック Road marking deterioration detection method
WO2017014288A1 (fr) * 2015-07-21 2017-01-26 株式会社東芝 Crack analysis device, crack analysis method, and crack analysis program
WO2017057356A1 (fr) * 2015-09-28 2017-04-06 倉敷紡績株式会社 Structure imaging device, structure inspection device, and structure inspection system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4984152B2 (ja) * 2007-08-31 2012-07-25 アイシン・エィ・ダブリュ株式会社 Image recognition system, server device, and image recognition device
JP2011027594A (ja) * 2009-07-27 2011-02-10 Toyota Infotechnology Center Co Ltd Map data verification system
JP5776546B2 (ja) * 2011-12-28 2015-09-09 富士通株式会社 Road surface survey program and road surface survey device
JP6040096B2 (ja) * 2013-05-20 2016-12-07 国立大学法人 東京大学 Road surface state estimation device
CN107430817B (zh) * 2015-03-03 2021-07-13 日本先锋公司 Route search device, control method, and non-transitory computer-readable medium
JP2016170708A (ja) * 2015-03-13 2016-09-23 株式会社東芝 Road condition determination device and road condition determination method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150285688A1 (en) * 2014-04-03 2015-10-08 General Electric Company Thermographic route examination system and method
JP2017020303A (ja) * 2015-07-14 2017-01-26 株式会社キクテック Road marking deterioration detection method
WO2017014288A1 (fr) * 2015-07-21 2017-01-26 株式会社東芝 Crack analysis device, crack analysis method, and crack analysis program
WO2017057356A1 (fr) * 2015-09-28 2017-04-06 倉敷紡績株式会社 Structure imaging device, structure inspection device, and structure inspection system

Also Published As

Publication number Publication date
JP2022121579A (ja) 2022-08-19
JPWO2019107353A1 (ja) 2020-12-17
JP2024009306A (ja) 2024-01-19

Similar Documents

Publication Publication Date Title
US11821750B2 (en) Map generation system, server, vehicle-side device, method, and non-transitory computer-readable storage medium for autonomously driving vehicle
US11852498B2 (en) Lane marking localization
US20210183099A1 (en) Map system, method and non-transitory computer-readable storage medium for autonomously navigating vehicle
US20210179138A1 (en) Vehicle control device, method and non-transitory computer-readable storage medium for automonously driving vehicle
US10604156B2 (en) System and method for adjusting a road boundary
US11248925B2 (en) Augmented road line detection and display system
US10620317B1 (en) Lidar-based high definition map generation
RU2654502C2 (ru) Система и способ дистанционного наблюдения за транспортными средствами
US9863928B1 (en) Road condition detection system
US10145692B2 (en) Vehicle position determination apparatus and vehicle position determination method
JP2020500290A (ja) Method and system for generating and using localization reference data
JP2022016460A (ja) Deteriorated feature identification device, deteriorated feature identification system, deteriorated feature identification method, deteriorated feature identification program, and computer-readable recording medium recording the deteriorated feature identification program
US11002553B2 (en) Method and device for executing at least one measure for increasing the safety of a vehicle
US20230148097A1 (en) Adverse environment determination device and adverse environment determination method
CN115294544A (zh) 驾驶场景分类方法、装置、设备及存储介质
US11485373B2 (en) Method for a position determination of a vehicle, control unit, and vehicle
CN108986463B (zh) 一种路况信息处理方法、装置及电子设备
EP2945138B1 (fr) Procédé et appareil de fourniture de données d'informations sur des entités le long d'un trajet emprunté par un véhicule
JP2019101605A (ja) Data structure of transmission data
JP6500607B2 (ja) Own-vehicle position determination device and own-vehicle position determination method
US20210061164A1 (en) System and method for illuminating a path for an object by a vehicle
WO2019107353A1 (fr) Structure de données de données cartographiques
DE102022104054A1 (de) Sensor data improving vehicle state estimation for vehicle control and autonomous driving
JP6500606B2 (ja) Own-vehicle position determination device and own-vehicle position determination method
JP2019101138A (ja) Data structure of map data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18883421

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019557235

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18883421

Country of ref document: EP

Kind code of ref document: A1