WO2020045324A1 - Vehicle-side device, method, and storage medium - Google Patents

Vehicle-side device, method, and storage medium

Info

Publication number
WO2020045324A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
map
road
information
tile
Application number
PCT/JP2019/033210
Other languages
English (en)
Japanese (ja)
Inventor
亮太 寺田
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from JP2019143135A (JP7251394B2)
Application filed by 株式会社デンソー
Priority to DE112019004323.4T (DE112019004323T5)
Publication of WO2020045324A1
Priority to US17/185,694 (US11835361B2)

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagrams

Definitions

  • the disclosure of this specification relates to a vehicle-side device, a method, and a storage medium for autonomously traveling using map data.
  • position information of landmarks and the like is recorded using an image captured by a camera mounted on a vehicle, and the information is uploaded to a server or the like to generate a sparse map.
  • a technique for downloading the generated sparse map and determining the position of the host vehicle is also disclosed.
  • an object of the disclosure of this specification is to provide a map system and a method capable of efficiently acquiring map information, and a storage medium for causing a computer to execute them.
  • One of the vehicle-side devices disclosed in this specification autonomously drives a vehicle along a road segment using map data including coordinate information of at least one of a landmark existing along a road and a road edge.
  • it is a vehicle-side device using at least one processor that causes the map data to be distributed from a predetermined server to the vehicle in patch units obtained by dividing the map recording area;
  • the processor acquires the position of the vehicle based on the detection result of a positioning sensor mounted on the vehicle, and downloads the map data in patch units from a server that manages the map data.
  • based on the current position of the vehicle, the processor sets as a download target a patch to which a road on which the vehicle may travel belongs.
  • map tiles to be downloaded can thus be appropriately selected for both the main route from the current position to the destination and branch roads onto which the vehicle may depart from the main route.
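As an illustration of the patch selection described above, the following sketch collects the tiles covering the main route, possible branch roads, and a margin of neighboring tiles around the current position. The tile size, grid indexing, and function names are assumptions made for illustration, not the patent's actual scheme.

```python
# Illustrative sketch of patch (map tile) selection for download.
# Tile size, the grid-index scheme, and all names are hypothetical.

TILE_SIZE = 0.01  # tile edge length in degrees (assumed)

def tile_id(lat, lon):
    """Map a coordinate to the grid id of the tile containing it."""
    return (int(lat // TILE_SIZE), int(lon // TILE_SIZE))

def tiles_to_download(route_points, branch_points, current_pos, radius_tiles=1):
    """Collect tiles covering the main route, possible branch roads,
    and a margin of neighboring tiles around the current position."""
    tiles = {tile_id(lat, lon) for lat, lon in route_points}
    tiles |= {tile_id(lat, lon) for lat, lon in branch_points}
    ci, cj = tile_id(*current_pos)
    for di in range(-radius_tiles, radius_tiles + 1):
        for dj in range(-radius_tiles, radius_tiles + 1):
            tiles.add((ci + di, cj + dj))
    return tiles
```

In practice the branch points would come from the road network around the main route; here they are passed in directly to keep the sketch self-contained.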
  • diagrams illustrating an example of information included in map data and a conceptual diagram showing an example of the structure of map data.
  • a flowchart illustrating probe data upload control by a main processor, and a diagram showing a low-frequency area (or prohibited area).
  • a flowchart illustrating an example of control executed by a server processor, and a diagram showing the dispersion of a landmark.
  • a diagram illustrating a relationship between a reference mark and other landmarks.
  • a flowchart illustrating a correction process using a reference mark.
  • a flowchart illustrating an example of control executed by a server processor.
  • a flowchart illustrating an example of control executed by a main processor.
  • a flowchart illustrating an example of control executed by a main processor.
  • a flowchart illustrating an example of control executed by a main processor.
  • a flowchart illustrating an example of control executed by a main processor.
  • a diagram showing a blind spot portion when the inter-vehicle distance between the host vehicle and a preceding vehicle is short.
  • a diagram showing a blind spot portion when the inter-vehicle distance between the host vehicle and a preceding vehicle is relatively long.
  • a flowchart illustrating an example of control executed by a main processor, a diagram showing the light distribution state in anti-glare light distribution, and a flowchart showing an example of control when a destination is not set.
  • a diagram illustrating an example of map tiles to be downloaded, and a flowchart showing control when a destination is set.
  • a diagram illustrating an example of map tiles to be downloaded, and block diagrams showing modifications of the configuration of the in-vehicle system 2.
  • the map system 1 includes at least one vehicle equipped with an on-vehicle system 2 described later, and a server 3 storing map information (hereinafter also referred to as map data).
  • although FIG. 1 shows only one block representing a vehicle on which the in-vehicle system 2 is mounted, the map system 1 may include a plurality of such vehicles. Each vehicle is configured to be able to wirelessly communicate with the server 3, and travels on roads while sequentially collecting probe data as described later. Further, each vehicle includes an automatic driving system or a driving support system, and performs driving control using map data acquired from the server 3.
  • the vehicle to which the in-vehicle system 2 is applied may be a passenger car, a transport vehicle such as a truck, or a service vehicle such as a taxi.
  • the service vehicle includes a shared bus (in other words, a route bus), a long-distance bus, and a vehicle provided for a sharing service such as car sharing or ride sharing.
  • the shared bus may be an automatic driving bus that automatically runs on a predetermined route.
  • the map system 1 corresponds to a system for autonomously driving a vehicle along a road segment using map data including coordinate information of a plurality of features existing along a road.
  • the expression “along the road” includes not only the side of the road but also the upper part of the road and the road surface.
  • a direction signboard or a beacon station located 3 m or more above a road surface also corresponds to a feature installed along the road.
  • road markings such as marking lines using paint or road studs also correspond to features existing along the road. "Along the road" can be paraphrased as "on and around the road."
  • the above features include the road edge itself.
  • the level of automatic driving using map data is not limited to level 3 or higher, and may be equivalent to level 2.
  • the automation level 2 is one in which the system including the ECU executes the subtasks of the vehicle motion control in both the vertical direction and the horizontal direction in a limited area, for example, automatically performs steering correction for maintaining a lane and speed adjustment.
  • the automation level 3 here refers to one in which the system performs all driving operations within a specific operational design domain (ODD), and, when it is difficult to continue the operation, transfers driving authority to the driver's seat occupant.
  • the automation level 4 is a level at which the monitoring duty imposed on the driver's seat occupant at level 3 is released.
  • the automation level 5 indicates a level at which fully automatic driving on all roads is possible.
  • the in-vehicle system 2 mounted on each vehicle includes a camera 10 as an imaging device, an image processor 20, a sensor 30 as a state acquisition unit for acquiring the state of the vehicle, a main processor 40, a communication module 50, a human machine interface 60 (hereinafter, HMI), an actuator 70, and a memory 80.
  • the in-vehicle system 2 corresponds to a vehicle-side device or a vehicle control device.
  • a vehicle on which the main processor 40 is mounted is also referred to as a host vehicle.
  • the map system 1 functions in addition to a function for specifying the position of the own vehicle, such as GPS, and is effective in specifying the position of the own vehicle with higher accuracy.
  • the map system 1 is roughly divided into two functions: map utilization and map update.
  • in map utilization, the map information stored in the server 3 is downloaded to the vehicle, and the vehicle locates itself using the downloaded map information and the positions of landmarks, such as signs, included in the images captured by the camera 10.
  • in map update, information obtained by the camera 10 or the sensor 30 mounted on the vehicle is uploaded to the server 3 as probe data, and the map information in the server 3 is sequentially updated.
  • the camera 10 is mounted on a vehicle and captures an image of the environment around the vehicle at a wavelength in the visible light region.
  • the camera 10 captures, for example, an environment in front of the vehicle.
  • the camera 10 may be configured to capture an image of at least one of the rear side and the side, not limited to the front of the vehicle.
  • the vehicle may include a plurality of cameras 10.
  • the vehicle may include four cameras 10, a front camera for imaging a predetermined range in front, a rear camera for imaging a predetermined range in the rear, a right camera for imaging right, and a left camera for imaging left.
  • a distant camera for imaging a relatively distant place and a short-distance camera for imaging a short distance may be provided.
  • the camera 10 may be a wide-angle camera having an angle of view exceeding 100°.
  • the wavelength of light captured by the camera 10 is not limited to visible light, but may include ultraviolet and infrared light.
  • the camera 10 may be an infrared camera.
  • the vehicle may include a visible light camera that captures visible light and an infrared camera.
  • the camera 10 is configured as a camera module including, for example, a CMOS image sensor serving as an image sensor (not shown) and an image processing engine (not shown).
  • Information on the environment around the vehicle captured by the camera 10 is stored in the memory 80 in the form of a still image or a moving image (hereinafter, these are collectively referred to as images).
  • An image processor 20 described later executes various processes based on data stored in the memory 80.
  • the image processor 20 may be provided for each camera 10, or one image processor 20 may process the image data of a plurality of cameras 10.
  • the configuration and arrangement of functions of the camera 10 can be changed as appropriate.
  • the image processor 20 analyzes an image captured by the camera 10.
  • the image processor 20 detects a predetermined feature, for example, by analyzing an image.
  • the feature to be detected is, for example, a feature required for vehicle control.
  • the feature to be detected corresponds to an element to be recorded in map data (hereinafter, also referred to as a map element) from another viewpoint.
  • the features detected by the image processor 20 include, for example, road markings (in other words, road surface markings) and landmarks.
  • Road marking refers to paint painted on the road mainly for traffic control and traffic regulation.
  • Road markings include regulation markings and instruction markings.
  • the regulation markings include, for example, lane boundary lines (so-called lane markings or lane marks).
  • Road markings include those realized by road studs such as chatter bars and Botts' dots.
  • the landmarks 63 include signboards, traffic lights, poles, signs, and the like corresponding to traffic signs such as regulation signs, guide signs, warning signs, and instruction signs.
  • the guide signs include direction signs, signs indicating a region name, signs indicating a road name, and notice signs indicating an entrance/exit of a highway, a service area, and the like.
  • the landmarks 63 may include symbolic buildings such as street lamps, mirrors, telephone poles, commercial advertisements, shops, and historic buildings.
  • the poles include streetlights and telephone poles. Further, some road surface markings (for example, a lane mark or a stop line) can be treated as landmarks.
  • Landmarks also include road pavement, undulations, joints, and the like.
  • the image processor 20 separates and extracts the background and the landmark 63 from the captured image based on image information including color, luminance, contrast regarding the color and luminance, and the like. Further, the landmark 63 may be extracted based on the size, shape, and installation position.
  • the image processor 20 also detects, from the image captured by the camera 10 using SfM (Structure from Motion) technology, state quantities indicating the behavior of the vehicle (hereinafter, behavior information), such as the yaw rate, the longitudinal acceleration, the lateral acceleration, and the wiper operation state.
  • the camera 10 corresponds to an example of the surrounding monitoring sensor.
  • the in-vehicle system 2 of the present embodiment includes the camera 10 as a peripheral monitoring sensor, but the peripheral monitoring sensor configuring the map system 1 is not limited to the camera 10.
  • the peripheral monitoring sensor may be a millimeter-wave radar or a LiDAR (Light Detection and Ranging / Laser Imaging and Detection and Ranging).
  • the LiDAR may be a scanning LiDAR or a flash LiDAR. It is preferable that the LiDAR is SPAD LiDAR (Single Photon Avalanche Diode Light Detection And Ranging) from the viewpoint of resolution and the like.
  • various object detection devices such as a sonar can be used as the periphery monitoring sensor.
  • the three-dimensional ranging point group data generated by the LiDAR, the detection result of the millimeter wave radar, the detection result of the sonar, and the like correspond to the peripheral object data.
  • the three-dimensional ranging point group data is also called a distance image.
  • the detection target may be recognized using the distance information for each ranging point / direction and the reception intensity information.
  • Various methods can be used as an object recognition method using LiDAR, millimeter wave radar, sonar, or the like.
  • the map system 1 may include a plurality of types of devices as surrounding monitoring sensors.
  • the map system 1 may include, as a surrounding monitoring sensor, a LiDAR configured to include the front of the vehicle in a detection range in addition to the front camera serving as the camera 10.
  • a technology that uses the detection results of a plurality of types of sensors together (so-called sensor fusion) may be used.
  • by combining, for example, the camera image with the LiDAR detection results, the accuracy of detecting the distance to a landmark can be improved.
  • the landmark recognition rate can also be secured by complementarily using the detection results of the millimeter-wave radar.
  • the camera 10 that captures an image of the front of the vehicle, a millimeter wave radar, a LiDAR, and the like correspond to the forward monitoring device.
  • the sensor 30 serving as the state acquisition unit includes, for example, a speed sensor, an acceleration sensor, a yaw rate sensor (a gyro sensor in a broad sense), a steering angle sensor, an illuminance sensor, and a positioning sensor (for example, a GPS receiver) 30a.
  • the speed sensor acquires the speed of the vehicle.
  • the acceleration sensor acquires the traveling direction of the vehicle and acceleration in a direction orthogonal to the traveling direction.
  • the yaw rate sensor acquires a yaw rate acting on the vehicle.
  • the steering angle sensor acquires a steering angle of the steering.
  • the illuminance sensor acquires the brightness around the vehicle.
  • the GPS receiver as the positioning sensor 30a sequentially acquires and outputs coordinate information (latitude, longitude, altitude) indicating the current position of the vehicle.
  • the GPS receiver may be configured to output data such as the GPS Doppler velocity, the number and elevation angles of the positioning satellites being captured, the pseudorange, the SN ratio of the received satellite signals, and whether or not correction information is used.
  • the GNSS (Global Navigation Satellite System) used by the vehicle may be GLONASS, Beidou, Galileo, IRNSS, or the like.
  • a pavement state or undulation of a road on which the vehicle is traveling, a joint between a bridge and other roads, and the like can be detected by a sensor or the like that detects vibration of the vehicle.
  • These road pavement states, undulations, joints, and the like can also be employed as landmarks 63 for specifying positions on a map.
  • the main processor 40 is communicably connected to the image processor 20 and the sensor 30, and calculates and processes various information input from the image processor 20 and the sensor 30.
  • the main processor 40 generates a traveling trajectory that the vehicle is predicted to travel based on, for example, the speed, acceleration, and yaw rate of the vehicle. That is, the main processor 40 generates a travel plan (so-called path plan) for automatic driving.
  • the path plan includes not only setting the traveling trajectory but also determining the steering control amount at each time point, the target vehicle speed, and the timing of transferring the driving authority to the occupant.
  • as the yaw rate, the longitudinal acceleration, and the lateral acceleration, the values detected by the image processor 20 from the image captured by the camera 10 using the SfM technology may be used.
  • the main processor 40 may be configured to use the output value of the yaw rate sensor as the sensor 30 when the image processor 20 cannot detect the yaw rate.
  • a yaw rate determined from a captured image of a camera has higher accuracy than a yaw rate detected by a yaw rate sensor. Therefore, by using the value detected by the image processor 20 as the yaw rate, the main processor 40 can improve, for example, the accuracy of dead reckoning.
  • the yaw rate based on the image analysis and the yaw rate derived from the sensor 30 may be used in a complementary combination.
  • the main processor 40 generates a travel history indicating a track on which the vehicle has actually traveled based on the history of the own vehicle position specified by dead reckoning or localization described later.
  • the main processor 40 also generates a trajectory (specifically, shape data such as curvature and width) based on lane marks detected based on images acquired by the camera 10. Further, the main processor 40 calculates the position coordinates (hereinafter, also referred to as observation coordinates) of features such as the landmark 63 and the lane mark extracted by the image processor 20 in the global coordinate system.
  • the position coordinates of the feature may be specified by combining the current position of the own vehicle and the relative position information of the feature with respect to the own vehicle.
  • the relative position (distance and direction) of the feature with respect to the host vehicle may be specified based on the size and posture (for example, degree of inclination) of the feature in the image.
  • the main processor 40 roughly estimates the initial coordinates of the vehicle in the global coordinate system by, for example, GPS. Relative coordinates from these initial coordinates are then estimated by integrating the speed vector of the vehicle. As a result, the approximate current position of the vehicle is obtained in the global coordinate system.
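The estimation by integrating the speed vector can be sketched as a simple Euler-integration step. This is a minimal illustration under the assumption of a flat local x/y frame with heading in radians, not the patent's actual implementation.

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance an estimated pose (x, y, heading) by one time step,
    integrating the speed vector and the yaw rate (Euler integration)."""
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Starting from an initial GPS fix, the pose is advanced step by step.
pose = (0.0, 0.0, 0.0)
for _ in range(10):  # 10 steps of 0.1 s at 10 m/s, no turning
    pose = dead_reckon(*pose, speed=10.0, yaw_rate=0.0, dt=0.1)
```

Each step accumulates the integration error of the previous one, which is why the text treats dead reckoning as a fallback to be corrected by localization.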
  • a relative distance and an azimuth of a feature such as a landmark or a lane mark from the vehicle are calculated from an image including SfM (Structure from Motion) information.
  • global coordinates of a position where a feature such as a landmark exists are obtained.
  • the relative distance and direction of the landmark from the vehicle may be calculated using information of a millimeter wave radar or a laser radar (not shown). Note that the coordinate calculation of the feature may be executed by the image processor 20.
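Combining the own-vehicle position with a feature's relative distance and azimuth, as described above, could look like this minimal sketch (flat local x/y frame, angles in radians; the names are hypothetical):

```python
import math

def feature_global_position(veh_x, veh_y, veh_heading, rel_dist, rel_azimuth):
    """Convert a feature's relative distance and azimuth (measured from
    the vehicle's heading) into coordinates in the global frame."""
    angle = veh_heading + rel_azimuth
    return (veh_x + rel_dist * math.cos(angle),
            veh_y + rel_dist * math.sin(angle))
```

A full implementation would work in geodetic coordinates (latitude, longitude, altitude); the flat-frame version only shows the geometric idea.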
  • the feature information and the runway information obtained as a result of calculation, processing, or acquisition by the main processor 40 are temporarily stored in the memory 80.
  • the feature information is information indicating the position coordinates, shape, and size of the feature specified by the image recognition.
  • Each feature in the memory 80 is represented, for example, by a set of coordinate points arranged along the contour of the feature.
  • Various forms can be adopted as the expression form of the shape and position of the feature.
  • the shape and position of a feature may be represented by a polynomial expression.
  • the feature information can be roughly classified into landmark information and lane mark information.
  • the landmark information includes landmark types, coordinate information, colors, sizes, shapes, and the like. As types of landmarks, signboards, signals, signs, poles, pedestrian crossings, road markings (for example, stop lines), manholes, and the like can be adopted. Also, lane marks can be adopted as landmarks.
  • the lane mark information includes, for example, position information of the lane mark and information indicating whether the lane mark is realized by a solid line, a broken line, or a Botts'-dots pattern.
  • the position information of the lane mark is expressed as a group of coordinates (that is, a group of points) of the point where the lane mark is formed. In another embodiment, the position information of the lane mark may be represented by a polynomial expression.
  • the position information of the lane mark may be a set of line segments represented by a polynomial expression (that is, a line group).
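The two representations mentioned above, a point group and a polynomial expression, can be related as in the following sketch, which fits a quadratic to a small hypothetical point group with NumPy (the data values are illustrative, not from the patent):

```python
import numpy as np

# A lane mark stored as a point group: (x along the road, y lateral offset), meters.
points = np.array([[0.0, 0.00], [10.0, 0.05], [20.0, 0.20], [30.0, 0.45]])

# The same lane mark as a polynomial expression y = c2*x^2 + c1*x + c0.
coeffs = np.polyfit(points[:, 0], points[:, 1], deg=2)
lane_y_at = np.poly1d(coeffs)  # evaluate the lateral offset at any x
```

The polynomial form is compact and easy to evaluate at arbitrary positions, while the point group preserves the raw detections; which one a map stores is a design trade-off.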
  • the main processor 40 also executes various processes related to map utilization and map update (or generation). As such processes, the main processor 40 executes, for example, the download of map information, the upload of probe data, and the selection of landmarks used for localization. Some specific examples of these processes will be described in detail later.
  • the communication module 50 is interposed between the main processor 40 and the server 3 so that the main processor 40 and the server 3 described below can communicate with each other.
  • the communication module 50 transmits the probe data input from the main processor 40 to the server 3. Further, the communication module 50 receives the map information stored in the server 3 and related information, and stores the received information in the memory 80.
  • the main processor 40 is configured to be able to execute various controls such as steering control, acceleration, and braking of the vehicle based on map information received via the communication module 50 and stored in the memory 80.
  • the HMI 60 is a user interface for notifying a user of various types of information or transmitting a predetermined operation to the vehicle.
  • as the HMI 60, for example, a display attached to a car navigation device, a display built into an instrument panel, a head-up display projected on the windshield, a microphone, a speaker, and the like can be adopted.
  • a mobile terminal such as a smartphone communicably connected to a vehicle can be the HMI 60 in the map system 1.
  • the user can obtain information displayed on the HMI 60 visually, and can also obtain information by voice, warning sound, or vibration. Further, the user can request a desired operation from the vehicle by a touch operation on the display or a voice.
  • the user when the user intends to receive advanced driving support services such as automatic steering utilizing map information, the user activates the function via the HMI 60.
  • when the map utilization function is activated, the download of the map information is started.
  • the map utilization function may also be activated by giving a command by voice.
  • the upload of information relating to map update may be performed at all times while communication between the vehicle and the server 3 is established, or may be performed only while the map utilization function is enabled by tapping a "map cooperation" button. It may also be activated by another UI reflecting the user's intention.
  • the actuator 70 includes, for example, a braking device (so-called brake actuator), an electronic throttle, a steering actuator, and the like.
  • the actuator 70 is a hardware element related to at least one of acceleration, deceleration, and steering of the vehicle.
  • the memory 80 is realized using a volatile memory such as a RAM.
  • the memory 80 may be realized using a nonvolatile memory such as a flash memory.
  • the memory 80 may include both a volatile memory and a nonvolatile memory.
  • the memory 80 includes a temporary storage unit 81 using a volatile memory and a storage unit 82 using a non-volatile memory.
  • the storage unit 82 stores a program (hereinafter, a vehicle program) for causing the main processor 40 to execute processing such as generation of probe data.
  • the vehicle program only needs to be stored in a non-transitory tangible storage medium.
  • the main processor 40 collates the coordinates of landmarks calculated based on the images captured in real time with the coordinates of landmarks included in the map information downloaded from the server 3, thereby specifying the detailed position of the host vehicle (that is, performing localization).
  • the main processor 40 performs localization in the vertical direction using landmarks such as a direction sign, a traffic light, a road sign, and a stop line.
  • the vertical direction corresponds to the front-back direction of the vehicle.
  • the vertical direction corresponds to a direction in which the road extends when viewed from the own vehicle (hereinafter, also referred to as a road extension direction).
  • the vertical localization corresponds to a process of specifying the position of the vehicle in the direction in which the road extends.
  • for example, when the distance to a direction signboard present in front of the own vehicle is specified as 100 m, detailed remaining distances to characteristic points on the road, such as intersections, curve entrances/exits, tunnel entrances/exits, and the end of a traffic jam, can be specified.
  • the main processor 40 performs localization in the horizontal direction using landmarks such as lane marks, road edges, and guardrails.
  • Lateral localization refers to specifying a traveling lane and specifying a detailed position of the host vehicle in the traveling lane (the amount of offset from the center of the lane in the left-right direction).
  • the lateral direction refers to a vehicle width direction or a road width direction.
  • the horizontal localization is realized, for example, based on the distances from the left/right road edges or division lines recognized by the image processor 20.
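A minimal sketch of computing the lateral offset from the recognized distances to the left and right division lines (the sign convention and names are assumptions, not specified in the text):

```python
def lateral_offset(dist_to_left_line, dist_to_right_line):
    """Offset of the vehicle from the lane center, in meters.
    Positive values mean the vehicle is right of the lane center."""
    lane_width = dist_to_left_line + dist_to_right_line
    return dist_to_left_line - lane_width / 2.0
```

With equal distances to both lines the offset is zero; a larger distance to the left line means the vehicle has drifted right of center.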
  • the position of the host vehicle as a result of the localization may be expressed in the same coordinate system (for example, latitude, longitude, and altitude) as the map data.
  • the vehicle position information can be expressed in an arbitrary absolute coordinate system such as WGS84 (World Geodetic System 1984). Further, the vehicle position information may be represented by a local coordinate system indicating a position in a map tile described later.
  • the main processor 40 may be configured to perform localization in the vertical and horizontal directions using one landmark.
  • localization may be performed using a landmark closest to the own vehicle.
  • when the main processor 40 detects a plurality of landmarks (for example, direction signboards) ahead of the own vehicle, it performs localization in the vertical direction using the landmark closest to the own vehicle among them.
  • the recognition accuracy of the type of and distance to an object based on an image or the like is higher for objects closer to the vehicle. That is, when a plurality of landmarks are detected, the configuration in which localization is performed using the landmark closest to the vehicle can improve the position estimation accuracy.
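Selecting the landmark closest to the vehicle, as described, could be sketched as follows (the landmark record layout and coordinate frame are assumptions for illustration):

```python
import math

def nearest_landmark(vehicle_pos, landmarks):
    """Return the detected landmark closest to the vehicle; nearby
    objects are recognized with higher type and distance accuracy."""
    vx, vy = vehicle_pos
    return min(landmarks, key=lambda lm: math.hypot(lm["x"] - vx, lm["y"] - vy))
```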
  • the main processor 40 sequentially performs localization at predetermined position calculation intervals.
  • the position calculation interval is, for example, 100 milliseconds.
  • the position calculation interval may be 200 ms or 400 ms.
  • the position calculation interval may be dynamically changed according to the type of the road on which the vehicle is traveling (hereinafter, the traveling road), the vehicle speed, and the external environment. For example, when the vehicle is traveling on a road section where the remaining distance to a curve or an intersection is within 0.5 km, a value (for example, 100 milliseconds) shorter than a predetermined standard interval (for example, 200 milliseconds) may be set.
  • the accuracy of the vehicle position information can be improved by setting the position calculation intervals densely, which enables relatively high-precision vehicle control (for example, steering control).
  • conversely, by setting the intervals coarsely, the processing load on the main processor 40 can be reduced.
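The dynamic interval selection described above might be sketched like this; the threshold and interval values follow the examples in the text, while the function name and parameters are hypothetical:

```python
def position_calc_interval_ms(remaining_km_to_feature,
                              standard_ms=200, dense_ms=100,
                              threshold_km=0.5):
    """Choose the localization interval: shorter near a curve or an
    intersection (fresher positions for steering control), longer
    otherwise (lower processing load on the main processor)."""
    if remaining_km_to_feature <= threshold_km:
        return dense_ms
    return standard_ms
```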
  • the main processor 40 sequentially performs localization when the map utilization function has been activated by the user and detailed map data near the current position has been obtained. Whether or not the main processor 40 performs the localization may be changed according to the type of the traveling path.
  • the configuration may be such that localization is performed when the traveling road is a motor-vehicle-only road, while localization is not performed when the traveling road is a general road.
  • the motor-vehicle-only road is a road that pedestrians are basically prohibited from entering, and includes, for example, toll roads such as expressways.
  • motor-vehicle-only roads also include general roads on which traffic other than automobiles is prohibited.
  • the execution / non-execution of the localization by the main processor 40 may be determined by the main processor 40 or controlled by the server 3 based on the maintenance state of the map data, the type of the traveling road, and the like.
  • the main processor 40 performs dead reckoning (autonomous navigation) using the yaw rate and the vehicle speed when localization cannot be performed (for example, when no landmark has been detected) or when the map utilization function has not been activated.
  • the yaw rate used for dead reckoning may be one that is recognized by the image processor 20 using SfM technology or one that is detected by a yaw rate sensor.
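As an illustration only, a single dead-reckoning step from the vehicle speed and the yaw rate might look like the following textbook planar model; it is a sketch, not the disclosed implementation.

```python
import math

def dead_reckon_step(x_m, y_m, heading_rad, speed_mps, yaw_rate_rps, dt_s):
    """Advance the estimated pose by one time step using only the
    vehicle speed and the yaw rate (from a yaw rate sensor or SfM)."""
    heading_rad += yaw_rate_rps * dt_s
    x_m += speed_mps * math.cos(heading_rad) * dt_s
    y_m += speed_mps * math.sin(heading_rad) * dt_s
    return x_m, y_m, heading_rad
```

Integrating such steps lets the position estimate continue between landmark-based localizations, at the cost of slowly accumulating drift.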
  • the main processor 40 outputs corresponding commands to the actuators 70 that operate the hardware mounted on the vehicle, based on the current position of the own vehicle specified by the localization or dead reckoning and on the map data. Thereby, automatic driving and driving support are realized.
  • the main processor 40 also controls lighting of lamps such as direction indicators, hazard lamps, and headlights in accordance with a travel plan generated by the main processor 40 or another ECU (for example, an automatic driving ECU).
  • the main processor 40 obtains, from the map data, POI information on a tollgate or the like located a certain distance (for example, 200 m) ahead of the current position, so that vehicle control such as lane change and deceleration can be executed with a margin (in other words, more safely).
  • the POI refers to a point that needs attention in terms of vehicle control and path planning.
  • the POIs include map elements that affect the running control of the vehicle, such as a curve entrance / exit, a tunnel entrance / exit, and the beginning and end of a traffic jam.
  • the POI includes a static POI corresponding to the static map information and a dynamic POI corresponding to the dynamic map information.
  • the dynamic POI indicates a tail position or a head position of a traffic jam.
  • ACC is an abbreviation of Adaptive Cruise Control, and refers to a function of automatically driving the vehicle such that the distance to a preceding vehicle is kept constant within a range where the traveling speed of the vehicle does not exceed a predetermined upper limit.
  • applications related to automatic control of the vehicle include an application that drives the vehicle so as to maintain the center of the lane (hereinafter, a lane keeping application) and an application that automatically executes operations related to a lane change (hereinafter, a lane change application).
  • the camera 10 may not be able to recognize the shape of the front lane.
  • for ACC, it is necessary to grasp the curvature of the road ahead of the vehicle in advance and adjust the speed. For example, control may be performed to decelerate to a predetermined target speed before a curve so that the vehicle can travel smoothly and safely through the curve.
  • since the curvature of the road ahead is acquired in advance using the map data, deceleration control can be started without waiting until the vehicle enters the curved section, even when the camera 10 cannot detect the curvature of the road ahead.
  • when the ACC function is on, the vehicle may be traveling at a speed lower than the predetermined target speed in accordance with the speed of the preceding vehicle.
  • when the preceding vehicle becomes absent due to a lane change or the like (in other words, when the preceding vehicle leaves the front of the own vehicle), the ACC function usually operates to accelerate to the predetermined target speed.
  • with the map utilization function, it can be determined based on the map data whether the current position is in a road section where acceleration is preferable.
  • when acceleration is not preferable, the acceleration to the target speed can be canceled. That is, by using the map data, the risk of performing unnecessary acceleration can be reduced.
  • the section in which acceleration to the ACC set speed is not preferable refers to the vicinity of a tollgate, the exit of an expressway, the vicinity of an intersection, a sharp curve, or the like.
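The acceleration-suppression decision described above could be sketched as below. The POI labels are hypothetical stand-ins for the section types named in the text, and the 200 m look-ahead follows the earlier example.

```python
def may_resume_acceleration(pois_ahead, horizon_m=200):
    """Return False when a POI at which acceleration to the ACC set
    speed is undesirable lies within the look-ahead horizon.
    `pois_ahead` is a list of (kind, distance_m) pairs."""
    no_accel_kinds = {"tollgate", "expressway_exit", "intersection", "sharp_curve"}
    return not any(kind in no_accel_kinds and dist_m <= horizon_m
                   for kind, dist_m in pois_ahead)
```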
  • map data is also useful when the driver in the driver's seat (the so-called driver) has driving authority.
  • POI information such as traffic congestion at a certain distance from the current position may be reported to the driver as driving operation support information.
  • the main processor 40 transmits a data set including the traveling trajectory information, the traveling path information, and the feature information stored in the memory 80 to the server 3 as probe data.
  • the traveling trajectory information is information indicating a trajectory on which the vehicle has traveled.
  • the traveling trajectory information is represented as a sequence of points of the own vehicle position.
  • the travel path information is information indicating the edges of the travel path and the trajectory of the center line. The edges of the travel path and the like may also be represented by a group of coordinate points.
  • the travel path information directly or indirectly indicates the road shape, such as the curvature and width of the road.
  • the main processor 40 sequentially obtains feature information, road information, and own vehicle position coordinates (hereinafter, recognition results) obtained by image recognition and the like, and stores them in the memory 80 in chronological order in association with the acquisition time (in other words, the observation time). Recognition results such as feature information are sequentially provided (for example, every 100 milliseconds) from the image processor 20. The feature information may be sequentially generated by the main processor 40 in cooperation with the image processor 20.
  • the recognition result data at each time stored in the memory 80 is uploaded collectively at a predetermined upload interval.
  • the upload interval is set to, for example, K times (K is a natural number) the execution cycle of the image recognition process. If K ≥ 2, the main processor 40 uploads, as probe data, data obtained by packaging the recognition results within a certain time stored in the memory 80.
  • K = 4 is set as an example. That is, the main processor 40 uploads, as probe data, data obtained by packaging the recognition results within 400 milliseconds. Note that the data including the vehicle positions at a plurality of time points corresponds to the above-described traveling trajectory information.
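The packaging of recognition results every K cycles might be sketched as follows; the class and method names are illustrative only.

```python
class ProbeBuffer:
    """Accumulate per-cycle recognition results and emit them as one
    packaged probe data set every K recognition cycles."""

    def __init__(self, k=4):          # K = 4 -> 400 ms at a 100 ms cycle
        self.k = k
        self._results = []

    def push(self, recognition_result):
        """Store one timestamped recognition result; return the package
        to upload when K results have accumulated, else None."""
        self._results.append(recognition_result)
        if len(self._results) >= self.k:
            package, self._results = self._results, []
            return package
        return None
```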
  • the probe data sequentially transmitted by the vehicle is stored in the server 3 in a predetermined storage / management format.
  • the server 3 stores the probe data sequentially transmitted by the same vehicle by linking it into units each including a predetermined number of landmarks.
  • the length of the probe data may be in units of road segments.
  • the road segment is a road management unit in the map data.
  • a road segment is a road segmented according to a predetermined rule.
  • the road segment may correspond to, for example, a road link.
  • the road link refers to a road section connecting road nodes indicating intersections or road end points.
  • the road segments may be further subdivided road links.
  • the road segment may be a road segmented by a predetermined length (for example, every 10 m).
  • each vehicle uploads data representing the running trajectory or road edge of the vehicle in a coordinate point sequence. Edges, lane center lines, and the like may be represented by polynomials.
  • the main processor 40 may upload the positioning result by GPS, the SfM information, and the vehicle speed information, and the server 3 may calculate the vehicle position at each time based on that information.
  • the feature information and the traveling trajectory information included in the probe data correspond to information for the server 3 to generate a static map (hereinafter, static information).
  • the probe data may also include dynamic information (hereinafter, vehicle behavior information) indicating the behavior of the vehicle within the latest fixed time, such as the vehicle speed, steering angle, yaw rate, turn signal operation information, lane ID, and relative position with respect to the lane.
  • vehicle behavior information includes wiper operation information, shift position, vehicle body direction, pitch angle and roll angle of the vehicle body, inter-vehicle distance to a preceding vehicle, and the like.
  • the relative position information with respect to the lane indicates the amount of offset to the left and right with respect to the lane center line, whether the vehicle crosses the lane, and the like.
  • the server 3 acquires the POI information corresponding to the quasi-dynamic map information, such as a congested section or a point where there is an obstacle such as a falling object or a parked vehicle on the road, by acquiring the vehicle behavior information.
  • the server 3 adopts the end of the vehicle group whose vehicle speed is equal to or lower than the predetermined threshold as the end of the congestion section, and sets a point corresponding to the end as the dynamic POI related to the congestion.
  • the head position of a group of vehicles whose vehicle speed is equal to or lower than a predetermined threshold is adopted as the head position of the congestion section, and the head position is set as the dynamic POI.
  • the server 3 determines a point where a certain number (for example, 10) or more of vehicles temporarily travel across lanes or change lanes as a point where there is an obstacle such as a falling object or a parked vehicle on the road (hereinafter, an obstacle present point). Then, the obstacle present point is set as a dynamic POI.
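The derivation of congestion-related dynamic POIs from the collected vehicle behavior could be illustrated as follows, in a simplified one-dimensional model. The 2 m/s speed threshold is an assumption for illustration; the text leaves the threshold unspecified.

```python
def congestion_pois(samples, speed_threshold_mps=2.0):
    """Given (position_m, speed_mps) samples along one road (position
    increasing in the travel direction), return the tail and head of the
    slow vehicle group as dynamic POIs, or None when there is no jam."""
    slow_positions = sorted(pos for pos, spd in samples
                            if spd <= speed_threshold_mps)
    if not slow_positions:
        return None
    return {"jam_tail_m": slow_positions[0],   # upstream end of the group
            "jam_head_m": slow_positions[-1]}  # downstream end
```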
  • the map data stored in the server 3 includes, as schematically shown in FIG. 2, for example, road segments 62 representing the shape of roads by cubic spline curves, and landmarks 63 existing around the road segments 62.
  • the road segment 62 and the landmark 63 have values of latitude, longitude and altitude, respectively.
  • the landmarks 63 include, for example, traffic signs, and landmarks whose positions have already been determined are integrated into the map in addition to information obtained in real time by the camera 10 and the various sensors 30 serving as a state acquisition unit.
  • the map information is sequentially updated based on information obtained in real time.
  • FIG. 3 conceptually shows an example of the structure of map data.
  • the map data includes road network data, lane network data, feature data, and POI data, as shown in FIG. Each data is hierarchically configured.
  • the road network data includes a link ID, a link length, the number of lanes, and connection node information (for example, a node ID) for each road link, a node ID, a position coordinate, and connection link information (for example, a link ID) for each road node.
  • the lane network data includes a lane ID, a link ID at the lane level, a link length, and connection node information, a node ID for each lane node, position coordinates, and connection link information (for example, a link ID).
  • the link information at the lane level included in the lane network data is associated with the road link included in the road network data.
  • the feature data includes lane marking data and landmark data.
  • the lane marking data includes a lane marking ID for each lane marking, and a group of coordinate points representing the installation portion.
  • the lane marking data includes pattern information such as dashed lines, solid lines, and road studs.
  • the lane marking data is associated with lane information (for example, a lane ID or a link ID at a lane level).
  • the landmark data indicates the position and type of each landmark.
  • the shape and position of each feature are represented by a set of coordinate points.
  • the POI data is data indicating the position and type of features that affect the travel plan of the vehicle, such as junctions for exiting the main road of an expressway, merging points, speed limit change points, lane change points, traffic congestion sections, construction sections, intersections, tunnels, and tollgates.
  • the POI data includes type and position information.
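For illustration only, the layered structure of FIG. 3 might be modeled with data classes as below; the class and field names are assumptions, not the actual data format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RoadLink:
    link_id: int
    length_m: float
    lane_count: int
    node_ids: Tuple[int, int]          # connection node information

@dataclass
class LaneMarking:
    marking_id: int
    pattern: str                       # "dashed", "solid", "road_stud"
    points: List[Tuple[float, float]]  # coordinate point group
    lane_id: int                       # association with the lane network

@dataclass
class Poi:
    kind: str                          # e.g. "tollgate", "junction"
    position: Tuple[float, float]

@dataclass
class MapTile:
    tile_id: int
    road_links: List[RoadLink] = field(default_factory=list)
    markings: List[LaneMarking] = field(default_factory=list)
    pois: List[Poi] = field(default_factory=list)
```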
  • the map data may include a traveling trajectory model.
  • the traveling trajectory model is trajectory data generated by statistically integrating traveling trajectories of a plurality of vehicles.
  • the traveling trajectory model is, for example, an average of traveling trajectories for each lane.
  • the traveling trajectory model corresponds to data indicating a traveling trajectory that is a reference during automatic driving.
  • Map data can include static map information and dynamic map information.
  • the static map information is information on features that are unlikely to change within, for example, one week to one month, such as the road network, road shapes, road surface markings, structures such as guardrails, and buildings.
  • Static map information is also called a base map.
  • the dynamic map information refers to information on map elements whose state changes in a relatively short period of time, such as road construction information and traffic regulation information.
  • the dynamic map information includes quasi-static information, quasi-dynamic information, and dynamic information (hereinafter, super dynamic information).
  • the quasi-static information is information that needs to be updated within one hour to several hours, for example.
  • Road construction information, traffic regulation information, congestion information, and wide area weather information correspond to the quasi-static information.
  • quasi-dynamic information is information that is required to be updated in units of, for example, 10 minutes.
  • the last position of traffic jam, accident information, narrow-range weather information, falling objects on the road, and the like correspond to the quasi-dynamic information.
  • the super dynamic information includes, for example, position information of a moving body such as a vehicle or a pedestrian, and ITS (Intelligent Transport Systems) information such as a lighting state of a traffic light.
  • the map data handled by the map system 1 includes static map information, quasi-static map information, and quasi-dynamic information.
  • the map information handled by the map system 1 may be only static map information. It may also include super dynamic information.
  • the static map information and the dynamic map information may be configured to be separately managed (updated and distributed).
  • the server 3 stores the map information and updates the map information by the server processor 31 attached to the server 3. All map data corresponding to all map recording regions is managed by being divided into a plurality of patches. Each patch corresponds to map data of a different area. Note that each patch may partially overlap an adjacent patch and a corresponding area.
  • the map data is stored in units of map tiles obtained by dividing the map recording area into a rectangular shape of 2 km square.
  • the real space range (rectangular divided area) to be recorded by the map tile is also simply referred to as a map tile.
  • the server 3 stores a plurality of map tiles together with corresponding latitude, longitude, and altitude information.
  • Each map tile has a unique ID (hereinafter referred to as a tile ID).
  • the map tile corresponds to the subordinate concept of the patch described above.
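As one possible illustration of tile management, mapping a position to the index of its 2 km-square tile could look like this; the use of planar metre coordinates and the (row, col) indexing are assumptions, since the text only states that tiles carry unique IDs with latitude, longitude, and altitude information.

```python
def tile_index(x_m, y_m, tile_size_m=2000):
    """Return the (row, col) index of the square map tile containing a
    planar position given in metres (2 km square by default)."""
    return (int(y_m // tile_size_m), int(x_m // tile_size_m))
```

A vehicle can use such an index to request from the server 3 the tile for its current position and the adjacent tiles along its route.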
  • the size of the map tile can be changed as appropriate, and is not limited to a rectangular shape of 2 km square. It may have a rectangular shape of 1 km square or 4 km square.
  • the map tile may be hexagonal or circular. Each map tile may be set so as to partially overlap an adjacent map tile.
  • the size of the map tile may be different for each layer. For example, the tile size of the static map data may be set to 2 km square, while the tile size of the dynamic map data (particularly, the map data relating to the quasi-dynamic information) may be set to 1 km square.
  • the tile size may be different depending on the type of road such as an expressway and a general road. It is preferable that the tile size of the general road is set smaller than the tile size of the expressway.
  • general roads may also be distinguished between main roads and narrow streets. In that case, it is preferable that the tile size of the narrow street is set smaller than that of the main road.
  • the map recording area may be the whole country where the vehicle is used, or may be only a part of the area. For example, the map recording area may be only an area where automatic driving of a general vehicle is permitted or an area where an automatic driving movement service is provided.
  • the server 3 manages (generates, updates, and distributes) map data in units of map tiles obtained by dividing a map recording area into a rectangular shape having a uniform size, but is not limited thereto.
  • the size and shape of the map tiles may be non-uniform. That is, the size and shape of the patch corresponding area, which is the range of the real world corresponding to each patch (in other words, the range to be recorded), may be uniform or may vary. For example, a map tile in a rural area, where the density of map elements such as landmarks is likely to be relatively sparse, may be set larger than a map tile in an urban area, where map elements such as landmarks are more likely to exist densely.
  • the map tiles in the rural area may have a rectangular shape of 4 km square, while the map tiles in the urban area may have a rectangular shape of 1 km or 0.5 km square.
  • the urban area here refers to, for example, an area where the population density is equal to or higher than a predetermined value, or an area where offices and commercial facilities are concentrated.
  • rural areas can be areas other than urban areas. Note that the area classification mode is not limited to the two stages of the urban area and the rural area.
  • the map recording area may be divided into four stages of an urban area, a rural area, a rural area, and a depopulated area in descending order of population density.
  • the degree of urban area in the area may be determined by combining a plurality of types of indices.
  • the division mode of all map data may be defined by the data size.
  • the map recording area may be divided and managed in a range defined by the data size.
  • each patch is set so that the data amount is less than a predetermined value.
  • the data size in one delivery can be set to a certain value or less. In that case, the real space range corresponding to a patch in an urban area becomes smaller than the real space range corresponding to a patch in a rural area. As described above, map elements such as landmarks and lane marks are expected to be denser in urban areas than in rural areas.
  • the vehicle travels based on the downloaded map data while sequentially obtaining information on the map tile to which the passing road belongs from the server 3.
  • Various rules can be applied to the handling of the map data downloaded to the in-vehicle system 2 in consideration of the capacity of the memory 80 and the like.
  • the main processor 40 may be configured to delete the map data of the map tile from which the host vehicle has already left, as soon as the host vehicle leaves or at a timing separated by a predetermined distance or more.
  • the in-vehicle system 2 can be realized using the memory 80 having a small capacity. That is, the introduction cost of the vehicle-mounted system 2 can be reduced.
  • the map data downloaded to the memory 80 may be deleted at a timing when a predetermined time (for example, one day) has elapsed from the time of download.
  • Map data about roads that are used daily such as commuting roads and school roads, may be configured to be cached in the memory 80 as much as possible (for example, as long as the free space does not fall below a predetermined value).
  • the storage period of the downloaded map data may be changed according to the attribute of the data. For example, a fixed amount of static map data is stored in the storage unit 82.
  • dynamic map data such as construction information may not be stored in the storage unit 82, but may instead be deleted from the temporary storage unit 81 at the timing when the vehicle passes through the area corresponding to the dynamic map data.
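The deletion rules above might be combined as in the following sketch. The cache-entry format and the rule priorities (daily-use roads kept first, then age, then whether the vehicle has left the tile) are assumptions for illustration.

```python
DAY_S = 24 * 3600

def should_evict(entry, now_s, daily_use_tiles, max_age_s=DAY_S):
    """Decide whether a cached map tile may be deleted: tiles on
    daily-use roads (commuting/school routes) are kept, tiles older than
    one day are dropped, and tiles the vehicle has left are dropped."""
    if entry["tile"] in daily_use_tiles:
        return False
    if now_s - entry["downloaded_at_s"] > max_age_s:
        return True
    return entry.get("vehicle_has_left", False)
```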
  • the running power source here is a power source for running the vehicle, and refers to an ignition power source when the vehicle is a gasoline-powered vehicle.
  • when the vehicle is an electric vehicle or a hybrid car, it refers to a system main relay.
  • the map system 1 uploads information on a map collected by a vehicle to a server 3 included in the map system 1 so that the map information stored in the server 3 can be updated.
  • Uploading is usually performed at a predetermined frequency.
  • the normal upload interval is set to 400 millisecond intervals.
  • the upload interval may be 200 milliseconds, 500 milliseconds, 1 second, 2 seconds, or the like.
  • the main processor 40 may have an operation mode for stopping the upload of the probe data, reducing the frequency of the upload, or narrowing down the type of information to be uploaded.
  • FIG. 4 shows an example of a processing flow executed by the main processor 40. Note that the normal mode corresponds to the first mode, and the low frequency mode corresponds to the second mode.
  • Step S100 is a step in which the main processor 40 determines a rough position of the own vehicle based on the information of the positioning result by GPS.
  • Step S101 is a step in which the main processor 40 downloads the map information corresponding to the approximate position of the host vehicle from the server 3. The determination and acquisition of the map information corresponding to the approximate position of the vehicle will be described later.
  • Step S102 is a step in which the main processor 40 determines a detailed position of the host vehicle.
  • the detailed position of the vehicle is global coordinates including latitude, longitude and altitude on the earth.
  • the main processor 40 determines detailed global coordinates of the host vehicle based on the map information downloaded from the server 3 together with rough position information using, for example, GPS.
  • Step S103 is a step of determining whether or not the situation where the host vehicle is placed satisfies a predetermined low frequency condition. Specifically, this is a step in which the main processor 40 determines whether or not the position of the own vehicle is present in a predetermined low-frequency area. That is, the low frequency condition in the example shown in FIG. 4 is whether or not the position of the own vehicle exists in a predetermined low frequency area determined in advance.
  • the low frequency area is set in advance on the map as shown in FIG.
  • the low-frequency area may be set as a line along the road segment 62, or may be set as a plane having a predetermined area as illustrated in FIG.
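The low-frequency-area test of step S103 could be sketched as follows, with each area modeled as a latitude/longitude bounding box for illustration; as noted above, areas may in practice also be lines along road segments.

```python
def in_low_frequency_area(lat, lon, areas):
    """Return True when the own-vehicle position lies inside any preset
    low-frequency area, each given as a (south, west, north, east)
    bounding box in degrees."""
    return any(south <= lat <= north and west <= lon <= east
               for south, west, north, east in areas)
```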
  • Step S105 is a step of setting the normal frequency mode, in which the upload frequency is the normal frequency.
  • Step S104 is a step of setting the low frequency mode.
  • the probe data including map information is uploaded to the server 3 at a predetermined frequency according to the frequency mode set in step S104 or step S105. Thereafter, this flow ends.
  • when the low frequency mode is enabled, the amount of communication data related to map information between the communication module 50 and the server 3 decreases compared with the normal frequency mode. Therefore, the load on the line can be reduced, and the cost related to communication can be reduced.
  • the low-frequency area is, for example, an area in an environment where a large number of other vehicles exist around the own vehicle, such as an arterial road in an urban area, and where a sufficient amount of map information can be uploaded to the server 3 from those other vehicles. In such an area, since a large amount of map information is uploaded from other vehicles, a sufficient amount of information for updating the map information can be secured even if the upload frequency of the own vehicle is reduced.
  • Another example is an area where the frequency of changing the landmark 63 such as a traffic sign or a road sign is relatively low, such as a motorway.
  • the landmark 63 itself, as hardware on or near the road, is expected to be updated infrequently. Therefore, even if the upload frequency of the own vehicle is reduced, a sufficient amount of information for updating the map information can be secured.
  • the upload frequency can be set to zero in the low frequency mode.
  • the state in which the upload frequency is zero is a state in which uploading of map information to the server 3 is substantially prohibited, and is particularly referred to as a prohibited mode. That is, the low frequency mode includes the prohibition mode.
  • the prohibited area, which is a low-frequency area in which the prohibited mode is enabled, is, for example, an area in a highly confidential facility such as a military facility or a company site, or in a facility such as a safari park where a vehicle can travel but which is not suitable for automatic steering. In such a prohibited area, uploading of map information from the vehicle to the server 3 is prohibited, and the server 3 does not generate a map. Therefore, download for vehicle control is not performed either.
  • the mode may shift from the normal frequency mode to the low frequency mode at night. It tends to be more difficult for the camera 10 to recognize traffic signs and road markings at night than in daytime, and the reliability of determining the position of the landmark 63 is lower than in daytime. Therefore, it may be preferable to reduce the frequency of uploading the position information of the landmark 63 to the server 3.
  • a time zone in which the vehicle is placed at night or in an environment of low illuminance corresponding thereto is set in advance, and the upload frequency is preferably set to the low frequency mode in that time zone. In a region where a season exists, the time zone defined as night varies depending on the season. Therefore, it is preferable that the time zone defined as night is variable in accordance with the season. For example, in an area with white nights, the time zone at night is relatively short, and there is little opportunity for the surrounding environment of the vehicle to have extremely low illuminance. In such an area, the time during which the low frequency mode is effective is also reduced.
  • the mode may be shifted from the normal frequency mode to the low frequency mode based on weather conditions in a region where the vehicle travels. For example, in bad weather such as heavy rain, heavy snow, dense fog, and sand storm, it tends to be difficult to recognize traffic signs and road signs, and the reliability of determining the position of the landmark 63 is lower than that in fine weather. Therefore, it may be preferable to reduce the frequency of uploading the position information of the landmark 63 to the server 3.
  • as a method of determining what weather condition the vehicle is under, for example, measuring the reflectance of the road surface using an image captured by the camera 10, or determining the weather based on the contrast of the image, can be implemented.
  • an area that satisfies predetermined weather conditions such as heavy rain, heavy snow, heavy fog, and sand storm may be designated as a low-frequency area in real time based on information disclosed by a public organization.
  • the setting itself of the low-frequency area may be dynamically changed in accordance with the weather conditions, the time zone, and the collection state of the probe data in the server 3 (in other words, the degree of collection).
  • an area that is not a low-frequency area (hereinafter, a normal area) may be changed to a low-frequency area, and conversely, a low-frequency area may be changed to a normal area.
  • the setting change of the area may be performed by the server 3 or may be performed by the vehicle.
  • the frequency of uploading may be gradually reduced based on the years of use of the main processor 40 and the image processor 20 (in other words, the total operating time).
  • the performance of the main processor 40 and the image processor 20 is evolving day by day, and it is presumed that a newer processor takes less time for image processing and uploading and can perform them with higher accuracy. Therefore, it is preferable to reduce the frequency of uploading map information as the years of use of the processor become longer. Conversely, by having a processor with a short period of use actively upload map information, map information can be collected efficiently.
  • the main processor 40 includes, as operation modes, an all transmission mode and a suppression mode.
  • the all transmission mode corresponds to an operation mode in which a data set including information on all items specified in advance to be transmitted as probe data is uploaded as probe data.
  • the suppression mode is an operation mode in which only a part of all items set as upload targets in the all transmission mode is uploaded as probe data.
  • the all transmission mode corresponds to a first mode in which features of predetermined types are to be uploaded, and the suppression mode corresponds to a second mode in which the number of feature types to be uploaded is smaller than in the first mode (the all transmission mode).
  • the type of information to be uploaded in the suppression mode may be set in advance or may be specified by the server 3.
  • the type of information to be uploaded to the vehicle in the suppression mode can be, for example, an item that is insufficient from the viewpoint of map data generation / update. Note that the number of items to be uploaded in the suppression mode may be zero.
  • the suppression mode may also include a prohibition mode. If the information to be uploaded in the all transmission mode includes both static information and dynamic information, the information to be uploaded in the suppression mode may be only dynamic information.
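The selection of upload items per operation mode might be illustrated as below; the item names and the default suppressed subset (dynamic vehicle-behavior information only, following the example just given) are hypothetical.

```python
ALL_ITEMS = frozenset({"trajectory", "travel_path", "features", "vehicle_behavior"})

def items_to_upload(mode, suppressed_subset=frozenset({"vehicle_behavior"})):
    """Select the probe-data items per operation mode: everything in the
    all transmission mode, a subset (possibly empty) in the suppression
    mode, nothing in the prohibition mode."""
    if mode == "all_transmission":
        return set(ALL_ITEMS)
    if mode == "suppression":
        return set(suppressed_subset)
    if mode == "prohibition":
        return set()
    raise ValueError(f"unknown mode: {mode}")
```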
  • the server processor 31 instructs the vehicles existing in the predetermined upload suppression section to operate in the suppression mode. Further, the server processor 31 instructs a vehicle existing in a section other than the upload suppression section to operate in the all transmission mode.
  • the main processor 40 of each vehicle operates in the operation mode specified by the server 3.
  • the upload suppression section may be, for example, a road segment in which a necessary and sufficient amount of probe data for generating / updating map data has been collected.
  • the server 3 instructs vehicles traveling on a road segment for which a sufficient amount of probe data has been collected to operate in the suppression mode, so that such vehicles upload only dynamic information or only specific feature information.
  • the upload suppression section may be a road segment in bad weather. According to this setting, it is possible to reduce the possibility that probe data with low accuracy will be uploaded.
  • the upload suppression section may be dynamically changed according to the degree of gathering of probe data, time zone, and weather conditions.
  • the upload suppression section may be determined by the server processor 31.
  • the server processor 31 may distribute information on the determined upload suppression section to the vehicles, and each vehicle may determine on the vehicle side whether its current position corresponds to the upload suppression section and decide the operation mode accordingly. According to such a configuration, the server processor 31 does not need to specify an operation mode for each vehicle, and the processing load on the server processor 31 can be reduced.
  • the main processor may be configured to spontaneously switch the operation mode based on at least one of a traveling area of the vehicle, weather conditions, and a time zone.
  • the main processor 40 may be configured to shift from the first mode to the second mode based on at least one of an instruction from the server 3, a traveling area of the vehicle, weather conditions, and a time zone.
  • in the suppression mode, the information types to be reported are reduced as compared with the all transmission mode. As a result, the amount of communication from the vehicles to the server 3 can be reduced for the system as a whole. Further, since uploading of unnecessary items can be suppressed, not only the load on the communication equipment but also the load on the main processor 40 and the server processor 31 can be reduced.
  • the server processor 31 may set any one of a plurality of vehicles as a vehicle in charge of transmission and have only the vehicle in charge of transmission upload the probe data. According to such a configuration, vehicles other than the vehicle in charge of transmission traveling in the upload suppression section do not upload probe data. Therefore, for the system as a whole, the amount of data communication from the vehicles to the server 3 can be reduced.
  • the vehicle in charge of transmission may be, for example, a vehicle with a sufficient distance from the preceding vehicle or a tall vehicle such as a truck. According to these vehicles, it is easy to recognize feature information. Therefore, high-quality feature information can be efficiently collected.
  • among a plurality of vehicles forming one group (hereinafter, a vehicle group), the vehicle in charge of transmission may be the leading vehicle, the vehicle whose image processor 20 has the best object recognition performance, or the vehicle with the highest GPS positioning accuracy.
  • the server 3 may determine the vehicle in charge of transmission based on the position information sequentially reported from each vehicle. Of course, as another aspect, the vehicles themselves may determine the vehicle in charge of transmission by sharing the information via inter-vehicle communication.
  • each vehicle sequentially reports to the server 3 vehicle information such as position information, performance of the image processor 20, and information indicating GPS positioning accuracy.
  • the vehicle group is preferably set for each lane.
  • a set whose inter-vehicle distance is less than a predetermined threshold can be set as one vehicle group.
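  • The grouping rule above (consecutive vehicles in a lane whose inter-vehicle distance is below a threshold form one vehicle group) can be sketched as follows; the 50 m default threshold is an illustrative assumption.

```python
def group_by_gap(positions_m, gap_threshold_m=50.0):
    """Split vehicle positions along one lane (in metres, at least one
    vehicle) into groups: a gap of gap_threshold_m or more starts a new group."""
    ordered = sorted(positions_m)
    groups = [[ordered[0]]]
    for p in ordered[1:]:
        if p - groups[-1][-1] < gap_threshold_m:
            groups[-1].append(p)   # close enough: same vehicle group
        else:
            groups.append([p])     # large gap: start a new group
    return groups

print(group_by_gap([0, 30, 45, 200, 220]))  # → [[0, 30, 45], [200, 220]]
```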
  • the above configuration is equivalent to a configuration in which the server 3 includes, as operation modes, a collection mode in which transmission of probe data is requested from all vehicles located on the road segment, and a save mode in which transmission of probe data is not requested from some vehicles.
  • the operation mode of the server 3 may be different for each road segment or each map tile.
  • the operation mode of the server 3 for each road segment / map tile may be determined according to the collection status of the probe data for the road segment / map tile. For example, the server 3 may operate in the collection mode for a road segment in which probe data is insufficient, and may operate in the save mode for a road segment in which a necessary and sufficient amount of data is collected. Note that the server 3 may update the map data in map tile units or may update the map data in road segment units. If the map is updated in map tile units, the concept of the upload suppression section described above may be extended to the concept of map tile. That is, the upload suppression tile corresponding to the upload suppression section may be appropriately set. Each vehicle may be configured to operate in the suppression mode based on its presence in the upload suppression tile.
  • It is assumed that there is a vehicle traveling in an area corresponding to a certain map tile, and that the vehicle constitutes the map system 1. That is, at least one image representing the environment of the vehicle is acquired by the camera 10 mounted on the vehicle, and global coordinates of landmarks included in the image are calculated and uploaded to the server 3.
  • Step S200 is first executed as shown in FIG.
  • Step S200 is a step in which the server processor 31 configuring the server 3 acquires probe data.
  • the server processor 31 acquires probe data from a plurality of vehicles traveling on the same map tile. That is, the server processor 31 acquires a plurality of coordinate data for one landmark. Note that probe data is sequentially uploaded to the server 3 from a plurality of vehicles.
  • the server processor 31 stores the probe data provided from each vehicle in a state of being connected or divided into a predetermined length for each provider.
  • Step S201 is a step in which the server processor 31 calculates the variance of the coordinates for each landmark and determines whether or not the calculated variance is larger than a predetermined threshold.
  • the variance is calculated for each of the coordinates of latitude, longitude, and altitude, and each is compared with a predetermined threshold.
  • the server processor 31 calculates the variance σ² of each landmark 63 based on the probe data received from a plurality of vehicles, as shown in FIG. 7. In the example shown in FIG. 7, there are four landmarks 63a to 63d on the map tile, and for each landmark, σa², σb², σc², and σd² are calculated.
  • in step S201, if the variance of the coordinates of all the landmarks 63 is equal to or smaller than the predetermined threshold, the determination in this step is NO, and the process proceeds to step S202.
  • Step S202 is a step in which the server processor 31 statistically calculates the coordinates of each landmark 63. That the variance of the coordinates of each landmark 63 is equal to or less than a predetermined threshold value indicates that the coordinates of the landmark 63 have been detected with a certain degree of accuracy. That is, even if the coordinates of each landmark 63 are statistically calculated without using a reference mark described later, a map can be generated with relatively high accuracy.
  • in step S202, for example, a process of obtaining an average using the probe data received from a plurality of vehicles is performed on each landmark 63 to calculate its global coordinates. Then, this flow ends.
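  • Steps S200 to S202 can be sketched as follows: the per-axis variance of the observed coordinates is checked against a threshold and, when small enough, the mean is adopted as the global coordinates. The threshold value is an illustrative assumption.

```python
from statistics import mean, pvariance

def integrate_landmark(observations, var_threshold=0.25):
    """observations: (lat, lon, alt) tuples for one landmark from several
    vehicles. Returns the mean coordinates if every axis has variance at or
    below the threshold; returns None when reference-mark correction is needed."""
    axes = list(zip(*observations))          # group values per axis
    if all(pvariance(a) <= var_threshold for a in axes):
        return tuple(mean(a) for a in axes)  # statistical integration (S202)
    return None                              # too scattered (S201: YES branch)
```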
  • Step S203 is a step of determining whether or not high-precision positioning data exists for the landmark 63 whose variance is larger than a predetermined threshold.
  • the high-precision positioning data is coordinate data measured by a method different from that of probe data, such as real-time kinematic (RTK) positioning or precise point positioning (PPP).
  • High-precision positioning data refers to data generated by a dedicated mobile mapping system vehicle equipped with a laser radar (LiDAR), an optical camera, a GNSS receiver, an acceleration sensor, and the like, or positioning work.
  • coordinate data determined by high-precision positioning or precise surveying is hereinafter referred to as reference coordinates.
  • the landmark 63 to which the reference coordinates are assigned is referred to as a reference mark or a reference landmark.
  • the reference mark serves as a ground control point (GCP: Ground Control Point).
  • the reference mark is a point where the above-described high-accuracy positioning is performed.
  • signboards corresponding to traffic signs such as regulatory signs and information signs, toll gates on expressways, junctions between expressways and general roads, and corners (edges) of structures such as buildings can be used as reference marks.
  • characteristic points such as corners of the lane markings, branches / merging points with other lane markings, and ends of guardrails can also be adopted as reference marks.
  • Points where lanes increase or decrease can also be adopted as reference marks.
  • the reference mark is preferably a fixed three-dimensional structure.
  • a feature such as a guide sign, which is arranged at a position relatively higher than the road surface and is arranged at a position where the camera 10 can easily take an image is set as a reference mark.
  • a feature other than the reference mark is also described as a normal feature.
  • step S203 is a step of determining whether reference coordinates are given to the corresponding landmark 63.
  • assume that reference coordinates exist for the landmark 63b shown in FIG. 7 (indicated by a black triangle mark in FIG. 8). That is, the landmark 63b is a reference mark. If the reference mark exists, the determination in step S203 is YES, and the process proceeds to step S204.
  • Step S204 is a step in which the server processor 31 makes the coordinates of the landmark 63b measured in real time by the camera 10 or the sensor 30 coincide with the reference coordinates for the landmark 63b as the reference mark.
  • reference coordinates exist for the landmark 63b.
  • the reference coordinates are Xref.
  • the coordinates X are made to match the coordinates Xref. That is, a translation by Xref − X is performed.
  • the coordinates of the landmarks 63b as all reference marks recorded in the plurality of probe data become Xref.
  • the coordinates of the landmarks 63a, 63c, and 63d other than the landmark 63b are also translated by Xref-X.
  • the coordinates are expressed as one dimension, but actually the coordinates are calculated in three dimensions of latitude, longitude and altitude.
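  • The translation by Xref − X described above can be sketched as follows; the landmark identifiers and coordinate values are illustrative.

```python
def correct_by_reference(landmarks, ref_id, ref_coords):
    """Translate every observed landmark by (ref_coords - observed reference
    coordinates) so the reference mark lands exactly on its known coordinates."""
    obs_ref = landmarks[ref_id]
    shift = tuple(r - o for r, o in zip(ref_coords, obs_ref))
    return {lid: tuple(c + s for c, s in zip(coords, shift))
            for lid, coords in landmarks.items()}

obs = {"63a": (10.0, 5.0, 1.0), "63b": (20.0, 6.0, 1.0)}
corrected = correct_by_reference(obs, "63b", (20.5, 6.2, 1.1))
# "63b" now coincides with the reference coordinates;
# "63a" is shifted by the same amount.
```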
  • step S202 is executed.
  • the coordinates of the landmark 63b as the reference mark match the reference coordinates.
  • a process of obtaining an average is performed to calculate global coordinates. Then, this flow ends.
  • Step S205 is a step in which the server processor 31 sets a flag indicating that there is no reference mark.
  • the map system 1 uses the high-precision positioning data as the reference coordinates for the landmark 63 for which the coordinate accuracy cannot be obtained only by accumulating the GPS and probe data.
  • by using the reference mark, the coordinates can also be calculated with high accuracy for other landmarks 63 for which high-precision positioning data does not exist. As a result, the accuracy of the map tile to which the corresponding landmark 63 belongs can be improved.
  • the above-described configuration is equivalent to a configuration that acquires, from a plurality of vehicles, probe data including observation coordinates of a plurality of map elements associated with information about the traveled road segment, corrects the observation coordinates of the map elements included in the probe data so that the observation coordinates of the map element defined as a reference mark coincide with the absolute coordinates of the reference mark or so that the degree of deviation is minimized, and determines the coordinates of each map element by statistically processing the corrected observation coordinates.
  • the server processor 31 may be configured to update the coordinate information of a feature such as a landmark or a lane mark in a procedure as shown in FIG.
  • map data updating processing may be periodically executed, for example, for each map tile.
  • the process shown in FIG. 9 is executed at midnight every day.
  • the process shown in FIG. 9 may be executed at a timing when a predetermined number or more of probe data for the map tile to be processed is accumulated. Updating the map data may be performed for each road segment.
  • the map data updating process shown in FIG. 9 includes steps T201 to T215.
  • in step T201, a plurality of probe data provided from a plurality of vehicles traveling on the same road segment and stored in a database (not shown) are read, and the process proceeds to step T202.
  • whether the plurality of probe data are probe data for the same road segment may be determined based on movement trajectory information and GPS coordinate information contained in the probe data.
  • in step T202, any one of the plurality of probe data extracted in step T201 is selected, and the process proceeds to step T203.
  • in step T203, map data of the corresponding road segment is extracted based on the coordinate information of various features included in the selected probe data, and the process proceeds to step T204.
  • in step T204, based on the map data read in step T203, it is determined whether or not the probe data includes information on a predetermined number (for example, three) or more reference marks.
  • the predetermined number is preferably three or more for the sake of convenience of the positioning process at the subsequent stage.
  • the number of reference marks to be included may be set to four or five.
  • if the probe data includes the predetermined number or more of reference marks, the server processor 31 executes step T205; otherwise, step T207 is executed.
  • in step T205, a reference mark common to the map data and the probe data (hereinafter, a common reference mark) is set, and step T206 is executed.
  • in step T206, a positioning process is performed on the probe data using the common reference mark.
  • the positioning process is a process of correcting the observation position coordinates of various features included in the probe data using the observation coordinates of the common reference mark and the map registration coordinates.
  • the map registration coordinates refer to the coordinates registered in the current (latest) map data.
  • the observation coordinates here are coordinates calculated by the vehicle and indicate the coordinates described in the probe data.
  • for the positioning process, an ICP (Iterative Closest Point) method can be used, for example.
  • the ICP method is a method of repeating translation and rotation of each point group so that the two point groups match as a whole. Specifically, it includes a step of searching, for each point of one point group (hereinafter, the first point group), for the nearest point in the other point group (hereinafter, the second point group) and associating the two, and a step of adjusting the position and orientation of each point group in the coordinate system so as to minimize the difference between the associated points.
  • for the adjustment of position and orientation, SVD (Singular Value Decomposition), the steepest descent method, or the like can be used. A rotation matrix and a translation vector for bringing a certain point group closer to a target point group (for example, an average point group of the first point group and the second point group) can be obtained by searching for the minimum value of a function representing the mean square error of the distance between corresponding points.
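  • One standard way to obtain such a rotation matrix and translation vector by SVD, when point correspondences are already known, is the Kabsch algorithm. The sketch below (using NumPy) illustrates that general technique; it is not code from this disclosure.

```python
import numpy as np

def kabsch(P, Q):
    """Rotation R and translation t minimising the mean-square distance from
    points P to corresponding points Q (both Nx3 arrays), obtained via SVD."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# A 90-degree rotation about the z-axis plus a shift is recovered exactly:
P = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, 2.0, 3.0])
R, t = kabsch(P, Q)
assert np.allclose(P @ R.T + t, Q)
```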
  • the method of correcting the position coordinates of the feature included in the probe data based on the position coordinates of the common reference mark is not limited to the above method.
  • a correction method described in Japanese Patent Application No. 2018-163076 filed separately may be adopted. That is, the center of gravity (hereinafter, the first center of gravity) of the observation coordinates of the common reference mark is calculated, and the plurality of common reference marks are planarly approximated by, for example, the least square method. Then, a normal vector to the approximate plane (hereinafter, a first normal vector) is calculated.
  • the server processor 31 calculates the center of gravity (hereinafter, the second center of gravity) of the map registration coordinates of the common reference marks, and calculates an approximate plane corresponding to the common reference marks. Then, a normal vector to that plane (hereinafter, a second normal vector) is calculated. Next, the position coordinates of the features in the probe data are translated so that the first center of gravity coincides with the second center of gravity. In addition, the position coordinates of each feature in the probe data are rotated so that the first normal vector matches the second normal vector. Then, the position coordinates of the features in the probe data are rotated around the first normal vector passing through the first center of gravity so that the sum of squares of the errors of the common reference marks is minimized, and the alignment processing is completed.
  • the server processor 31 obtains probe data in which the position and orientation of the feature have been corrected by ending the alignment processing.
  • Such a positioning process corresponds to a process of correcting the position coordinates of various features included in the probe data based on the position coordinates of the common reference mark.
  • the observation coordinates of a normal feature associated with the observation coordinates of a certain reference mark indicate the observation coordinates of the normal feature detected and uploaded by the same vehicle.
  • the observation coordinates of the normal feature associated with a certain reference mark refer to the observation coordinates of the normal feature provided by the vehicle that provided the observation coordinates of the reference mark.
  • in step T207, the positioning process for the selected probe data is omitted, and the process proceeds to step T208.
  • it is preferable that probe data in which the number of included reference marks is less than three be excluded from the targets of the integration processing described later.
  • the server processor 31 may be configured to divide / concatenate various probe data into a length including three or four or more reference marks.
  • in step T208, it is determined whether unprocessed probe data remains among the probe data read in step T201. If unprocessed probe data remains, the process proceeds to step T209, and the processing from step T203 onward is performed on any of the unprocessed probe data. On the other hand, when no unprocessed probe data remains, step T210 is executed.
  • in step T210, a process of integrating the corrected probe data is performed.
  • the integration process of the probe data is a process of statistically calculating the coordinates of the feature, as in S201. For example, the variance of the coordinates of each feature is calculated, and if the variance is less than a predetermined threshold, the median / average value is adopted as the coordinates of the feature.
  • a verification flag is set for a feature whose variance is equal to or larger than a predetermined threshold.
  • the verification flag corresponds to a flag indicating that the data is uncertain to be registered as a map.
  • the server processor 31 may calculate coordinates for each landmark after excluding outliers so that the variance is equal to or less than a predetermined threshold.
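  • The outlier exclusion mentioned above can be sketched in one dimension as follows; repeatedly discarding the observation farthest from the mean until the variance falls to the threshold is one possible realization, chosen here purely for illustration.

```python
from statistics import mean, pvariance

def trim_until_variance(values, var_threshold):
    """Drop the observation farthest from the mean until the variance is at or
    below the threshold, then return the mean of the surviving observations."""
    vals = list(values)
    while len(vals) > 1 and pvariance(vals) > var_threshold:
        m = mean(vals)
        vals.remove(max(vals, key=lambda v: abs(v - m)))  # worst outlier out
    return mean(vals)

print(trim_until_variance([10.0, 10.1, 9.9, 25.0], 0.5))  # outlier 25.0 removed
```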
  • the evaluation of the dispersion in the integration processing is not essential, and may be omitted.
  • the position coordinates of each feature are statistically determined after being corrected using a predetermined reference landmark.
  • the server processor 31 determines the position coordinates of the lane mark by performing statistical processing on a plurality of pieces of coordinate information after correcting the coordinates of the lane mark included in each probe data using a predetermined reference landmark.
  • in step T211, the data indicating the statistical position coordinates of each feature (hereinafter, integrated data) generated in step T210 is compared with the map data to detect change points.
  • integrated data itself or map data reflecting the contents of the integrated data corresponds to the provisional map data.
  • a change point here is a part of the integrated data that differs from the current map data, and indicates a place where a feature may have been relocated, added, or deleted. For example, among the features included in the integrated data, for those having a corresponding landmark in the current map data (that is, existing features), the amount of displacement between the statistically determined position coordinates and the position coordinates registered in the map data is calculated.
  • when the displacement amount of the position coordinates exceeds a predetermined error range (for example, 3 cm), the feature is detected as one suspected of having been relocated. As another example, a deviation of 1 cm or more may be detected as a change point; the size of the allowable error can be changed as appropriate.
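  • The comparison of integrated data against the current map can be sketched as follows; the 3 cm tolerance mirrors the example error range given in the text, and the feature identifiers are illustrative.

```python
def detect_change_points(integrated, map_data, tolerance_m=0.03):
    """Compare statistically integrated feature coordinates with the current
    map data; return suspected relocations, additions, and deletions."""
    changes = {"relocated": [], "added": [], "deleted": []}
    for fid, coords in integrated.items():
        if fid not in map_data:
            changes["added"].append(fid)       # not yet registered on the map
        else:
            # displacement between integrated and map-registered coordinates
            shift = max(abs(a - b) for a, b in zip(coords, map_data[fid]))
            if shift > tolerance_m:
                changes["relocated"].append(fid)
    # map-registered features missing from the integrated data
    changes["deleted"] = [fid for fid in map_data if fid not in integrated]
    return changes
```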
  • step T212 is executed.
  • in step T212, the validity of the change point is determined. If the number of probe data in which a feature detected as a change point was observed is equal to or greater than a predetermined threshold, or if the change is detected continuously for a predetermined period (for example, three days), the change point is judged to be not temporary but valid, and is reflected in the map data. When the deletion of a lane marking is detected as a change point, it is determined whether or not the deleted section is shorter than a predetermined distance (for example, 5 m). Since lane markings usually extend continuously, it is unlikely that only some sections would be deleted. When an object (a vehicle, a puddle, snow) temporarily lies on the lane marking, the lane marking is simply not detected.
  • when the deleted section is shorter than the predetermined distance, it is highly likely that the originally existing lane marking is no longer detected due to a temporary event such as on-street parking, snowfall, or rainfall.
  • when a change point is detected over a wide range, such as when the deleted section is longer than the predetermined distance, it may be determined that the change was made by road construction or the like (that is, that the change point is valid).
  • if construction information indicating that construction was performed at the change point within the latest predetermined period (for example, within three days) can be obtained from an external server or detected from the probe data, it may be determined that the change point is likely to be valid. The above idea can be applied not only to lane markings but also to other features such as signboards.
  • the change point determined to be valid is reflected on the map data (step T215).
  • for a change point whose validity has not been confirmed, the update is suspended or a verification flag is set. According to the configuration in which the validity of a change point is determined based on its duration and scale and on the presence or absence of construction information near the change point, the possibility that the contents of the map data are incorrectly updated due to temporary factors can be reduced.
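  • The validity rules above (probe-count threshold, observation duration, short deleted sections treated as temporary occlusion, nearby construction information) can be combined into one heuristic; every threshold below is the illustrative value given in the text, not a normative parameter.

```python
def change_is_valid(supporting_probes, days_observed, deleted_length_m=None,
                    construction_nearby=False,
                    probe_threshold=10, days_threshold=3, min_deleted_m=5.0):
    """Heuristic validity check for a detected change point."""
    if construction_nearby:
        return True   # recent construction information supports the change
    if deleted_length_m is not None and deleted_length_m < min_deleted_m:
        # A short missing stretch of lane marking is likely a temporary
        # occlusion (parked vehicle, snow, puddle), not a real change.
        return False
    return supporting_probes >= probe_threshold or days_observed >= days_threshold
```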
  • the position of each feature is aligned using the reference mark for each probe data, and then a plurality of probe data are integrated to calculate the position coordinates of various features.
  • the procedure for calculating the coordinates is not limited to this. Instead of correcting the coordinates of features on a per-probe-data basis, the system may be configured to first generate integrated data in which the position coordinates of each feature are statistically calculated, and then correct the coordinate information of each feature based on the reference marks.
  • the server processor 31 statistically calculates the coordinates of each feature based on a plurality of probe data by the method described in step S201 or step T210.
  • the server processor 31 may be configured to adjust the observation data to be used so that the variance is equal to or less than a predetermined threshold.
  • the statistically calculated coordinates correspond to the representative observation coordinates.
  • the representative observation coordinates indicate an average value or a median of a plurality of observation coordinates.
  • the server processor 31 corrects the representative observation coordinates of various landmarks included in the integrated probe data based on the reference mark coordinate information. For the method of correction, the same method as in step T206 can be used. Even with such a configuration, the accuracy of the coordinates of each feature can be increased.
  • the traveling trajectory data included in the probe data may be configured as a map element for generating the traveling trajectory model.
  • the server processor 31 corrects the traveling trajectory of each vehicle using the reference mark associated with the traveling trajectory, and integrates a plurality of corrected traveling trajectory data to form a traveling trajectory model. It may be configured to generate. According to the configuration in which the traveling trajectories are corrected and integrated using the reference marks and the traveling trajectory model is generated, it is possible to generate a more accurate traveling trajectory for automatic driving.
  • the server processor 31 may be configured to generate a traveling trajectory model obtained by averaging traveling trajectories of a plurality of vehicles, and then correct the traveling trajectory model using the reference mark.
  • the various processes described above may be configured to be shared and executed by a plurality of servers / processors.
  • the integration process of the probe data may be configured to be performed by a server different from the server that acquires and corrects the probe data.
  • the above-described map system 1 corrects the observation coordinates of the same feature provided by a plurality of vehicles using the observation coordinates and the map registration coordinates of the reference mark, and determines the coordinates of the feature by statistically processing the corrected observation coordinates.
  • the plurality of observation coordinates for the same feature may be provided from a plurality of different vehicles, respectively, or may be generated by the same vehicle passing the same point a plurality of times.
  • in the above description, the observation coordinates of various features included in the probe data are corrected so that the observation coordinates of the reference mark match the absolute coordinates serving as the map registration coordinates of the reference mark; however, "match" here is not limited to an exact match, and a substantial match is also included.
  • the above correction may be performed so that the degree of deviation between the observation coordinates and the absolute coordinates of the reference mark is minimized.
  • the correction processing described above may be executed in units of road segments, or may be executed in units of map tiles.
  • Step S300 is a step in which the server processor 31 configuring the server 3 acquires probe data.
  • the server processor 31 acquires probe data from a plurality of vehicles traveling on the same map tile. That is, the server processor 31 acquires a plurality of coordinate data for one landmark.
  • Step S301 is a step in which the server processor 31 calculates the variance of coordinates for each landmark.
  • the variance is calculated for latitude, longitude, and altitude coordinates.
  • the server processor 31 calculates the variance σ² of each landmark 63 based on the probe data received from a plurality of vehicles, as shown in FIG. 7. In the example shown in FIG. 7, there are four landmarks 63a to 63d on the map tile, and for each landmark, σa², σb², σc², and σd² are calculated.
  • step S302 is executed.
  • Step S302 is a step in which the server processor 31 calculates the median value p of the calculated variances σa², σb², σc², and σd², and compares it with a predetermined threshold T1.
  • the calculation of the median of the variance is an example, and any method may be used as long as the degree of dispersion of the coordinates of the landmarks belonging to the map tile can be statistically indexed. For example, an average value may be used. If the median p satisfies the relationship of 0 ⁇ p ⁇ T1 with a predetermined threshold value T1, a YES determination is made in step S302, and the process proceeds to step S303.
  • Step S303 is a step of giving the accuracy level “High” to the map tile for which the determination of YES is made in step S302.
  • the map tile to which the accuracy level “High” is assigned is the map tile determined to have the highest accuracy.
  • Step S304 is a step in which the server processor 31 calculates a median p (or an average value) and compares it with predetermined thresholds T1 and T2. If the median p satisfies the relationship of T1 ⁇ p ⁇ T2 between the predetermined threshold T1 and the threshold T2, the determination in step S304 is YES, and the process proceeds to step S305.
  • Step S305 is a step of giving the accuracy level “Middle” to the map tile for which the determination of YES is made in step S304.
  • Step S306 is a step of giving the accuracy level “Low” to the map tile for which the determination is NO in step S304.
  • the map tile to which the accuracy level “Low” is assigned is the map tile determined to have the lowest accuracy.
  • the accuracy level of a map tile is, from highest to lowest, “High”, “Middle”, and “Low”.
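  • The accuracy-level assignment of steps S302 to S306 can be sketched as follows, using the median of the landmark coordinate variances on a tile and the thresholds T1 and T2.

```python
from statistics import median

def tile_accuracy_level(variances, t1, t2):
    """Classify a map tile by the median p of its landmark variances:
    0 <= p <= T1 -> "High", T1 < p <= T2 -> "Middle", otherwise "Low"."""
    p = median(variances)
    if p <= t1:
        return "High"
    if p <= t2:
        return "Middle"
    return "Low"

print(tile_accuracy_level([0.01, 0.02, 0.03, 0.04], t1=0.05, t2=0.2))  # → High
```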
  • the higher the accuracy level, the more accurately the current position of the vehicle can be determined, and the more advanced the driving assistance that can be realized. That is, while traveling in an area corresponding to a map tile having a high accuracy level, it is possible to provide advanced driving support such as automatic driving.
  • for a map tile whose accuracy level is low, use of the map data is restricted so that it is not used for automatic driving. More specifically, the main processor 40 allows the automatic driving application to use the map data of a map tile whose accuracy level is set to the highest level, while prohibiting the automatic driving application from using the map data of a map tile whose accuracy level is not the highest.
  • the map tiles obtained with high accuracy can be effectively utilized, and the low-accuracy map tiles can be prevented from being erroneously provided to an application requiring more safety, such as an automatic driving application.
  • when such a restriction is applied, the main processor 40 preferably notifies the user of the restriction via the HMI 60. Prohibiting the provision of map data to an application is equivalent to indirectly prohibiting execution of the application itself or degrading its functions.
  • the map system 1 specifies the approximate position of the vehicle by positioning using satellites such as GPS, and determines the detailed position of the own vehicle based on the coordinates of the landmarks 63 calculated from the map information downloaded from the server 3 and the images captured in real time by the vehicle.
  • However, there are situations where the vehicle is located in a tunnel or between high-rise buildings and it is difficult to specify its position by satellite.
  • Therefore, the map system 1 can employ, as the positioning sensor 30a, for example, a radio wave detector that detects the intensity of radio waves used for a wireless LAN. Positioning by radio waves emitted from wireless LAN base stations (access points) corresponds to the alternative positioning means. Among the base stations that transmit wireless LAN radio waves, the vehicle receives radio waves from those whose installation positions are known in global coordinates. Thereby, the position of the own vehicle is estimated based on the coordinates of the base station and the intensity of the received radio wave. An operation flow of the map system 1 will be described with reference to FIG.
  • Step S400 is a step in which the main processor 40 compares the reception intensity of the radio wave from the GPS satellite with a predetermined threshold.
  • As the threshold value, for example, the GPS radio wave intensity at which the position of the own vehicle can be sufficiently specified by GPS positioning and the downloaded map information is set. If the reception intensity of the radio wave from the GPS satellite is greater than this threshold, the determination in this step is YES, and the process proceeds to step S401. That is, the radio wave detector for wireless LAN radio waves is disabled as the alternative positioning means. Then, the process proceeds to step S402, where the position of the own vehicle is specified by GPS positioning and the downloaded map information. Further, map information such as the landmark 63 obtained by the camera 10 is uploaded to the server 3. After the position of the host vehicle is specified, the host vehicle position is used for driving support such as automatic steering.
  • In step S400, if the reception intensity of the radio wave from the GPS satellite is equal to or smaller than the threshold, the determination in this step is NO, and the process proceeds to step S403. That is, the radio wave detector for wireless LAN radio waves is enabled as the alternative positioning means. Then, the process proceeds to step S404.
  • Step S404 is a step in which the main processor 40 determines the security level of the base station emitting radio waves of the wireless LAN.
  • The security level is an index of the reliability of information generated by the base station. If the security level is high, the main processor 40 trusts the coordinates of the installation location of the base station, and specifies the current position of the own vehicle based on the global coordinates of the base station, the reception strength of the radio wave received by the radio wave detector mounted on the vehicle, and the position prediction of the own vehicle by SfM. For example, the distance from the base station is estimated based on the reception strength, and the own vehicle is determined to be within the estimated distance from the base station installation position.
  • When radio waves from a plurality of base stations can be received, the distance to each base station is estimated based on the reception strength of the signal from that base station, and the current position is calculated using the installation position of, and the distance to, each base station.
  • As a position estimation method using radio waves emitted from base stations, various methods can be adopted, such as the AOA (Angle Of Arrival) method using the direction of arrival of radio waves, the TOA (Time Of Arrival) method using the arrival time, and the TDOA (Time Difference Of Arrival) method using the arrival time difference.
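The multi-base-station case described above can be illustrated with a minimal 2D multilateration sketch: subtracting the range equation of one base station from those of the others yields a linear system in the unknown position, which three non-collinear stations solve exactly. The coordinates and distances below are assumed example values; real distances derived from reception strength would be noisy and call for more stations and a least-squares fit.

```python
def trilaterate(stations, dists):
    """Estimate a 2D position from three base stations with known
    installation coordinates and estimated distances (e.g. derived
    from reception strength or arrival time).

    Subtracting the circle equation of station 0 from those of
    stations 1 and 2 gives a 2x2 linear system A @ [x, y] = b.
    """
    (x0, y0), (x1, y1), (x2, y2) = stations
    d0, d1, d2 = dists
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-9:
        raise ValueError("base stations are collinear; position is ambiguous")
    # Cramer's rule for the 2x2 system.
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

For example, stations at (0, 0), (100, 0), and (0, 100) with distances measured from a vehicle at (30, 40) recover that position exactly.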
  • the level of the security level can be set arbitrarily. For example, it can be inferred that a base station provided by a public organization or a public infrastructure provider has a high security level. On the other hand, it can be inferred that a base station provided by an individual has a low security level.
  • If it is determined in step S404 that the security level of the base station, and thus of the alternative positioning means, is low, the determination in this step is NO, and the process proceeds to step S405.
  • In step S405, map information such as the landmark 63 obtained by the camera 10 is uploaded to the server 3. However, unlike in step S402, the information on the positioning by the alternative positioning means is not used for identifying the position of the own vehicle. That is, the position of the own vehicle is not specified, and the positioning information is provided only for uploading the map information to the server 3.
  • Note that step S404 in FIG. 11 may be omitted. In that case, regardless of the security level of the base station, the positioning information may be provided only for uploading map information to the server 3.
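The branching of steps S400 to S405 can be summarized in a small decision function. This is a sketch under assumptions: the return labels and parameter names are illustrative, and the actual flow would also drive the upload and localization routines rather than return a string.

```python
def select_positioning_mode(gps_strength, gps_threshold, security_level_high):
    """Sketch of the S400-S405 branching: strong GPS disables the
    wireless-LAN radio wave detector; weak GPS enables it, and only a
    base station with a high security level is trusted for localization."""
    if gps_strength > gps_threshold:
        # S400 YES -> S401/S402: alternative means disabled,
        # localize by GPS positioning plus downloaded map information.
        return "localize_by_gps"
    # S400 NO -> S403: wireless-LAN radio wave detector enabled.
    if security_level_high:
        # S404 YES: localize using base-station coordinates and
        # reception strength (plus SfM position prediction).
        return "localize_by_base_station"
    # S404 NO -> S405: upload map information only; the untrusted
    # positioning result is not used as the own-vehicle position.
    return "upload_only"
```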
  • The alternative positioning means is not limited to radio waves emitted from wireless LAN base stations whose installation coordinates are known; positioning by radio waves emitted from base stations of short-range wireless communication whose installation coordinates are known, positioning by IMES, positioning by geomagnetism, and the like can also be adopted.
  • the radio wave detector for receiving radio waves of wireless LAN, short-range wireless communication, and IMES, and the magnetic detector for detecting terrestrial magnetism do not necessarily need to be fixed to the vehicle.
  • If the above-described detector is mounted on a mobile device such as a smartphone and the mobile device is linked to the map system 1, the positioning information obtained by the mobile device can be used by the map system 1.
  • As described above, the map system 1 specifies the approximate position of the vehicle by positioning using a satellite such as GPS, and determines the detailed position of the own vehicle based on the coordinates of the landmark 63 calculated from the map information downloaded from the server 3 and the image captured in real time by the vehicle. However, there are situations where the map information does not exist in the server 3, or where the map information is old and does not accurately reflect the current state.
  • Step S500 is first executed as shown in FIG.
  • Step S500 is a step in which the main processor 40 specifies a rough position of the own vehicle by GPS or the like. By this step, the area where the own vehicle exists is grasped.
  • Step S501 is a step in which the main processor 40 determines whether or not the map information of the map tile corresponding to the area where the vehicle is located is stored in the server 3. For example, the main processor 40 transmits the current position information of the own vehicle to the server 3. When the map information of the map tile in the area where the vehicle exists is present, the server 3 returns a signal indicating that the map information exists. When the server 3 does not have the map tile of the area where the vehicle exists, the server 3 returns a signal indicating that the server 3 does not have the map data of the requested area. As described above, step S501 may be performed in cooperation with the main processor 40 and the server 3.
  • If the map information is not stored as a map tile, the determination in step S501 is NO, and the process proceeds to step S502. Note that the case where there is no map data for the map tile in which the vehicle is present also includes the case where the map data of the map tile has expired.
  • Step S502 is a step in which the main processor 40 sets the map tile corresponding to the corresponding area to the “no map” mode. Thereafter, the process proceeds to step S503.
  • Step S503 is a step in which the main processor 40 sets the update flag to ON for the map tile corresponding to the area.
  • the main processor 40 sequentially uploads feature information such as white line information as probe data.
  • the map tile corresponding to the area for which the update flag is set to ON is preferentially generated by the server processor 31.
  • the main processor 40 uploads the vehicle behavior information in addition to the feature information while traveling on the map tile for which the update flag is set to ON.
  • If it is determined in step S501 that the map information is present for the map tile corresponding to the area where the vehicle is located, the determination in this step is YES, and the process proceeds to step S504.
  • Step S504 is a step of determining whether or not the latest information is publicly disclosed with respect to the map information recorded on the map tile.
  • public disclosure is, for example, map information disclosed by the Geographical Survey Institute of the Ministry of Land, Infrastructure, Transport and Tourism.
  • map information provided by a specific map vendor may be used as public map information.
  • Public disclosure is not limited to disclosure by government agencies, but also includes semi-public disclosure by certain map vendors.
  • Step S504 may be performed by either the vehicle or the server.
  • Specifically, it suffices for the main processor 40 or the server processor 31 to communicate with an external server managed by a map vendor or a government agency and determine whether or not the latest map information is disclosed for the map tile where the vehicle exists.
  • Step S505 is a step in which the main processor 40 sets the map tile corresponding to the corresponding area to the “map exists but is old” mode. Thereafter, the process proceeds to step S503.
  • Step S503 is a step in which the main processor 40 sets the update flag to ON for the map tile corresponding to the area as described above.
  • the map tiles corresponding to the area for which the update flag is set to ON are updated with priority because the feature information is sequentially uploaded from the vehicle to the server 3. This flow ends after step S503.
  • If it is determined in step S504 that the latest information for the map information recorded on the map tile has not been publicly disclosed, the determination in this step is NO, and the process proceeds to step S506.
  • Step S506 is a step in which the main processor 40 downloads the map information of the map tile corresponding to the current position from the server 3.
  • Next, step S507 is executed. Step S507 is a step in which the main processor 40 specifies (that is, localizes) the position of the vehicle by collating the coordinates of the landmark 63 included in the map information downloaded from the server 3 with the coordinates of the landmark 63 calculated based on the image captured in real time.
  • Step S508 is a step of determining whether or not the main processor 40 has detected a deviation in the coordinates of the host vehicle (hereinafter, also referred to as a positional deviation).
  • Here, the position of the own vehicle specified based on the coordinates of the landmark 63 included in the map information downloaded from the server 3 and the relative coordinates of the landmark 63 with respect to the own vehicle calculated from the image captured in real time by the camera 10 is referred to as a first position.
  • the position of the own vehicle specified using the GPS radio wave which does not depend on the map information stored in the server 3 is referred to as a second position.
  • the means for calculating the coordinates of the landmark 63 in real time is not limited to the one using the camera 10, and for example, a radar or LiDAR may be used.
  • Further, the means for specifying the position of the vehicle without depending on the map information is not limited to GPS; for example, odometry, dead reckoning, or position specification using wireless LAN, short-range wireless communication, IMES radio waves, geomagnetism, or the like may be adopted.
  • Detection of the deviation of the coordinates of the vehicle means that the deviation between the first position and the second position is detected to be equal to or more than a predetermined distance, for example.
  • the detection of the deviation of the coordinates of the host vehicle indicates that a state in which the deviation between the first position and the second position is equal to or more than a predetermined distance has occurred a predetermined number of times.
  • the number and frequency of occurrence of the displacement correspond to index information (in other words, an error signal) indicating that the map data needs to be updated.
  • the displacement corresponds to an event for transmitting the index information (hereinafter, a transmission event).
  • the detection of the displacement may be performed by the main processor 40 itself, or may be performed by another device (for example, the image processor 20).
  • the main processor 40 may detect that a position shift (in other words, a transmission event) has occurred by inputting a signal indicating that the position shift has occurred from the device.
  • The occurrence of a position shift may also be determined from the fact that a driver's steering intervention occurs at a predetermined amount or frequency while driving assistance such as automatic driving or lane keeping is performed using map information. A point where the driver's steering or deceleration operation intervenes is also referred to as a malfunction point. The intervention of speed adjustment, such as depressing the brake pedal, can likewise be employed as an index for determining the occurrence of a position shift.
  • the amount and frequency of operation intervention such as steering intervention and deceleration operation by the driver correspond to index information indicating that map data needs to be updated.
  • the driver's operation intervention during automatic driving corresponds to an event for transmitting index information to the server 3.
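The position-shift index described above can be sketched as a distance check between the first position (map-based) and the second position (map-independent), plus a counter whose total serves as index information for the server. The 5 m distance threshold and the class name are assumptions for illustration; the disclosure leaves the concrete thresholds open.

```python
import math

def is_misaligned(first_pos, second_pos, dist_threshold):
    """True when the map-based (first) and map-independent (second)
    position estimates diverge by at least the threshold distance."""
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return math.hypot(dx, dy) >= dist_threshold

class MisalignmentIndex:
    """Counts misalignment events; the accumulated count serves as
    index information indicating the map tile may need updating."""

    def __init__(self, count_threshold):
        self.count_threshold = count_threshold
        self.count = 0

    def observe(self, first_pos, second_pos, dist_threshold=5.0):
        # dist_threshold of 5.0 m is an assumed example value.
        if is_misaligned(first_pos, second_pos, dist_threshold):
            self.count += 1

    def update_needed(self):
        return self.count >= self.count_threshold
```

Driver interventions (steering, braking) during assisted driving could be fed into the same counter as additional transmission events.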
  • If the displacement of the host vehicle is detected in step S508, the determination in this step is YES, and the process proceeds to step S505. Alternatively, the number of times the position shift is detected may be counted up, and when that number becomes equal to or larger than a predetermined threshold value, the determination in step S508 may be made YES and step S505 executed.
  • Step S505 is a step of setting the map tile corresponding to the corresponding area to the “map exists but is old” mode as described above, and then the update flag is set to ON via step S503.
  • the state where the displacement of the host vehicle is detected is assumed to be a situation in which the terrain and the position of the landmark 63 have changed before the information has been publicly updated due to, for example, a natural disaster.
  • By setting the update flag to ON after step S508, the update of the map information stored in the server 3 can be promoted prior to the official map update.
  • If the displacement of the own vehicle is not detected, the determination in step S508 is NO, and the process proceeds to step S509.
  • Step S509 is a step in which the main processor 40 sets the map tile corresponding to the corresponding area to the “latest map exists” mode. Thereafter, the process proceeds to step S510.
  • Step S510 is a step in which the main processor 40 sets the update flag to OFF for the map tile corresponding to the area as described above.
  • the map tile corresponding to the area for which the update flag is set to OFF does not require the latest map update, and can be actively used for driving support and the like.
  • the update flag is set to OFF, the main processor 40 uploads vehicle behavior information without sending feature information such as white line information as probe data.
  • the server 3 can detect the occurrence of traffic congestion and the like.
  • As described above, the main processor 40 sets, based on the predetermined conditions according to steps S501, S504, and S508, one of the three modes “no map”, “map exists but is old”, and “the latest map exists”, and sets an update flag corresponding to each mode. Since the update flag is set to ON for the map tiles given the modes “no map” and “map exists but is old”, the update or generation of the map information included in those map tiles can be executed preferentially.
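The three-mode determination over steps S501, S504, and S508 can be sketched as a single classification function. The boolean inputs abstract the actual server queries and deviation detection, and the mode strings are illustrative labels for the modes named in the text.

```python
def classify_tile(has_map, expired, newer_public_map, misalignment_detected):
    """Sketch of the S501/S504/S508 branching.

    Returns (mode, update_flag_on) for the map tile of the area where
    the vehicle is located."""
    if not has_map or expired:
        # S501 NO -> S502/S503: no usable tile; request generation.
        return "no_map", True
    if newer_public_map:
        # S504 YES -> S505/S503: newer public map disclosed.
        return "map_exists_but_old", True
    if misalignment_detected:
        # S508 YES -> S505/S503: localization deviates; promote update.
        return "map_exists_but_old", True
    # S508 NO -> S509/S510: tile can be actively used for assistance.
    return "latest_map_exists", False
```

With the update flag ON, the vehicle uploads feature information (e.g. white lines) as probe data; with it OFF, only vehicle behavior information is uploaded, matching the description above.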
  • the server 3 may determine whether the map data of each map tile needs to be updated based on probe data from a plurality of vehicles. For example, when the main processor 40 detects a displacement or an intervention of the occupant in the automatic running of the vehicle in step S508, the main processor 40 reports the fact to the server 3. Alternatively, a signal indicating that updating is necessary is transmitted to the server 3.
  • the server processor 31 sets the update flag of the map tile for which the number of times the positional deviation is detected exceeds a predetermined threshold value to ON. Then, a request is made to a vehicle traveling in an area for which the update flag is set to ON to transmit probe data including feature information. With such a configuration, the map data can be updated quickly.
  • the unit of updating the map data is not limited to the map tile unit. The necessity of updating may be managed for each road segment. For example, ON / OFF of the update flag may be set for each road segment.
  • the map data generated / updated based on the probe data may be provisionally distributed to each vehicle as provisional map data. For example, it is verified whether the provisionally distributed map can be used for automatic control by a plurality of vehicles.
  • the main processor 40 of each vehicle verifies the provisional map based on whether a displacement has been detected when the vehicle position has been calculated using the provisional map data, whether a driver's operation has been performed, and the like.
  • the provisional map data may be verified based on whether or not the traveling trajectory planned based on the image recognition result matches the traveling trajectory planned using the temporary map. It is preferable that the temporary map data is not used for actual automatic driving until the verification is completed and it becomes official map data.
  • As a method of verifying the provisional map data, a method described in Japanese Patent Application No. 2018-163077, which was separately filed, may be used.
  • If each vehicle determines, as a result of verifying the provisional map data, that there is no problem, it reports this to the server 3. If it determines that the provisional map data has a defect, that fact is likewise reported to the server 3.
  • the server 3 finally determines whether or not there is a problem with the provisional map data based on the verification results of the plurality of vehicles, and adopts the provisional map data as an official map when it determines that there is no problem.
  • the map data adopted as the official map is distributed to each vehicle. Note that a vehicle to which the provisional map data has been distributed may be notified so that the provisional map data is used as official map data.
  • As described above, the map system 1 specifies the approximate position of the vehicle by positioning using a satellite such as GPS, and determines the detailed position of the own vehicle based on the coordinates of the landmark 63 calculated from the map information downloaded from the server 3 and the image captured in real time by the vehicle. However, there are situations where obstacles around the own vehicle hinder the imaging of the landmark 63 and the coordinates of the landmark 63 cannot be specified.
  • In such a case, the map system 1 may control the behavior of the vehicle so that at least one landmark 63 that is not obstructed by an obstacle exists within the angle of view of the camera 10 serving as the imaging device.
  • an operation flow of the map system 1 based on the technical idea will be described with reference to FIG.
  • Here, a camera 10 installed to capture the environment in front of the vehicle will be described as an example.
  • a rear camera that performs rear monitoring and a camera 10 that performs side monitoring may coexist.
  • the preceding vehicle will be described as an example of the obstacle that blocks the landmark 63.
  • Needless to say, the obstacle may be a succeeding vehicle in the case of the camera 10 for monitoring the rear, or a vehicle running alongside in the case of the camera 10 for monitoring the side. Objects other than vehicles may also act as obstacles.
  • a part or all of a front camera, a rear camera, and a side camera can be adopted as the surrounding monitoring sensor configuring the map system 1.
  • a rear camera that captures an image of a predetermined rear area corresponds to a rear monitoring device.
  • Step S600 is a step in which the main processor 40 specifies a rough position of the own vehicle by GPS or the like. By this step, the area where the own vehicle exists is grasped.
  • Step S601 is executed.
  • Step S601 is a step in which the main processor 40 detects a preceding vehicle based on an image captured by the camera 10.
  • Step S602 is a step in which the main processor 40 acquires the vehicle type of the preceding vehicle.
  • The vehicle type is determined based on target silhouette information or the like obtained from the image, with reference to a database recorded in the memory 80 or stored in the server 3.
  • Step S603 is a step in which the main processor 40 acquires the vehicle height of the preceding vehicle that becomes an obstacle based on the vehicle type.
  • the vehicle height information is associated with the vehicle type, and the vehicle height corresponding to the vehicle type of the preceding vehicle is acquired.
  • the information on the vehicle height may be calculated from the captured image.
  • Step S604 is a step in which the main processor 40 determines whether or not the preceding vehicle as an obstacle is a tall vehicle.
  • The determination as to whether the preceding vehicle is a tall vehicle is made, for example, by comparing the vehicle height acquired in step S603 with a predetermined threshold, and determining that the preceding vehicle is a tall vehicle when the vehicle height is higher than the threshold.
  • a vehicle type classified as a tall vehicle may be determined in advance, and if the preceding vehicle is the corresponding vehicle type, it may be determined that the vehicle is a tall vehicle.
  • step S603 of acquiring the vehicle height can be omitted.
  • the vehicle type determined as a tall vehicle corresponds to, for example, a truck or a fire truck. If it is determined in step S604 that the preceding vehicle is a tall vehicle, the process proceeds to step S605.
  • Step S605 is a step in which the main processor 40 controls the actuator 70 to change the relative position between the host vehicle and the obstacle so that the camera 10 can recognize the landmark 63.
  • the actuator 70 is a braking device, and the main processor 40 drives the braking device to perform braking of the host vehicle.
  • the inter-vehicle distance between the host vehicle and the preceding vehicle, which is an obstacle increases, and the area occupied by the preceding vehicle with respect to the angle of view decreases. This makes it possible to realize a situation in which the landmark 63 such as a sign is easily reflected in the angle of view, so that the main processor 40 can recognize the landmark 63 and calculate the coordinates of the landmark 63.
  • By going through step S605, the detection frequency of the landmark 63 can be improved. Accordingly, the frequency of calculating the coordinates of the landmark 63 from the image is also improved, so that collation with the coordinates of the landmark 63 included in the map information can be performed for a longer time, and the position of the own vehicle can be specified more accurately.
  • the control of increasing the inter-vehicle distance from the preceding vehicle by deceleration or the like corresponds to an example of vehicle control for making it easier for the surroundings monitoring sensor to detect a landmark.
  • The actuator 70 controlled by the main processor 40 is not limited to the braking device, and may be, for example, the steering. Specifically, when the preceding vehicle is a tall vehicle, the main processor 40 may change the lane by controlling the steering to create a situation where there is no preceding vehicle serving as an obstacle in front of the own vehicle.
  • After step S605, this flow ends. If it is determined in step S604 that the preceding vehicle is not a tall vehicle, this flow also ends.
  • With this control, the map system 1 also increases the frequency of calculating the coordinates of the landmark 63 from the image, so that collation with the coordinates of the landmark 63 included in the map information can be performed for a longer time, and the position of the own vehicle can be specified more accurately.
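The decision of steps S604 and S605 can be sketched as follows. The 2.5 m tall-vehicle threshold is an assumed example value (trucks and fire trucks, which the text names, typically exceed it), and the action labels are illustrative; the disclosure leaves the concrete threshold unspecified.

```python
def plan_landmark_visibility_action(preceding_height_m, lane_change_possible,
                                    tall_threshold_m=2.5):
    """Sketch of S604/S605: if the preceding vehicle is tall, change the
    relative position to it (via braking or steering) so that landmarks
    such as signs can enter the camera's angle of view."""
    if preceding_height_m <= tall_threshold_m:
        # S604 NO: the preceding vehicle is not tall; no action needed.
        return "none"
    # S605: actuator control. Steering (lane change) removes the obstacle
    # entirely; otherwise braking widens the inter-vehicle distance.
    return "change_lane" if lane_change_possible else "increase_gap"
```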
  • the map system 1 may control the vehicle so that the landmark 63 can be recognized based on the inter-vehicle distance measured in real time and the vehicle height calculated by image recognition.
  • An operation example of the map system 1 based on the technical idea will be described with reference to FIGS.
  • the camera 10 installed to capture the environment in front of the vehicle will be described as an example.
  • the camera 10 that performs the rear monitoring and the side monitoring may coexist.
  • the preceding vehicle will be described as an example of the obstacle that blocks the landmark 63.
  • Needless to say, the obstacle may be a succeeding vehicle in the case of the camera 10 for monitoring the rear, or a vehicle running alongside in the case of the camera 10 for monitoring the side.
  • the obstacle may be an object other than the vehicle.
  • Step S700 is a step in which the main processor 40 specifies a rough position of the own vehicle by GPS or the like. By this step, the area where the own vehicle exists is grasped.
  • Step S701 is a step in which the main processor 40 detects a preceding vehicle based on an image captured by the camera 10.
  • Step S702 is a step in which the main processor 40 measures the distance to the preceding vehicle, that is, the inter-vehicle distance.
  • The inter-vehicle distance can be measured by a radar, a LiDAR, or a fusion configuration of these with an imaging device.
  • Step S703 is a step in which the main processor 40 measures the height of the preceding vehicle.
  • The height of the preceding vehicle can be uniquely measured based on the distance to the preceding vehicle acquired in step S702 and the V-direction coordinate, on the image captured by the camera 10, of the upper end of the preceding vehicle.
  • Step S704 is a step in which the main processor 40 acquires the coordinates of the landmark 63 assumed to be within the angle of view from the map information.
  • the main processor 40 specifies the area where the vehicle is located from the approximate position of the own vehicle specified in step S700, and reads the map tile corresponding to the area. Then, the coordinates of the landmark 63 recorded on the map tile are obtained.
  • The landmark 63 includes a white line (in other words, a lane mark). However, this control is more effective when the landmark 63 is a target that is easily hidden from the camera 10 by the large physique of the preceding vehicle; for example, an information display, a speed limit sign, and the like are suitable.
  • the coordinates of the landmark 63 include, for example, coordinate information of the four corners of the rectangle if the landmark 63 is a rectangular plate shape orthogonal to the traveling direction of the vehicle.
  • Step S705 is a step of determining whether or not the preceding vehicle is at a position that blocks the landmark 63 that is supposed to be within the angle of view.
  • Specifically, the blind spot of the camera 10 created by the preceding vehicle (the hatched portion in FIG. 15) is determined based on the inter-vehicle distance acquired in step S702, the height of the preceding vehicle acquired in step S703, and the angle of view of the camera 10 mounted on the own vehicle. If at least a part of the coordinates forming the landmark 63 is included in the blind spot, it is determined that the preceding vehicle blocks the landmark 63, and the determination in this step is YES.
  • the example shown in FIG. 15 is an example in which all the landmarks 63 are included in the blind spot created by the preceding vehicle and this step is determined as YES.
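The blind-spot test of step S705 can be reduced, in a vertical plane, to a similar-triangles sight line over the roof of the preceding vehicle: for a camera at height hc and a preceding vehicle of height hv at gap d, a point at distance x > d is hidden if it lies below hc + (hv − hc)·x/d. The sketch below also computes the minimum gap at which the top of a landmark clears the roof line (the basis of the braking in step S706). All heights, distances, and function names are assumed example values; the actual determination in the text also involves the camera's angle of view and full 3D coordinates.

```python
def landmark_top_hidden(cam_h, gap, veh_h, lm_dist, lm_top_h):
    """True if the top of a landmark lies inside the blind spot cast by
    the preceding vehicle (sight line from the camera over its roof)."""
    if lm_dist <= gap or veh_h <= cam_h:
        return False  # landmark is nearer than the vehicle, or roof below camera
    # Height of the sight line over the roof edge at the landmark's distance.
    sight_h = cam_h + (veh_h - cam_h) * lm_dist / gap
    return lm_top_h < sight_h

def min_gap_to_see_top(cam_h, veh_h, lm_dist, lm_top_h):
    """Inter-vehicle distance at which the landmark's top just clears
    the roof line; braking until the gap exceeds this makes it visible."""
    if lm_top_h <= cam_h:
        return float("inf")  # can never be seen over the roof
    return (veh_h - cam_h) * lm_dist / (lm_top_h - cam_h)
```

For example, with the camera at 1.5 m, a 3.5 m truck 10 m ahead, and a sign 40 m away whose top is at 5 m, the sign is hidden; widening the gap beyond about 22.9 m brings its top out of the blind spot.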
  • Step S706 is a step in which the main processor 40 controls the actuator 70 to change the relative position between the host vehicle and the obstacle so that the landmark 63 can be recognized by the camera 10.
  • the actuator 70 is a braking device, and the main processor 40 drives the braking device to perform braking of the host vehicle.
  • The main processor 40 increases the inter-vehicle distance between the host vehicle and the preceding vehicle by this braking so that the landmark 63 comes entirely out of the blind spot. Specifically, the host vehicle is braked until the inter-vehicle distance becomes such that the whole of the landmark 63, from its upper end to its lower end, can be visually recognized above the upper end of the preceding vehicle. As will be described later, a configuration in which braking is performed only until a part of the landmark 63 can be visually recognized is also possible. Accordingly, the main processor 40 can recognize the landmark 63 and calculate its coordinates based on the image.
  • the landmark 63 can be easily recognized without acquiring the vehicle type. This is particularly effective when sufficient time for acquiring the vehicle type of the preceding vehicle cannot be secured due to a sudden interruption or the like, or when an obstacle other than the vehicle suddenly jumps in front of the vehicle.
  • Here as well, the actuator 70 controlled by the main processor 40 is not limited to the braking device, and may be, for example, the steering. Specifically, when the preceding vehicle is a tall vehicle, the main processor 40 may change the lane by controlling the steering to create a situation where there is no preceding vehicle serving as an obstacle in front of the own vehicle.
  • Various controls such as deceleration, lane change, and position change in the lane can be adopted as vehicle control (hereinafter, detection rate improvement control) for making it easier for the peripheral monitoring sensor to detect landmarks.
  • detection rate improvement control corresponds to control for reducing a possibility that a state in which a landmark cannot be recognized continues.
  • In the above description, it is determined that the preceding vehicle blocks the landmark 63 if the blind spot of the preceding vehicle includes even a part of the landmark 63. However, if a part of the landmark 63 can be visually recognized outside the blind spot, it may instead be determined that the preceding vehicle does not block the landmark 63. Alternatively, these criteria may be made variable depending on the type of the landmark 63.
  • When a landmark can be captured by another camera, the vehicle control for facilitating the detection of the landmark by the periphery monitoring sensor need not be performed. For example, even if the preceding vehicle is a tall vehicle or is located at a position blocking the landmark in front of the host vehicle, a configuration may be adopted in which vehicle control such as expansion of the inter-vehicle distance or a lane change is not performed if the rear camera can capture a landmark behind the vehicle.
  • the detection rate improvement control is not limited to the case where the preceding vehicle is a tall vehicle or the case where the preceding vehicle blocks a landmark in front of the own vehicle. More specifically, when the inter-vehicle distance with the preceding vehicle is less than a predetermined distance (for example, 20 m), the detection rate improvement control may be executed.
  • This configuration corresponds to a configuration in which the detection rate improvement control is executed when a preceding vehicle as an obstacle exists in a predetermined area (in this case, an area within 20 m ahead of the vehicle) within the imaging range of the camera 10.
  • When the angle formed with respect to the road surface by the straight line from the camera 10 to the upper end of the rear surface of the preceding vehicle (hereinafter, the look-up angle θ) is equal to or greater than a predetermined threshold (for example, 15 degrees), the detection rate improvement control may be executed. In addition, if the ratio occupied by the portion corresponding to the preceding vehicle in the image frame is equal to or more than a predetermined threshold (for example, 30%), the detection rate improvement control such as deceleration may be executed.
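As a rough numeric illustration of the look-up-angle and occupancy-ratio criteria above, the following sketch checks both triggers. All names, the camera height, and the concrete values other than the 15-degree and 30% thresholds mentioned in the text are illustrative assumptions, not part of the disclosure.

```python
import math

LOOKUP_ANGLE_THRESH_DEG = 15.0   # threshold for the look-up angle theta
OCCUPANCY_RATIO_THRESH = 0.30    # threshold for the preceding-vehicle image ratio

def look_up_angle_deg(camera_height_m, vehicle_top_height_m, distance_m):
    """Angle, measured from the road surface, of the line from the camera
    to the upper end of the preceding vehicle's rear surface."""
    return math.degrees(math.atan2(vehicle_top_height_m - camera_height_m,
                                   distance_m))

def needs_detection_rate_improvement(camera_height_m, vehicle_top_height_m,
                                     distance_m, occupancy_ratio):
    angle = look_up_angle_deg(camera_height_m, vehicle_top_height_m, distance_m)
    return (angle >= LOOKUP_ANGLE_THRESH_DEG
            or occupancy_ratio >= OCCUPANCY_RATIO_THRESH)
```

For example, a truck whose rear top edge is 3.8 m high, seen from a 1.3 m-high camera at 8 m, yields a look-up angle of about 17 degrees, so the control would be triggered under these assumed values.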
  • the detection rate improvement control may be executed when the detection success rate for a landmark to be originally observed becomes less than a predetermined threshold.
  • The detection success rate may be expressed as the ratio of the number of times the landmark was successfully detected to the number of times the landmark should have been observable within a certain period of time.
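A minimal sketch of this success-rate criterion follows; the 0.5 threshold and the function names are illustrative assumptions.

```python
def detection_success_rate(successes, opportunities):
    """Ratio of successful landmark detections to the number of times the
    landmark should have been observable within a certain period."""
    if opportunities == 0:
        return None  # no landmark was expected; the rate is undefined
    return successes / opportunities

SUCCESS_RATE_THRESH = 0.5  # assumed threshold for illustration

def should_run_improvement_control(successes, opportunities):
    rate = detection_success_rate(successes, opportunities)
    return rate is not None and rate < SUCCESS_RATE_THRESH
```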
  • the main processor 40 may be configured to acquire the current position of the other vehicle and the peripheral image acquired by the other vehicle from the other vehicle by inter-vehicle communication as the detection rate improvement control. According to such a configuration, the main processor 40 can detect a landmark based on a peripheral image provided from another vehicle. Further, the own vehicle can be indirectly localized based on the position information of the landmark, the position information of the other vehicle, and the relative position of the own vehicle with respect to the other vehicle.
  • the main processor 40 may be configured to acquire a localization result (that is, detailed position information of another vehicle) of another vehicle (for example, a preceding vehicle) by inter-vehicle communication as detection rate improvement control. According to such a configuration, the main processor 40 indirectly performs, based on the detailed position information of the other vehicle (hereinafter referred to as a reference vehicle) that is the provider of the localization result and the relative position of the own vehicle with respect to the reference vehicle. You can localize your own vehicle. The relative position of the host vehicle with respect to the reference vehicle may be specified based on the detection result of the peripheral monitoring sensor such as the camera 10.
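The indirect localization described above can be sketched as a simple vector addition: the own-vehicle position is the reference vehicle's detailed position (received by inter-vehicle communication) plus the relative position of the own vehicle with respect to that vehicle, as measured by a peripheral monitoring sensor. A flat local map frame is assumed here; all names are illustrative.

```python
def localize_indirectly(ref_pos, rel_own_to_ref):
    """ref_pos: (x, y) of the reference vehicle in a local map frame.
    rel_own_to_ref: (dx, dy) from the reference vehicle to the own vehicle
    in the same frame, e.g. measured by the camera 10 or radar."""
    return (ref_pos[0] + rel_own_to_ref[0],
            ref_pos[1] + rel_own_to_ref[1])
```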
  • In the above description, an example has been described in which the main processor 40 performs localization using a landmark present in front of the host vehicle (in other words, using a front camera image).
  • However, localization may be performed using the image of a rear camera serving as a rear monitoring device. That is, as the detection rate improvement control, the peripheral monitoring sensor used for detecting landmarks may be changed. Changing the number or combination of peripheral monitoring sensors used for detecting landmarks also corresponds to an example of the detection rate improvement control.
  • the above control may be configured to be executed on condition that the host vehicle is traveling on a predetermined road (for example, a general road).
  • the distance between other vehicles is shorter on a general road than on a highway, and landmarks are less visible. Therefore, the above control may be more useful when traveling on a general road than when traveling on a highway.
  • the main processor 40 may be configured not to execute the above-described control when the traveling road corresponds to a predetermined road type (for example, a motorway). It is unlikely that other vehicles will make it difficult to detect landmarks while driving on a motorway such as an expressway. While traveling on the motorway, the load on the main processor 40 can be reduced by canceling the detection rate improvement control.
  • the main processor 40 may be configured to make the setting distance of the ACC longer when the map utilization function is activated than when the map utilization function is not activated. According to this configuration, it is possible to further reduce the possibility that the detection of the landmark becomes difficult.
  • the situation in which the coordinates of the landmark 63 are difficult to specify is not limited to the case where there is an obstacle that hinders the imaging of the landmark 63 around the own vehicle. For example, even when the surrounding environment of the vehicle is relatively dark, such as in a tunnel or at night, it may be difficult for the camera 10 to detect the landmark 63 and calculate the coordinates.
  • the map system 1 may be configured to facilitate the detection of the landmark 63 and the calculation of the coordinates even when the surrounding environment of the vehicle is relatively dark, and thus to more accurately specify the position of the vehicle.
  • control of a headlight installed to irradiate illumination light in front of a vehicle will be described as an example.
  • In addition to the light that irradiates the front of the vehicle, lights that irradiate the rear and the sides of the vehicle may be provided, and the control target may be the lights that irradiate the rear and the sides of the vehicle.
  • Step S800 is a step in which it is determined whether an application that uses map information is running.
  • the application using the map information is, for example, an automatic steering realized by identifying the position of the host vehicle by comparing the coordinates of the landmark 63 calculated based on the image with the map information. If the application using the map information has not been executed, the determination in this step is NO, and this flow ends. If the application has been executed, the determination in this step is YES, and the process proceeds to step S801.
  • Step S801 is a step of determining whether or not headlight control in the vehicle is set to the auto mode.
  • The state in which the auto mode is set is a state in which the light distribution control of the headlights, such as up/down or left/right, is performed automatically, for example by an active high beam system (AHS).
  • Step S802 is a step in which the main processor 40 determines whether or not the brightness of the surrounding environment of the vehicle is equal to or less than a predetermined threshold. Specifically, the illuminance detected by the illuminance sensor mounted on the vehicle is compared with a predetermined threshold. If the illuminance is larger than the threshold value, it is determined that the headlight does not necessarily need to be turned on, so that the determination in this step is NO, and this flow ends. On the other hand, if the illuminance is equal to or smaller than the threshold, the process proceeds to step S803, and the headlight is turned on.
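The gating logic of steps S800 through S803 can be sketched as follows; the concrete illuminance threshold is an illustrative assumption, as the text only states that a predetermined threshold is used.

```python
ILLUMINANCE_THRESH_LUX = 1000.0  # assumed value of the predetermined threshold

def decide_headlight_on(app_running, auto_mode, illuminance_lux):
    if not app_running:   # S800: is a map-using application running?
        return False
    if not auto_mode:     # S801: is headlight control set to auto mode?
        return False
    # S802: is the ambient brightness at or below the threshold?
    # When True, the headlight is turned on (S803).
    return illuminance_lux <= ILLUMINANCE_THRESH_LUX
```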
  • Step S804 is a step in which the main processor 40 determines whether there is a preceding vehicle or an oncoming vehicle to the own vehicle.
  • the presence of the preceding vehicle is recognized by detecting the light of the rear light from an image captured by the camera 10, for example.
  • the presence of a preceding vehicle is recognized by a fusion configuration using the camera 10 and radar or LIDAR.
  • the presence of the oncoming vehicle is detected by detecting the light of the headlight from an image captured by the camera 10, for example.
  • the presence of an oncoming vehicle is recognized by a fusion configuration using the camera 10 and radar or LIDAR.
  • Step S805 is a step of setting the irradiation mode of the headlight to the low beam.
  • In the low beam mode, in order to suppress glare for the driver of a preceding vehicle or an oncoming vehicle, the irradiation direction of the headlights is adjusted so that the corresponding vehicle is not directly illuminated, at least in the direction where the preceding vehicle or the oncoming vehicle exists.
  • Step S806 is a step in which the main processor 40 controls the light distribution of the headlights so as to selectively emit illumination light in the direction in which the landmark 63 is assumed to exist.
  • the main processor 40 controls the headlight as the actuator 70 to maintain the headlight on the right side of the vehicle with a low beam so that the illumination light is not excessively applied to the oncoming lane on which the oncoming vehicle travels.
  • Meanwhile, the light distribution of some of the light sources of the headlight on the left side of the vehicle is controlled so that the illumination light does not excessively irradiate the preceding vehicle.
  • At the same time, part of the light sources of the headlight on the left side of the vehicle is controlled to be a high beam so that the illumination light is applied to the outside of the lane where the landmark 63 is likely to exist, that is, the area outside the lane including the shoulder on the side of the lane on which the vehicle runs. This makes it possible to irradiate the headlight illumination light in the direction in which the landmark 63 is expected to be present, while suppressing glare for the driver of the oncoming vehicle or the preceding vehicle.
  • As a result, the frequency of detecting the landmark 63 increases, and the frequency of calculating the coordinates of the landmark 63 from the image also improves. Therefore, the calculated coordinates can be collated with the coordinates of the landmark 63 included in the map information for a longer time, and the position of the own vehicle can be specified more accurately.
  • Step S807 is a step of setting the irradiation mode of the headlight to the high beam. In the high beam mode, since the preceding vehicle and the oncoming vehicle do not exist in the vicinity of the own vehicle, the irradiation direction of the headlight is adjusted so that a distant place can be visually recognized.
  • Step S808 is a step in which the main processor 40 controls the light distribution of the headlights so as to selectively emit illumination light in the direction in which the landmark 63 is assumed to be present.
  • a wide light distribution or a distant light distribution can be adopted as the light distribution of the headlight.
  • The wide light distribution is a light distribution mode that illuminates an area wider in the left-right direction than the irradiation range of the high beam.
  • the distant light distribution is a light distribution mode in which, during high-speed traveling, illumination light is concentrated farther than the high beam and the illumination light reaches farther away. This makes it possible to more easily detect the landmark 63 such as a destination sign even during high-speed traveling.
  • In the above-described map system 1, an example has been described in which a light distribution that makes the landmark 63 easy to detect is implemented on the condition that the light control is in the auto mode.
  • Even when the auto mode is not set, if the headlight is set to the low beam and the illuminance of the environment around the vehicle is equal to or less than the predetermined threshold, the map system 1 may propose to the driver to change the light distribution of the headlight to the high beam.
  • The light distribution change can be proposed, for example, by displaying a message on the HMI 60 or by notifying the driver by voice.
  • Landmarks without lighting or the like are difficult to recognize in images captured by the camera 10 at night. Therefore, while localization may be performed based on a wide variety of landmarks in the daytime, at night the landmarks used for localization are preferably limited to internally illuminated signs, signs with lighting such as street lights, traffic lights, electric bulletin boards, and the like.
  • the internally illuminated sign refers to a sign provided with a light source inside a sign plate.
  • For this purpose, the map data preferably include, as attribute information of each landmark, whether or not the landmark can be detected even at night. Whether or not detection is possible even at night may be set based on probe data collected at night. For example, a landmark detected with a predetermined probability in probe data collected at night may be set as a landmark that can be recognized even at night.
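The nighttime-detectability attribute described above can be sketched as a simple threshold on the detection probability in nighttime probe data; the 0.7 threshold and all names are illustrative assumptions.

```python
NIGHT_DETECTION_PROB_THRESH = 0.7  # assumed "predetermined probability"

def night_detectable(night_detections, night_observations):
    """True if the landmark was detected with at least the predetermined
    probability in probe data collected at night."""
    if night_observations == 0:
        return False  # no nighttime probe data; do not rely on it at night
    return night_detections / night_observations >= NIGHT_DETECTION_PROB_THRESH
```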
  • This map system 1 realizes downloading of map tiles in different flows depending on whether or not a destination to which the vehicle should go is set.
  • Step S900 is a step in which the main processor 40 specifies a rough position of the own vehicle by GPS or the like. By this step, the area where the own vehicle exists is grasped.
  • the series of processing illustrated in FIG. 19 may be started by turning on the traveling power supply as a trigger.
  • the series of processes illustrated in FIG. 19 may be started when the map utilization function or the automatic driving function is activated.
  • Step S901 is a step in which the main processor 40 downloads a map tile corresponding to the area where the vehicle is located.
  • the main processor 40 requests the server 3 to distribute the map data of the map tile corresponding to the own vehicle position.
  • the main processor 40 transmits the tile ID of the area to which the own vehicle position belongs to the server 3.
  • the server 3 distributes the map tile requested by the vehicle.
  • When the server 3 does not have the map data of the requested area (for example, when the expiration date has expired), the server 3 returns a signal indicating that it does not have the map data of the requested area.
  • For example, a signal in which an invalid value (for example, "NULL") is set at a predetermined position in the data format is returned.
  • step S901 may be omitted.
  • necessary map data is specified from the vehicle side and requested to the server 3, but the distribution mode of the map data is not limited to this.
  • the vehicle may be configured to transmit its current location to the server 3, which determines the map data corresponding to the reported vehicle location and distributes it to the vehicle.
  • FIG. 20 shows an example of a map tile.
  • FIG. 20 shows 25 map tiles arranged in a 5-by-5 grid.
  • a unique ID is assigned to each of the map tiles stored in the server 3, but here, for convenience, serial symbols a to y are assigned to the 25 map tiles.
  • the map tile m corresponds to the first tile.
  • the map tile corresponding to the area where the vehicle is located is particularly referred to as a first tile.
  • Step S902 is a step in which the main processor 40 divides the first tile into subtiles. As shown in FIG. 20, the main processor 40 divides the map tile m, which is the first tile, into four rectangular areas and executes the subsequent processing on that basis.
  • Step S903 is a step in which the main processor 40 specifies a subtile to which the host vehicle belongs from among the plurality of subtiles.
  • the own vehicle belongs to the upper right sub-tile among the divided map tiles m.
  • Step S904 is a step of designating a map tile adjacent to the subtile to which the own vehicle belongs as a download target.
  • When the first tile is the map tile m and the subtile to which the host vehicle belongs is the subtile located at the upper right, the map tiles designated as download targets in step S904 are the map tiles h, i, and n.
  • the map tile adjacent to the sub tile corresponds to a candidate for a map tile that can be moved next when the vehicle moves across the map tile.
  • the map tile adjacent to the subtile corresponds to a map tile through which the vehicle may pass.
  • a map tile existing at a position where the vehicle can enter within a predetermined time may correspond to a map tile having a relatively high possibility that the vehicle will pass.
  • a map tile existing within a predetermined distance from the current position of the vehicle may also correspond to a map tile to which the vehicle may pass.
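The subtile-based neighbor selection of steps S902 through S904 can be sketched on the tile grid of FIG. 20. The row-major assignment of the labels a through y to a 5-by-5 grid is an assumption made for illustration; only the example result (tiles h, i, and n for the upper-right subtile of tile m) comes from the text.

```python
import string

GRID = 5
LABELS = string.ascii_lowercase[:GRID * GRID]  # 'a'..'y', row-major

def adjacent_tiles(tile, quadrant):
    """tile: label of the first tile. quadrant: (row, col) in {0,1}^2,
    where (0, 1) is the upper-right subtile. Returns the labels of the
    map tiles adjacent to that subtile (the download candidates)."""
    idx = LABELS.index(tile)
    r, c = divmod(idx, GRID)
    dr = -1 if quadrant[0] == 0 else 1   # toward the top or bottom edge
    dc = -1 if quadrant[1] == 0 else 1   # toward the left or right edge
    out = []
    for rr, cc in ((r + dr, c), (r, c + dc), (r + dr, c + dc)):
        if 0 <= rr < GRID and 0 <= cc < GRID:
            out.append(LABELS[rr * GRID + cc])
    return sorted(out)
```

With the vehicle in the upper-right quadrant of tile m this yields h, i, and n, and in the lower-right quadrant it yields n, r, and s, matching the examples in the text.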
  • Step S905 is a step of downloading a map tile that is designated as a download target and is not cached in the memory 80 (that is, a map tile that has not been acquired).
  • the map tiles h, i, and n correspond to the download target. If there is any of these that has already been downloaded and stored in the memory 80, the corresponding map tile is not downloaded.
  • Step S906 is a step of caching the downloaded map tile in the memory 80. As long as the corresponding data remains in the memory 80, the cached map tile can be used without downloading.
  • Step S907 is a step of determining whether or not the vehicle has moved to a second tile different from the first tile. For example, if the vehicle moves from the map tile m to the map tile i, the determination in this step is YES. In this example, the map tile i corresponds to the second tile. If the vehicle continues to exist on the first tile, the process of step S907 is continued. If the determination in this step is YES, the process proceeds to step S908.
  • Step S908 is a step of designating map tiles around the second tile as download targets. Assuming that the vehicle has moved from map tile m to map tile i, the second tile is map tile i, and the map tiles specified as download targets are map tiles c, d, e, h, j, m, n. , O.
  • Step S909 is a step of downloading a map tile designated as a download target and not cached in the memory 80.
  • In this case, the eight map tiles c, d, e, h, j, m, n, and o are to be downloaded, but the map tiles h, m, and n are not downloaded because they were already downloaded and cached in the previous step. That is, the number of map tiles actually downloaded in step S909 is five. As long as the vehicle runs continuously after the first tile is set, the maximum number of map tiles downloaded after moving to a second tile is five in any situation.
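Steps S907 through S909 can be sketched as follows: the eight tiles surrounding the second tile become download targets, and only the uncached ones are fetched. As above, the row-major 5-by-5 labeling a through y is an assumption for illustration.

```python
import string

GRID = 5
LABELS = string.ascii_lowercase[:GRID * GRID]  # 'a'..'y', row-major

def surrounding_tiles(tile):
    """The up-to-eight map tiles surrounding the given tile."""
    idx = LABELS.index(tile)
    r, c = divmod(idx, GRID)
    out = []
    for rr in (r - 1, r, r + 1):
        for cc in (c - 1, c, c + 1):
            if (rr, cc) != (r, c) and 0 <= rr < GRID and 0 <= cc < GRID:
                out.append(LABELS[rr * GRID + cc])
    return out

def tiles_to_download(second_tile, cache):
    """Download targets around the second tile, excluding cached tiles."""
    return sorted(t for t in surrounding_tiles(second_tile) if t not in cache)
```

After moving from tile m to tile i with h, m, and n already cached, only c, d, e, j, and o remain to be downloaded, reproducing the count of five in the text.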
  • Step S910 is a step of caching the downloaded map tile in the memory 80. As long as the corresponding data remains in the memory 80, the cached map tile can be used without downloading.
  • Step S911 is a step in which the main processor 40 determines whether an application that requires map information is being executed.
  • The application that requires map information is an application related to map utilization, such as automatic steering control performed while specifying the position of the vehicle. While an application requiring the map information is running, it is necessary to continuously download the map tiles at the destination where the vehicle travels, and the flow from step S907 to step S911 is repeated. That is, map tiles around the destination map tile are set as download candidates, and download of uncached map tiles is continuously executed. On the other hand, if the application utilizing the map information has been stopped and is not running at the time of execution of step S911, this flow ends.
  • For example, the map tiles to be downloaded are the map tiles h, i, and n when the vehicle is located in the upper right subtile, and the map tiles n, r, and s when the vehicle moves to the lower right subtile. If the vehicle subsequently moves to the map tile r, the map tiles downloaded in step S909 are the five map tiles l, q, v, w, and x.
  • map tiles corresponding to the area where the vehicle is assumed to run can be exhaustively downloaded with a minimum number of downloads.
  • That is, the download policy in the case where the destination is not set after the power for traveling is turned on corresponds to a configuration in which the three map tiles adjacent to the subtile to which the vehicle belongs, among the four subtiles obtained by dividing the first tile, and the first tile itself are set as download targets.
  • The processing related to the map download described above only needs to be executed when triggered by the activation of the automatic driving function or of the map utilization function based on a user input to the HMI 60 after the power for driving is turned on.
  • The first tile is, in one aspect, the map tile corresponding to the position of the vehicle at the time when the driving power is turned on.
  • In another aspect, the first tile corresponds to the map tile to which the vehicle position belongs at the time when the automatic driving function or the map utilization function is activated.
  • Step S920 is a step in which the main processor 40 specifies a rough position of the own vehicle by GPS or the like. By this step, the area where the own vehicle exists is grasped. In the example shown in FIG. 22, the position of the host vehicle is indicated by a point A (black diamond).
  • Step S921 is a step of acquiring global coordinates of the set destination.
  • the destination can be set by an automatic instruction from an external instruction system or other means in addition to the active instruction of the driver as the user.
  • the destination may be set by the map system 1 receiving the destination set by the mobile communication device in addition to the operation by the car navigation system mounted on the vehicle. In the example shown in FIG. 22, the destination is indicated by a point B (white diamond).
  • Step S922 is a step of calculating the main route L based on the position of the own vehicle specified in step S920 and the coordinates of the destination acquired in step S921.
  • the main route L is a traveling route recommended for the own vehicle to move from the current position to the destination so as to satisfy the specified condition. In the example shown in FIG. 22, it is shown as a solid line connecting the current position A and the destination B.
  • Step S923 is a step of calculating the branch road R.
  • the branch road R is a route connected to the main route L, and is a route that may cause a vehicle to leave the main route L and travel.
  • The branch road R includes a first branch road R1 that branches directly from the main route L, and a second branch road R2 that is not directly connected to the main route L but branches from the first branch road R1.
  • Step S924 is a step of designating the map tiles to which the main route L and the branch road R belong as download targets. All map tiles to which the main route L belongs are to be downloaded. On the other hand, as for the map tiles to which the branch road R belongs, two tiles that are continuous from the map tile to which the main route L belongs are designated as download targets. Note that, regarding the map tiles related to the branch road R, the number of tiles continuous from the map tile to which the main route L belongs is not limited to two; two tiles is merely one example. In the example shown in FIG. 22, the map tiles to be downloaded are hatched.
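The target designation of step S924 can be sketched as follows: every tile on the main route L is a target, while each branch road contributes only its first few tiles counted outward from the main route. The tile IDs in the example and all names are illustrative assumptions.

```python
def download_targets(main_route_tiles, branch_tile_sequences, branch_depth=2):
    """main_route_tiles: ordered tile IDs along the main route L.
    branch_tile_sequences: per branch road, the tile IDs it passes
    through, ordered outward from the main route. Only the first
    branch_depth tiles of each branch are designated (two by default,
    per the example in the text)."""
    targets = list(dict.fromkeys(main_route_tiles))  # keep order, dedupe
    for seq in branch_tile_sequences:
        for tile in seq[:branch_depth]:
            if tile not in targets:
                targets.append(tile)
    return targets
```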
  • Step S925 is a step of downloading a map tile designated as a download target and not cached in the memory 80. If there is a previously downloaded one stored in the memory 80, the corresponding map tile is not downloaded.
  • Step S926 is a step of caching the downloaded map tile in the memory 80. As long as the corresponding data remains in the memory 80, the cached map tile can be used without downloading.
  • A download priority may be set for the plurality of map tiles designated as download targets. For example, a map tile closer to the map tile to which the vehicle belongs preferably has a higher download priority.
  • According to this configuration, the map tiles are downloaded in the order in which the vehicle is expected to reach them, so that the map tiles can be downloaded efficiently and without omission while effectively utilizing the communication band.
  • the map tile to which the main route L belongs is downloaded with higher priority than the map tile to which the branch route R belongs. Since the probability that the vehicle travels on the main route L is higher than that of the branch route R, the map tile can be efficiently downloaded while effectively utilizing the communication band.
  • When there is an area in which the communication state between the vehicle and the server 3 is known in advance to deteriorate among the routes where the vehicle is predicted to travel, including the main route L and the branch road R, it is preferable to download the map tile corresponding to that area with priority.
  • An area where the communication state deteriorates is, for example, a difficult communication section such as a mountain area or a tunnel.
  • In that case, the map tile of the mountain area or tunnel section (that is, the difficult communication section) in which the vehicle is scheduled to travel is preferably downloaded while the vehicle is still traveling in an urban area having a good communication state.
  • The download priority may be set higher, for example, in the order of the map tile corresponding to the current position (that is, the first tile), the map tiles adjacent to the first tile through which the main route passes, and the map tiles of the difficult communication section.
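A minimal sketch of this priority ordering follows; the priority classes come from the text, while the function names and tile IDs are illustrative assumptions.

```python
def prioritize(targets, current_tile, adjacent_main_route, difficult_sections):
    """Order download targets: the current-position tile first, then
    main-route tiles adjacent to it, then tiles covering known
    difficult-communication sections, then everything else."""
    def rank(tile):
        if tile == current_tile:
            return 0
        if tile in adjacent_main_route:
            return 1
        if tile in difficult_sections:
            return 2
        return 3
    return sorted(targets, key=rank)
```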
  • In the above, an example has been described in which the map tiles corresponding to both the main route L and the branch road R are designated as download targets and downloaded. However, a system that downloads only the map tiles corresponding to the main route L may be adopted.
  • not all routes connecting to the main route L need to be adopted as the branch routes R.
  • a narrow street connected to the main route may not be used as the branch road R.
  • a road having the same rank or higher as the road constituting the main route may be set as the branch road R.
  • a road having the same rank as a certain road indicates, for example, a road having the same road type (national road, prefectural road, narrow street).
  • For example, when the main route is a prefectural road, roads of equal or higher rank include national roads and prefectural roads.
  • In the United States, road types can be classified into interstate highways, US highways, state roads, and municipal roads, in descending order of road rank.
  • a road having the same number of lanes or more than the main route may be adopted as a road of the same rank or higher.
  • Whether or not to adopt a road connected to the main route as the branch road R may be determined by comparing the road scale of the main route near the connection point (substantially, the intersection) with the scale of the connecting road.
  • the road scale corresponds to the above-mentioned road rank and the number of lanes.
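The branch-road adoption test described above can be sketched as a rank comparison. The rank table below uses the US classification mentioned in the text; the numeric values and function names are illustrative assumptions.

```python
ROAD_RANK = {
    "interstate": 4,      # highest rank
    "us_highway": 3,
    "state_road": 2,
    "municipal_road": 1,
    "narrow_street": 0,   # lowest rank
}

def adopt_as_branch(main_route_road_type, connecting_road_type):
    """Adopt the connecting road as a branch road R only if its rank is
    equal to or higher than that of the main-route road near the
    connection point."""
    return ROAD_RANK[connecting_road_type] >= ROAD_RANK[main_route_road_type]
```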
  • In the above, the configuration has been described in which, when the destination is set, the map tiles to which the roads on which the own vehicle is likely to travel belong are downloaded in advance, but the present disclosure is not limited to this.
  • Even when a destination is not set, map tiles through which the road on which the host vehicle travels passes may be set as download targets. More specifically, a predetermined number (for example, three) of the map tiles existing on the traveling-direction side of the current position, among the map tiles through which the road on which the host vehicle travels passes, may be set as download targets. According to such a configuration, even when a destination is not set, map tiles to which roads on which the host vehicle may travel belong can be downloaded in advance, similarly to the case where the destination is set.
  • the planned map tile download method described above may be applied only to static map data.
  • the dynamic map data of the map tiles that the vehicle may pass through may be downloaded all at once. This is because the dynamic map data can be expected to have a smaller data amount than the static map information.
  • the main processor 40 may be configured to change the timing and rules for downloading such data according to types such as static information and dynamic information. For example, the configuration may be such that dynamic map data is sequentially downloaded in real time, while static map data is downloaded on a monthly or weekly basis.
  • the various configurations and methods described above can be applied not only to a case where the vehicle is traveling on a dedicated road such as an expressway, but also to a case where the vehicle is traveling on a general road.
  • When the traveling power is turned off, it is preferable that the main processor 40 stores the vehicle position coordinates specified at that time in the storage unit 82 or another storage device. With this, even when the vehicle is parked in a location where GPS radio waves do not reach, the approximate position of the vehicle at the timing when the traveling power is turned on can be specified by referring to the position information recorded in the storage unit 82, and thus the first tile can be specified.
  • When such a functional restriction is applied, the user may be notified of the restriction via the HMI 60, for example, by displaying an icon on a display.
  • For example, the main processor 40 notifies the occupant via the HMI 60 that the automatic driving function of level 3 or higher cannot be executed.
  • the configuration of the in-vehicle system 2 constituting the map system 1 is not limited to the configuration shown in FIG.
  • the in-vehicle system 2 may be realized by using a front camera module 90, a locator 100, an Ethernet switch 110, and a communication module 50.
  • Ethernet is a registered trademark.
  • In FIG. 23, the illustration of the sensor 30 and the HMI 60 is omitted.
  • the front camera module 90 includes a camera body 91, a camera processor 92, and a memory (not shown).
  • the camera body 91 has a configuration corresponding to the camera 10 described above.
  • the camera processor 92 corresponds to the image processor 20. Further, as a more preferable embodiment, the camera processor 92 is configured to be able to perform position calculation (that is, localization) using the image recognition result and the map data complementarily.
  • the camera processor 92 has a function of controlling the vehicle (for example, steering control) using at least one of the image recognition result and the map data.
  • the front camera module 90 sequentially provides the locator 100 with data of feature information and vehicle information (for example, a current position and a yaw rate) as a result of image recognition (for example, every 100 milliseconds).
  • Data communication between the front camera module 90 and the locator 100 may be realized by CAN (Controller Area Network), FlexRay (registered trademark), Ethernet (registered trademark), USB, UART, or the like.
  • the locator 100 is a device that specifies a current position using map data provided from the server 3.
  • the locator 100 includes a locator processor 101, a volatile memory 102, and a nonvolatile memory 103.
  • Locator 100 has a function as positioning sensor 30a.
  • the locator processor 101 sequentially acquires the image recognition result and the vehicle information provided from the front camera module 90 and uploads them to the server 3 as probe data.
  • the locator 100 also downloads the map data corresponding to the vehicle position from the server 3 via the communication module 50 or the like, and stores the map data in the nonvolatile memory 103.
  • Such a locator 100 corresponds, in one aspect, to an ECU that executes processing related to transmission and reception of map data.
  • the locator 100 sequentially develops, in the volatile memory 102, the data of the section of the map downloaded from the server 3 in which the vehicle travels, and provides the data to the front camera module 90.
  • the developed map data around the own vehicle is used for localization, steering control, and the like by the front camera module 90. Note that localization and steering control may be performed by the locator 100 instead of the front camera module 90.
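The development of downloaded map data into data for the section the vehicle travels can be sketched as a simple filter over road segments. The one-dimensional along-road offset model below is a simplification introduced for illustration; real segments carry detailed geometry such as curvature, gradient, and width.

```python
def develop_front_map(tile_segments, current_offset_m, readout_range_m=200.0):
    """Select the road segments within the readout range ahead of the
    vehicle; these form the front map data handed to the front camera
    module. Segments are modelled as dicts with a 1-D along-road start
    offset, which is a simplification for illustration.
    """
    return [
        seg for seg in tile_segments
        if current_offset_m <= seg["offset_m"] < current_offset_m + readout_range_m
    ]
```

The locator 100 would re-run a selection of this kind as the vehicle position advances, so that the front camera module always holds current data for the road ahead.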
  • the arrangement of functions included in each configuration can be changed as appropriate.
  • the section that the vehicle travels refers to, for example, a road within a predetermined distance from the current position in the traveling direction (basically forward) of the vehicle.
  • the predetermined distance here is a parameter that defines the map data readout range, and is, for example, 200 m.
  • the readout range may instead be, for example, 100 m, 400 m, or 500 m in front of the host vehicle.
  • the readout range may also be adjusted according to the vehicle speed or the type of traveling road. For example, the higher the vehicle speed, the longer the readout range is set. Further, when the traveling road is an expressway, the readout range may be set longer than when it is a general road.
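The speed- and road-type-dependent adjustment of the readout range can be sketched as follows. The 200 m default and the 100-500 m bounds come from the examples in the text; the 60 km/h threshold and the 2 m-per-km/h slope are assumptions introduced for illustration, not values given in the specification.

```python
def readout_range_m(vehicle_speed_kph: float, road_type: str) -> float:
    """Return the forward map-data readout range in meters.

    200 m default and 100-500 m bounds follow the text's examples;
    the speed threshold and slope are illustrative assumptions.
    """
    base = 200.0  # default readout range (example value from the text)
    if road_type == "expressway":
        base = 400.0  # longer range on expressways than on general roads
    # The higher the vehicle speed, the longer the readout range:
    # here a simple linear increase above 60 km/h.
    if vehicle_speed_kph > 60.0:
        base += (vehicle_speed_kph - 60.0) * 2.0
    # Clamp to the 100-500 m span mentioned as examples in the text.
    return max(100.0, min(base, 500.0))
```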
  • local map data including detailed road shape information (curvature, gradient, width, etc.) within a predetermined distance ahead of the vehicle is referred to as front map data.
  • the forward map data corresponds to map data around the current position.
  • the locator 100 is connected to the communication module 50 via, for example, an Ethernet switch 110.
  • the locator 100 is also communicably connected to an automatic driving ECU, a body ECU, a driving support device, and the like via the Ethernet switch 110 or directly.
  • the locator 100 performs overall control of map data transmission and reception. Further, according to the configuration shown in FIG. 23, the image processor 20 and the locator processor 101 share the processing for which the main processor 40 was responsible, so the load on each processor can be reduced.
  • the front camera module 90 may be configured so that the captured image frame is output to the multimedia ECU as a continuous video signal.
  • the video signal may be transmitted in a predetermined format such as LVDS (Low Voltage Differential Signaling).
  • the Ethernet switch 110 existing between the locator 100 and the communication module 50 is an optional element.
  • Locator 100 and communication module 50 may be directly connected by a USB cable or the like.
  • locator 100 may be implemented with the aid of a navigation ECU or a multimedia ECU.
  • the navigation ECU is an ECU that executes map display and route guidance processing for the occupant.
  • the multimedia ECU is an ECU that provides functions such as audio, moving image reproduction, and web browsing. According to the configuration in which the function of transmitting and receiving map data and the function of managing the map data are added to an existing ECU, the introduction cost of the system can be reduced.
  • a system for downloading map data from the server 3 and a system for uploading probe data to the server 3 may be separated.
  • the multimedia ECU 120 acquires the image recognition result and the vehicle information from the front camera module 90, packages them as probe data, and uploads them to the server 3 via the communication module 50. That is, the multimedia ECU 120 controls uploading of the probe data to the server 3.
  • locator 100 downloads map data from server 3 via communication module 50 and sequentially provides front map data to front camera module 90 via driving support device 130. That is, the locator 100 controls the download of the map.
  • the driving support device 130 sequentially provides the front map data provided from the locator 100 to the front camera module 90.
  • the driving support device 130 may cause the vehicle to travel automatically for a predetermined time / predetermined distance using the front map data in place of the front camera module 90. According to the above configuration, it is possible to enhance robustness against a system abnormality.
  • the disclosure in this specification and drawings is not limited to the illustrated embodiment.
  • the disclosure includes the illustrated embodiments and variations thereon based on those skilled in the art.
  • the disclosure is not limited to the combination of parts and / or elements shown in the embodiments.
  • the disclosure can be implemented in various combinations.
  • the disclosure may have additional parts that can be added to the embodiments.
  • the disclosure encompasses embodiments that omit parts and / or elements.
  • the disclosure encompasses the replacement or combination of parts and / or elements between one embodiment and another.
  • the disclosed technical scope is not limited to the description of the embodiments. The disclosed technical scope is indicated by the description of the claims, and should be construed to include all modifications within the meaning and scope equivalent to the description of the claims.
  • control unit and the method thereof according to the present disclosure may be realized by a dedicated computer programmed to execute one or a plurality of functions embodied by a computer program.
  • device and the method described in the present disclosure may be realized by a dedicated hardware logic circuit.
  • apparatus and the technique described in the present disclosure may be implemented by one or more dedicated computers configured by a combination of a processor executing a computer program and one or more hardware logic circuits.
  • the computer program may be stored in a computer-readable non-transitional tangible recording medium as instructions to be executed by a computer.
  • control unit refers to various processors such as the main processor 40, the image processor 20, the server processor 31, the camera processor 92, the locator processor 101, and the multimedia processor 121.
  • the means and/or functions provided by the various processors described above can be provided by software recorded in a tangible memory device and a computer that executes the software, by software only, by hardware only, or by a combination thereof.
  • Some or all of the functions of the communication microcomputer 123 may be realized as hardware.
  • a mode in which a certain function is realized as hardware includes a mode in which one or a plurality of ICs is used to realize that function.
  • as the processor, a CPU, an MPU (Micro Processor Unit), a GPU (Graphics Processing Unit), a DFP (Data Flow Processor), or the like can be employed.
  • one device (for example, the front camera module 90) may be realized by combining a plurality of types of processors such as a CPU, an MPU, a GPU, and a DFP.
  • a part of the functions to be provided by the main processor 40 may be realized using an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
  • SYMBOLS: 1 ... Map system, 2 ... In-vehicle system (vehicle-side apparatus, vehicle control device), 3 ... Server, 31 ... Server processor, 10 ... Imaging device (camera, peripheral monitoring sensor), 20 ... Image processor, 30 ... State acquisition part (sensor), 40 ... Main processor, 50 ... Communication module, 60 ... HMI, 62 ... Road segment, 70 ... Actuator, 80 ... Memory, 90 ... Front camera module, 100 ... Locator, 110 ... Ethernet switch, 120 ... Multimedia ECU, 130 ... Driving support device


Abstract

According to the invention, a map system for autonomous navigation of a vehicle along a road segment comprises at least one processor and a server in which map information is stored. The processor downloads the map information from the server and determines the location of the vehicle on the basis of the map information. The map information is stored in the server for each divided area in the form of a rectangular map tile, and is downloaded to the vehicle tile by tile. When the destination of the vehicle is not set, the processor downloads, as a first tile, the map tile corresponding to the area in which the vehicle is initially present, then divides the first tile into four sub-tiles and takes, as objects to be downloaded, the three map tiles adjacent to the sub-tile to which the vehicle belongs.
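The tile-selection strategy summarized in the abstract (divide the vehicle's current tile into four sub-tiles, then prefetch the three tiles adjacent to the occupied sub-tile) can be sketched as follows. The integer tile grid, the axis orientation, and the fractional in-tile position are assumed coordinate conventions introduced for illustration.

```python
def tiles_to_download(tile_x: int, tile_y: int, frac_x: float, frac_y: float):
    """Given the vehicle's current tile (tile_x, tile_y) and its
    fractional position inside that tile (each frac in [0, 1)),
    return the three neighbouring tiles adjacent to the sub-tile
    (quadrant) the vehicle occupies.

    Grid layout and axis orientation are illustrative assumptions.
    """
    dx = 1 if frac_x >= 0.5 else -1  # right half -> east neighbour, else west
    dy = 1 if frac_y >= 0.5 else -1  # upper half -> north neighbour, else south
    return [
        (tile_x + dx, tile_y),       # lateral neighbour
        (tile_x, tile_y + dy),       # vertical neighbour
        (tile_x + dx, tile_y + dy),  # diagonal neighbour
    ]
```

The effect of this selection is that, with no destination set, the vehicle always holds map data for the directions it can most plausibly move into next, while only three extra tiles are downloaded.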
PCT/JP2019/033210 2018-08-31 2019-08-26 Dispositif côté véhicule, procédé et support de stockage WO2020045324A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112019004323.4T DE112019004323T5 (de) 2018-08-31 2019-08-26 Fahrzeugseitige vorrichtung, verfahren und speichermedium
US17/185,694 US11835361B2 (en) 2018-08-31 2021-02-25 Vehicle-side device, method and non-transitory computer-readable storage medium for autonomously driving vehicle

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018-162471 2018-08-31
JP2018162471 2018-08-31
JP2019143135A JP7251394B2 (ja) 2018-08-31 2019-08-02 車両側装置、方法および記憶媒体
JP2019-143135 2019-08-02

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/185,694 Continuation US11835361B2 (en) 2018-08-31 2021-02-25 Vehicle-side device, method and non-transitory computer-readable storage medium for autonomously driving vehicle

Publications (1)

Publication Number Publication Date
WO2020045324A1 true WO2020045324A1 (fr) 2020-03-05

Family

ID=69644375

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/033210 WO2020045324A1 (fr) 2018-08-31 2019-08-26 Dispositif côté véhicule, procédé et support de stockage

Country Status (1)

Country Link
WO (1) WO2020045324A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866739A (zh) * 2020-06-30 2020-10-30 通号城市轨道交通技术有限公司 适用于cbtc系统的电子地图实时传输方法及系统
CN111953755A (zh) * 2020-07-31 2020-11-17 中国第一汽车股份有限公司 一种地图存储方法、装置、车辆及计算机存储介质
US11410332B2 (en) 2018-08-31 2022-08-09 Denso Corporation Map system, method and non-transitory computer-readable storage medium for autonomously navigating vehicle
US11821750B2 (en) 2018-08-31 2023-11-21 Denso Corporation Map generation system, server, vehicle-side device, method, and non-transitory computer-readable storage medium for autonomously driving vehicle
US11840254B2 (en) 2018-08-31 2023-12-12 Denso Corporation Vehicle control device, method and non-transitory computer-readable storage medium for automonously driving vehicle
US11920948B2 (en) 2018-08-31 2024-03-05 Denso Corporation Vehicle-side device, method, and non-transitory computer-readable storage medium for uploading map data
US11979792B2 (en) 2018-08-31 2024-05-07 Denso Corporation Method for uploading probe data

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011117740A (ja) * 2009-11-30 2011-06-16 Fujitsu Ten Ltd ナビゲーションシステムおよび車載装置
US20170138743A1 (en) * 2015-01-19 2017-05-18 Here Global B.V. Updating Navigational Map Data
JP2017090548A (ja) * 2015-11-04 2017-05-25 トヨタ自動車株式会社 地図更新判定システム
JP2017111158A (ja) * 2017-01-31 2017-06-22 株式会社ゼンリン ナビゲーションシステム
JP2017125869A (ja) * 2016-01-12 2017-07-20 株式会社トヨタマップマスター 地図更新装置、地図更新方法、コンピュータプログラム及びコンピュータプログラムを記録した記録媒体
JP2018004791A (ja) * 2016-06-29 2018-01-11 アイシン・エィ・ダブリュ株式会社 サーバ装置、通信端末、情報配信システム及びコンピュータプログラム



Similar Documents

Publication Publication Date Title
JP7251394B2 (ja) 車両側装置、方法および記憶媒体
US11410332B2 (en) Map system, method and non-transitory computer-readable storage medium for autonomously navigating vehicle
JP7167876B2 (ja) 地図生成システム、サーバ、方法
JP7067536B2 (ja) 車両制御装置、方法および記憶媒体
JP7156206B2 (ja) 地図システム、車両側装置、およびプログラム
JP7147712B2 (ja) 車両側装置、方法および記憶媒体
WO2020045323A1 (fr) Système de génération de carte, serveur, dispositif côté véhicule, procédé, et support de stockage
US11846522B2 (en) Warning polygons for weather from vehicle sensor data
WO2020045324A1 (fr) Dispositif côté véhicule, procédé et support de stockage
WO2020045318A1 (fr) Dispositif côté véhicule, serveur, procédé et support de stockage
WO2020045322A1 (fr) Système de carte, dispositif côté véhicule, procédé et support d'informations
WO2020045319A1 (fr) Dispositif, procédé de commande de véhicule, et support de stockage
JP7414150B2 (ja) 地図サーバ、地図配信方法
US20230175863A1 (en) Traffic signal recognition device, traffic signal recognition method and vehicle control device
WO2022009900A1 (fr) Dispositif de conduite automatisée et procédé de commande de véhicule
CN115769050A (zh) 本车位置估计装置、行驶控制装置
JP7315101B2 (ja) 障害物情報管理装置、障害物情報管理方法、車両用装置
WO2022009847A1 (fr) Dispositif et procédé de détermination d'environnement défavorable
JP7247491B2 (ja) 自律的ナビゲーションのための地図システム、方法および記憶媒体
US20230256992A1 (en) Vehicle control method and vehicular device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19856277

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19856277

Country of ref document: EP

Kind code of ref document: A1