WO2022024569A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program Download PDF

Info

Publication number
WO2022024569A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
gradient
unit
map
vehicle
Prior art date
Application number
PCT/JP2021/022629
Other languages
French (fr)
Japanese (ja)
Inventor
Rikuya Ezoe
Tatsuya Ishikawa
Ryo Watanabe
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022024569A1 publication Critical patent/WO2022024569A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram

Definitions

  • This technology relates to an information processing device, an information processing method, and a program, and makes it possible to obtain map information in which correct gradient information is set.
  • Conventionally, a sensor such as LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) is used to measure the surrounding environment and generate map information. Further, in Patent Document 1, a cleaning robot is controlled, using map information in which gradient information is set, so as not to enter a gradient area in which it cannot travel.
  • However, the map information used in Patent Document 1 is in-home map information associated with gradient information indicating the gradients and the like over which cleaning operation is possible; it covers only a limited, narrow area and is not wide-area map information covering the outdoors and the like. Further, easily available map information covering a wide area generally indicates positions in a two-dimensional plane, and gradient information is not provided.
  • the purpose of this technique is to provide an information processing device, an information processing method, and a program capable of obtaining map information in which correct gradient information is set.
  • the information processing device includes a map information processing unit that corrects the gradient information set in the map information based on the map information in which the gradient information is set and the traveling information acquired by the moving device.
  • For example, the map information processing unit corrects the gradient information set in the map information based on the map information in which the gradient information is set and the traveling information acquired by the moving device. Further, when gradient information is not set in the map information, the map information processing unit generates map information by setting the gradient information based on the map information in which the gradient information is not set and the traveling information acquired by the moving device.
  • the traveling information includes the sensor information acquired by the sensor provided in the moving device, and the gradient information is calculated based on the sensor information.
  • The sensor information is information that detects at least one of the translational motion and the rotational motion of the moving device. For example, using an inertial measurement unit including at least one of an acceleration sensor and an angular velocity sensor, translational motion is detected by the acceleration sensor and rotational motion by the angular velocity sensor.
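  • As a minimal Python sketch of how a road gradient could be derived from such sensor information (the axis convention of x forward, y left, z up and all names are illustrative assumptions, not taken from this publication), the pitch of a quasi-static device follows from the projection of gravity onto the forward axis:

    import math

    def pitch_from_accel(ax: float, ay: float, az: float) -> float:
        # Quasi-static assumption: the measured specific force is dominated
        # by gravity, so the forward component reveals the road pitch.
        return math.atan2(ax, math.sqrt(ay * ay + az * az))

    # On a 5-degree upslope the forward axis sees g*sin(5 deg):
    g, theta = 9.81, math.radians(5.0)
    ax, ay, az = g * math.sin(theta), 0.0, g * math.cos(theta)
    print(math.degrees(pitch_from_accel(ax, ay, az)))  # ~5.0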
  • The map information processing unit uses the gradient detection information generated based on the travel information acquired by the moving device to correct the gradient information at the position of the map information corresponding to the moving position at which the gradient detection information was generated. For example, the map information processing unit corrects the gradient information at the position of the map information corresponding to the moving position by using gradient information obtained by filtering the newly generated gradient detection information together with gradient detection information already generated at the same moving position.
  • The route planning unit creates a route plan to the destination based on the map information generated or corrected by the map information processing unit.
  • the route planning unit creates a route plan using the gradient information set in the map information.
  • the mobile device includes a self-position estimation unit that estimates the self-position using the gradient information set in the map information.
  • The second aspect of this technology is an information processing method that includes correcting, by a map information processing unit, the gradient information set in map information based on the map information in which the gradient information is set and the traveling information acquired by a moving device.
  • the third aspect of this technology is a program that causes a computer to execute processing using information acquired by a mobile device.
  • The program of the present technology is, for example, a program that can be provided in a computer-readable format to a general-purpose computer capable of executing various program codes, via a storage medium such as an optical disk, a magnetic disk, or a semiconductor memory, or via a communication medium such as a network. By providing the program in a computer-readable format, processing according to the program is realized on the computer.
  • FIG. 1 illustrates a configuration of a system using the information processing apparatus of the present technology.
  • the system 10 has a mobile device 20 and a server 30, and the server 30 is provided with the function of an information processing device.
  • the mobile device 20 performs a movement operation based on the route plan supplied from the server 30.
  • the server 30 corrects the gradient information set in the map information based on the map information in which the gradient information is set and the traveling information acquired by the moving device 20.
  • Alternatively, when gradient information is not set, the server 30 generates map information by setting the gradient information based on the map information in which the gradient information is not set and the traveling information acquired by the moving device 20. Further, the server 30 creates a route plan by using map information (hereinafter referred to as "route search map information") in which the gradient information of the road surface (also referred to as road surface gradient information) is set.
  • The moving device 20 has a wheel odometry 21, a gradient detection unit 22, a self-position estimation unit 23, a communication unit 24, a route tracking unit 25, a drive control unit 26, and a drive unit 27.
  • the wheel odometry 21 measures the amount of rotation of the wheels (rotational speed and angle of rotation) and calculates the amount of movement of the moving device 20.
  • the wheel odometry 21 outputs the movement amount information indicating the calculated movement amount to the self-position estimation unit 23 and the communication unit 24.
  • The amount of movement is not limited to being calculated from the amount of rotation of the wheels; the amount of rotation of a drive gear or the like that rotates the wheels may be used.
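  • As a rough illustration of this calculation (straight-line motion and a known wheel radius are assumed; the function name is invented for illustration), the movement amount can be derived from the measured wheel rotation as follows:

    import math

    def movement_from_rotation(rotation_rad: float, wheel_radius_m: float,
                               heading_rad: float) -> tuple[float, float, float]:
        # Arc length rolled by the wheel = rotation angle * wheel radius.
        distance = rotation_rad * wheel_radius_m
        dx = distance * math.cos(heading_rad)  # displacement on the map plane
        dy = distance * math.sin(heading_rad)
        return dx, dy, distance

    # Two full wheel turns with a 0.10 m radius wheel, heading along +x:
    dx, dy, d = movement_from_rotation(4 * math.pi, 0.10, 0.0)
    print(f"moved {d:.3f} m -> dx={dx:.3f} m, dy={dy:.3f} m")  # ~1.257 m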
  • the gradient detection unit 22 detects the gradient of the road surface.
  • the gradient detection unit 22 is configured by using, for example, an inertial measurement unit (IMU), and detects the gradient of the road surface based on the sensor information generated by the IMU.
  • The sensor information is information that detects at least one of the translational motion and the rotational motion of the moving device; the IMU includes at least an acceleration sensor and an angular velocity sensor, and detects translational motion with the acceleration sensor and rotational motion with the angular velocity sensor.
  • the gradient detection unit 22 generates gradient detection information indicating the gradient of the road surface on which the moving device 20 is traveling from at least one of the detected translational motion or rotational motion.
  • The gradient detection unit 22 generates gradient detection information indicating a gradient calculated by using, for example, the methods described in Japanese Patent Laid-Open No. 2003-09945 and Japanese Patent Laid-Open No. 2013-044562, and outputs it to the communication unit 24.
  • The self-position estimation unit 23 estimates the self-position based on the movement amount information calculated by the wheel odometry 21. Further, the self-position estimation unit 23 corrects the estimated self-position based on the road surface gradient information received from the server 30 by the communication unit 24. The self-position estimation unit 23 outputs the result of self-position estimation using the gradient information, that is, corrected self-position information indicating the self-position corrected based on the road surface gradient information, to the communication unit 24 and the route tracking unit 25. Further, when the road surface gradient information is not obtained, the self-position estimation unit 23 may request the road surface gradient information (or the route search map information) from the server 30 or have it updated.
  • the communication unit 24 performs wireless communication with the server 30.
  • The wireless communication may include cellular communication using any of LTE, WCDMA (registered trademark), 5G, and the like, or may include short-range wireless communication using any of Wi-Fi, Bluetooth (registered trademark), and the like.
  • The communication unit 24 wirelessly communicates with the server 30 and transmits to the server 30, for example, the movement amount information generated by the wheel odometry 21, the gradient detection information generated by the gradient detection unit 22, and the corrected self-position information generated by the self-position estimation unit 23. Further, the communication unit 24 receives the road surface gradient information (or the route search map information) and the route plan transmitted from the server 30, outputs the road surface gradient information to the self-position estimation unit 23, and outputs the route plan, or the route plan and the route search map information, to the route tracking unit 25.
  • The route tracking unit 25 causes the moving device 20 to follow the route indicated by the route plan supplied from the server 30.
  • The route tracking unit 25 recognizes the surrounding environment based on the route plan acquired from the server 30 and the corrected self-position information generated by the self-position estimation unit 23, and plans a trajectory to the destination so that the moving device 20 can move while avoiding collisions with obstacles and the like.
  • The route tracking unit 25 outputs the planned trajectory to the drive control unit 26.
  • The surrounding environment may be recognized by using the route search map information acquired from the server 30, or by using an external sensor provided in the mobile device 20 (for example, a range sensor such as a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) or ToF (Time of Flight) sensor, a stereo camera, or the like).
  • The drive control unit 26 generates a drive signal for moving along the trajectory planned by the route tracking unit 25 and outputs the drive signal to the drive unit 27.
  • the drive unit 27 is configured by using wheels, a drive source, and the like, and drives the wheels with the drive source based on the drive signal from the drive control unit 26 to move the moving device 20.
  • the server 30 has a communication unit 31, a map information processing unit 32, and a route planning unit 33.
  • the communication unit 31 is configured to be able to perform wireless communication with the communication unit 24 of the mobile device 20.
  • the wireless communication may include cellular communication as described above, or may include short-range wireless communication.
  • the communication unit 31 wirelessly communicates with the mobile device 20 and receives information generated by the mobile device 20, such as movement amount information and gradient detection information.
  • the communication unit 31 outputs the received movement amount information, the corrected self-position information, and the gradient detection information to the map information processing unit 32. Further, the communication unit 31 transmits the route plan or the like created by the route planning unit 33 to the mobile device 20.
  • The map information processing unit 32 generates route search map information in which road surface gradient information is set, or corrects the road surface gradient information of the route search map information, based on map information acquired in advance and the gradient detection information generated by the moving device 20.
  • the map information processing unit 32 has an information storage unit 321 and a road surface gradient information generation unit 322.
  • The information storage unit 321 stores easily available two-dimensional plane map information acquired, for example, from an external device or the like. Further, when route search map information in which road surface gradient information is set has been generated from the two-dimensional plane map information, the information storage unit 321 stores the generated route search map information. Further, the information storage unit 321 stores the gradient detection information generated by devices other than the mobile device 20 and the gradient detection information generated in the past by the mobile device 20, together with position information indicating the gradient detection positions.
  • The road surface gradient information generation unit 322 corrects the road surface gradient information set in the route search map information by using the gradient detection information generated by the moving device 20. Further, when the route search map information has not been generated and no gradient detection information is stored in the information storage unit 321, the road surface gradient information generation unit 322 generates the route search map information by setting the gradient detection information generated by the moving device 20 in the map information as the road surface gradient information.
  • When generating the road surface gradient information, the road surface gradient information generation unit 322 averages the gradient detection information over a predetermined traveling period, for example, in order to reduce the influence of vibration of the moving device 20 and noise in the gradient detection unit 22. In this case, the road surface gradient information has a lowered resolution in the traveling direction. Therefore, the road surface gradient information generation unit 322 performs filter processing or statistical processing (hereinafter collectively referred to as "filter processing") using the gradient detection information generated by the moving device 20 and the gradient detection information of the same moving position that has already been generated and stored in the information storage unit 321, to generate road surface gradient information that is less affected by vibration of the moving device 20 and noise in the gradient detection unit 22 without lowering the resolution in the traveling direction.
  • The moving position is corrected in the same manner as in the self-position estimation unit 23 of the moving device 20, using the movement amount information acquired from the moving device 20 and the gradient detection information stored in the information storage unit 321 for the moving position indicated by the movement amount information, and the filter processing is performed using the gradient detection information of the corrected self-position.
  • The road surface gradient information generation unit 322 may use the gradient detection information of the position indicated by the corrected self-position information generated by the self-position estimation unit 23 of the mobile device 20.
  • The road surface gradient information generation unit 322 may generate road surface gradient information by performing filter processing on a plurality of pieces of gradient detection information using, for example, a Kalman filter or a particle filter, or may statistically process the plurality of pieces of gradient detection information to calculate the mode, median, or average and use the calculation result as the road surface gradient information.
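  • A minimal Python sketch of these two options (the publication does not specify an implementation; values and names below are illustrative) could fuse the stored detections for one moving position with a robust statistic, or apply a scalar Kalman-style update per new detection:

    from statistics import mean, median

    def fuse_gradients(observations_deg: list[float], mode: str = "median") -> float:
        # Statistical processing: the median (or average; the mode is
        # analogous) of the detections stored for one moving position
        # becomes its road surface gradient.
        return median(observations_deg) if mode == "median" else mean(observations_deg)

    def kalman_update(est: float, var: float, meas: float, meas_var: float):
        # One scalar Kalman-filter update, as the filter-processing option.
        k = var / (var + meas_var)                     # Kalman gain
        return est + k * (meas - est), (1.0 - k) * var

    # The median stays near 5 degrees despite one noisy 9-degree detection:
    print(fuse_gradients([4.8, 5.3, 5.1, 9.0, 5.0]))   # 5.1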
  • The road surface gradient information generation unit 322 sets the generated road surface gradient information in the map information acquired from an external device or the like, or corrects the road surface gradient information set in the route search map information with the road surface gradient information after the filter processing. Further, the road surface gradient information generation unit 322 updates the route search map information stored in the information storage unit 321 to the route search map information in which the road surface gradient information has been corrected. Further, the road surface gradient information generation unit 322 outputs the road surface gradient information to the transmission source of the gradient detection information.
  • The route planning unit 33 creates a route plan from the corrected self-position indicated by the corrected self-position information generated by the moving device 20 to the destination, based on the road surface gradient information indicated by the route search map information whose road surface gradient information is the latest. Further, the route planning unit 33 creates a route plan showing the optimum route by using the road surface gradient information set in the route search map information. The route planning unit 33 outputs the created route plan from the communication unit 31 to the mobile device 20.
  • FIG. 2 is a sequence diagram illustrating the operation of the system.
  • In step ST1, the server 30 acquires the map information provided by an external device or the like and stores it in the information storage unit 321 of the map information processing unit 32.
  • In step ST2, the moving device 20 outputs the movement amount information indicating the movement amount calculated by the wheel odometry 21 to the server 30.
  • In step ST3, the server 30 outputs gradient detection information to the road surface gradient information generation unit 322. The information storage unit 321 of the map information processing unit 32 extracts, from the stored gradient detection information, the gradient detection information at the position equal to the moving position based on the movement amount information supplied from the moving device 20, and outputs it to the road surface gradient information generation unit 322.
  • In step ST4, the mobile device 20 outputs the gradient detection information to the server 30. The moving device 20 outputs the gradient detection information generated by the gradient detection unit 22 to the server 30, associated with the position based on the movement amount information.
  • The gradient detection unit 22 of the moving device 20 generates gradient detection information according to the posture of the moving device based on, for example, the acceleration and angular velocity measured by the IMU. If the posture of the moving device changes due to vibration of the moving device or the loading condition of articles, the gradient detection information contains an error with respect to the actual gradient of the road surface.
  • In step ST5, the server 30 outputs the road surface gradient information to the mobile device 20.
  • The road surface gradient information generation unit 322 of the map information processing unit 32 performs filter processing using the gradient detection information supplied from the information storage unit 321 in step ST3 and the gradient detection information supplied from the moving device 20 in step ST4, generates road surface gradient information indicating the gradient of the road surface at the moving position of the moving device 20, and outputs it to the moving device 20.
  • the road surface gradient information generation unit 322 corrects the road surface gradient information of the route search map information stored in the information storage unit 321 by using the generated road surface gradient information.
  • When the route search map information has not been generated, the road surface gradient information generation unit 322 sets the road surface gradient information in the map information acquired from an external device or the like and stores the resulting route search map information in the information storage unit 321.
  • FIG. 3 is a diagram for explaining the operation of creating route search map information.
  • (A) of FIG. 3 exemplifies a state in which the moving device 20 is moving on a road surface having a gradient θ of θa, and (B) of FIG. 3 exemplifies the gradient detection information generated by the gradient detection unit 22 of the moving device 20 at the moving position Pm.
  • (C) of FIG. 3 exemplifies the gradient detection information already acquired at the same moving position by the moving device 20 or another moving device.
  • The road surface gradient information generation unit 322 performs filter processing using the gradient detection information shown in (B) of FIG. 3 and the gradient detection information of the same moving position Pm shown in (C) of FIG. 3, and generates road surface gradient information for each position as shown in (D) of FIG. 3. Note that (E) of FIG. 3 shows the distribution of the gradients indicated by the gradient detection information at the moving position Pm.
  • the road surface gradient information generation unit 322 adds new gradient detection information and performs filtering processing every time new gradient detection information is supplied, and corrects the road surface gradient information.
  • the information storage unit 321 sets the road surface gradient information generated by the road surface gradient information generation unit 322 to the movement position Pm, and generates the route search map information.
  • In step ST6, the moving device 20 outputs the movement amount information to the self-position estimation unit 23.
  • the wheel odometry 21 of the moving device 20 outputs the movement amount information indicating the calculated movement amount to the self-position estimation unit 23.
  • In step ST7, the mobile device 20 outputs the corrected self-position information to the server 30.
  • The self-position estimation unit 23 calculates the movement amount on the map based on the road surface gradient information supplied from the server 30 in step ST5 and the movement amount information supplied from the wheel odometry 21 in step ST6, and calculates the self-position excluding the influence of the road surface gradient. Further, the self-position estimation unit 23 outputs corrected self-position information indicating the self-position excluding the influence of the road surface gradient to the server 30.
  • In step ST8, the server 30 outputs the corrected self-position information and the route search map information to the route planning unit 33.
  • The information storage unit 321 of the map information processing unit 32 extracts from the route search map information, based on the corrected self-position information supplied from the mobile device 20, the map information of a predetermined range including the position indicated by the corrected self-position information and the destination, and outputs it to the route planning unit 33.
  • In step ST9, the server 30 outputs the route plan to the mobile device 20.
  • The route planning unit 33 of the server 30 generates a route plan indicating a route from the current position of the mobile device indicated by the corrected self-position information to the destination. Further, the route planning unit 33 generates a route plan showing the optimum route by using not only the distance on the map from the current position to the destination but also the road surface gradient information included in the route search map information. For example, the route planning unit 33 calculates a cost value for each route using a distance cost Ca that is set to increase as the distance increases and a gradient cost Cb that is set to increase as the gradient becomes steeper, and selects the route with the lowest cost value.
  • For example, the route planning unit 33 determines candidate routes to the destination from the travelable routes based on the route search map information.
  • The route planning unit 33 then calculates the cost values CR-1 to CR-n of the candidate routes based on the cost function f shown in equation (1).
  • The cost Ce indicates, unlike the distance and the road surface gradient themselves, the cost of factors related to the road surface gradient (for example, ease of operation control and the energy required for movement), and need not be included in the calculation of the cost values CR-1 to CR-n.
  • CR-x = f(Ca-x, Cb-x, Ce-x) ... (1)
  • The route planning unit 33 selects the minimum cost value from the cost values CR-1 to CR-n, and creates a route plan for moving to the destination along the route corresponding to the selected cost value.
  • FIG. 4 shows an example of creating a route plan.
  • FIG. 4A exemplifies, for example, a route R-a traveling over a hill and a route R-b avoiding a hill as a route from the movement start position Ps to the destination Pg.
  • FIG. 4B shows the relationship between the travel distance on the map from the movement start position Ps to the destination Pg and the gradient of the road surface when the route R-a is selected, and FIG. 4C shows the same relationship when the route R-b is selected.
  • When map information that does not include road surface gradient information is used, the server 30 creates a route plan that follows route R-a, which has the shorter distance to the destination. However, when route search map information including road surface gradient information is used as in the present technology, the server 30 can determine that the actual travel distance may become longer than the distance indicated by map information without road surface gradient information, and how much energy will be required to move to the destination.
  • The server 30 calculates the cost values based on equation (1), and when the cost value of route R-b is smaller than that of route R-a, it creates a route plan following route R-b.
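  • A minimal numeric sketch of this comparison (equation (1) leaves the cost function f unspecified, so the linear weights and distances below are invented for illustration) shows how the flatter detour R-b can win despite its longer distance:

    def route_cost(distance_m: float, gradient_deg: float, ce: float = 0.0,
                   w_dist: float = 1.0, w_grad: float = 50.0) -> float:
        # CR = f(Ca, Cb, Ce): Ca grows with distance, Cb grows with
        # steepness, Ce carries other gradient-related factors (optional).
        return w_dist * distance_m + w_grad * gradient_deg + ce

    costs = {"R-a (over the hill)": route_cost(800.0, 8.0),     # 1200.0
             "R-b (around the hill)": route_cost(1100.0, 1.0)}  # 1150.0
    print(min(costs, key=costs.get))  # R-b has the lower cost value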
  • In this way, the route planning unit 33 generates a route plan indicating an optimum route, for example, a route with few ups and downs that can be traveled easily, using the current position and destination of the moving device 20 and the road surface gradient information included in the route search map information, and outputs it to the mobile device 20.
  • In step ST10, the moving device 20 outputs the corrected self-position information to the route tracking unit 25.
  • The self-position estimation unit 23 of the moving device 20 outputs the corrected self-position information excluding the influence of the road surface gradient to the route tracking unit 25.
  • FIG. 5 is a diagram for explaining the corrected self-position information.
  • When, as shown in FIG. 5A, the road surface SF is an inclined surface having a gradient θa and the movement amount calculated by the wheel odometry 21 when the moving device 20 moves on the road surface SF is "ML", the self-position estimated based on the movement amount "ML" is the position P1 on the map.
  • FIG. 5C exemplifies the height difference and the road surface gradient information based on the position MPa of the traveling path (position MPa to position MPb) on the map.
  • Since the moving device 20 can correct the self-position error based on the movement amount information generated by the wheel odometry 21 and the road surface gradient information of the road surface on which it has moved, the position of the moving device 20 on the map can be detected accurately.
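  • A minimal sketch of this correction (names are illustrative; the geometry follows directly from FIG. 5): the distance ML measured by wheel odometry runs along the slope, so the on-map displacement is ML·cos(θ) and the height gained is ML·sin(θ):

    import math

    def project_to_map(odometry_m: float, gradient_deg: float) -> tuple[float, float]:
        # Wheel odometry measures the distance along the inclined surface;
        # the horizontal (on-map) displacement is shorter by cos(theta).
        theta = math.radians(gradient_deg)
        return odometry_m * math.cos(theta), odometry_m * math.sin(theta)

    on_map, climb = project_to_map(100.0, 10.0)
    print(f"on-map: {on_map:.2f} m, height gained: {climb:.2f} m")
    # on-map: 98.48 m, height gained: 17.36 m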
  • In step ST11, the mobile device 20 outputs control information to the drive control unit 26.
  • The route tracking unit 25 of the mobile device 20 generates control information for performing movement control so that the moving device 20 follows the route indicated by the route plan, based on the current position indicated by the corrected self-position information supplied from the self-position estimation unit 23 and the route plan supplied from the server 30, and outputs the control information to the drive control unit 26.
  • In this way, the server 30 can accurately calculate the gradient of the road surface by using the gradient detection information generated by the mobile device, and can easily generate route search map information including correct road surface gradient information. Further, the server 30 can create an optimum route plan that takes the road surface gradient into consideration by using the route search map information including the road surface gradient information. Further, since the road surface gradient information is supplied to the mobile device 20, the mobile device 20 can detect its own position more accurately by using the road surface gradient information of the traveled road surface than when the road surface gradient information is not available.
  • the application example illustrates a case where the mobile device 20 is a vehicle.
  • FIG. 6 is a diagram showing a configuration example of a vehicle control system 111, which is an example of a mobile device control system to which the present technology is applied.
  • the vehicle control system 111 is provided in the vehicle 100 and performs processing related to driving support and automatic driving of the vehicle 100.
  • The vehicle control system 111 includes a vehicle control ECU (Electronic Control Unit) 121, a communication unit 122, a map information storage unit 123, a position information receiving unit 124, an external recognition sensor 125, an in-vehicle sensor 126, a vehicle sensor 127, a recording unit 128, a driving support / automatic driving control unit 129, a DMS (Driver Monitoring System) 130, an HMI (Human Machine Interface) 131, and a vehicle control unit 132.
  • The vehicle control ECU 121, communication unit 122, map information storage unit 123, position information receiving unit 124, external recognition sensor 125, in-vehicle sensor 126, vehicle sensor 127, recording unit 128, driving support / automatic driving control unit 129, driver monitoring system (DMS) 130, human machine interface (HMI) 131, and vehicle control unit 132 are connected so as to be able to communicate with each other via the communication network 141.
  • The communication network 141 is composed of an in-vehicle communication network, a bus, and the like conforming to digital bidirectional communication standards such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), and Ethernet (registered trademark).
  • The communication network 141 may be used selectively depending on the type of information to be communicated; for example, CAN is applied to information related to vehicle control, and Ethernet is applied to large-capacity information. Note that each part of the vehicle control system 111 may in some cases be directly connected without going through the communication network 141, using wireless communication intended for relatively short-distance communication, such as near field communication (NFC (Near Field Communication)) or Bluetooth (registered trademark).
  • Hereinafter, when each part of the vehicle control system 111 communicates via the communication network 141, the description of the communication network 141 shall be omitted. For example, when the vehicle control ECU 121 and the communication unit 122 communicate with each other via the communication network 141, it is simply described that the vehicle control ECU 121 and the communication unit 122 communicate with each other.
  • the vehicle control ECU 121 is composed of various processors such as a CPU (Central Processing Unit) and an MPU (Micro Processing Unit), for example.
  • the vehicle control ECU 121 controls the functions of the entire vehicle control system 111 or a part of the vehicle control system 111.
  • the communication unit 122 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, etc., and transmits and receives various information. At this time, the communication unit 122 can perform communication using a plurality of communication methods.
  • First, the communication with the outside of the vehicle that can be executed by the communication unit 122 will be briefly described.
  • The communication unit 122 communicates with a server existing on an external network (hereinafter referred to as an external server) via a base station or an access point by a wireless communication method such as 5G (5th generation mobile communication system), LTE (Long Term Evolution), or DSRC (Dedicated Short Range Communications).
  • the external network with which the communication unit 122 communicates is, for example, the Internet, a cloud network, a network peculiar to a business operator, or the like.
  • The communication method by which the communication unit 122 communicates with the external network is not particularly limited as long as it is a wireless communication method capable of digital bidirectional communication at a communication speed equal to or higher than a predetermined value and over a distance equal to or longer than a predetermined distance.
  • the communication unit 122 can communicate with a terminal existing in the vicinity of the own vehicle by using P2P (Peer To Peer) technology.
  • Terminals that exist in the vicinity of the own vehicle are, for example, terminals attached to mobile bodies that move at relatively low speeds, such as pedestrians and bicycles, terminals fixedly installed in stores and the like, or MTC (Machine Type Communication) terminals.
  • the communication unit 122 can also perform V2X communication.
  • V2X communication is communication between the vehicle and others, for example, vehicle-to-vehicle (Vehicle to Vehicle) communication with other vehicles, road-to-vehicle (Vehicle to Infrastructure) communication with roadside devices, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication with terminals owned by pedestrians.
  • the communication unit 122 can receive, for example, a program for updating the software that controls the operation of the vehicle control system 111 from the outside (Over The Air).
  • the communication unit 122 can further receive map information, traffic information, information around the vehicle 100, and the like from the outside. Further, for example, the communication unit 122 can transmit information about the vehicle 100, information around the vehicle 100, and the like to the outside.
  • The information about the vehicle 100 that the communication unit 122 transmits to the outside includes, for example, information indicating the state of the vehicle 100 and the recognition results of the recognition unit 173. Further, for example, the communication unit 122 performs communication corresponding to a vehicle emergency call system such as eCall.
  • Next, the communication with the inside of the vehicle that can be executed by the communication unit 122 will be briefly described.
  • the communication unit 122 can communicate with each device in the vehicle by using, for example, wireless communication.
  • For example, the communication unit 122 can perform wireless communication with devices in the vehicle by a communication method capable of digital bidirectional communication at a communication speed equal to or higher than a predetermined value, such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB).
  • the communication unit 122 can also communicate with each device in the vehicle by using wired communication.
  • the communication unit 122 can communicate with each device in the vehicle by wired communication via a cable connected to a connection terminal (not shown).
  • For example, the communication unit 122 can communicate with each device in the vehicle by a wired communication method capable of digital bidirectional communication at a communication speed equal to or higher than a predetermined value, such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link).
  • the device in the vehicle refers to, for example, a device that is not connected to the communication network 141 in the vehicle.
  • As the equipment in the vehicle, for example, mobile equipment and wearable equipment possessed by passengers such as the driver, and information equipment brought into the vehicle and temporarily installed, are assumed.
  • Further, the communication unit 122 receives electromagnetic waves transmitted by a vehicle information and communication system (VICS (registered trademark) (Vehicle Information and Communication System)) such as radio wave beacons, optical beacons, and FM multiplex broadcasting.
  • the map information storage unit 123 stores one or both of the map acquired from the outside and the map created by the vehicle 100.
  • the map information storage unit 123 stores a three-dimensional high-precision map, a global map that is less accurate than the high-precision map and covers a wide area, and the like.
  • High-precision maps are, for example, dynamic maps, point cloud maps, vector maps, etc.
  • the dynamic map is, for example, a map including four layers of dynamic information, quasi-dynamic information, quasi-static information, and static information, and is provided to the vehicle 100 from an external server or the like.
  • the point cloud map is a map composed of point clouds (point cloud information).
  • The vector map refers to a map conforming to ADAS (Advanced Driver Assistance System), in which traffic information such as lanes and signal positions is associated with a point cloud map.
  • The point cloud map and the vector map may be provided, for example, from an external server or the like, or may be created by the vehicle 100 as maps for matching with the local map described later, based on the sensing results of the radar 152, the LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 153, and the like, and stored in the map information storage unit 123. Further, when a high-precision map is provided from an external server or the like, in order to reduce the communication capacity, for example, map information of several hundred meters square regarding the planned route on which the vehicle 100 will travel is acquired from the external server or the like.
  • the position information receiving unit 124 receives a GNSS signal from, for example, a GNSS (Global Navigation Satellite System) satellite, and acquires the position information of the vehicle 100.
  • the received GNSS signal is supplied to the driving support / automatic driving control unit 129.
  • the position information receiving unit 124 is not limited to the method using the GNSS signal, and may acquire the position information by using, for example, a beacon.
  • the external recognition sensor 125 includes various sensors used for recognizing the external situation of the vehicle 100, and supplies sensor information from each sensor to each part of the vehicle control system 111.
  • For example, the external recognition sensor 125 includes a camera 151, a radar 152, a LiDAR 153, and an ultrasonic sensor 154.
  • the external recognition sensor 125 may be configured to include one or more of the camera 151, the radar 152, the LiDAR 153, and the ultrasonic sensor 154.
  • the number of cameras 151, radar 152, LiDAR 153, and ultrasonic sensor 154 is not particularly limited as long as it can be practically installed in the vehicle 100.
  • the type of sensor included in the external recognition sensor 125 is not limited to this example, and the external recognition sensor 125 may include other types of sensors. An example of the sensing area of each sensor included in the external recognition sensor 125 will be described later.
  • the shooting method of the camera 151 is not particularly limited as long as it is a shooting method capable of distance measurement.
  • cameras of various shooting methods such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, and an infrared camera can be applied as needed.
  • the camera 151 may be simply for acquiring a captured image regardless of the distance measurement.
  • the external recognition sensor 125 can be provided with an environment sensor for detecting the environment for the vehicle 100.
  • The environment sensor is a sensor for detecting the environment such as weather, climate, and brightness, and may include various sensors such as a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and an illuminance sensor.
  • Further, the external recognition sensor 125 includes a microphone used for detecting sounds around the vehicle 100 and the positions of sound sources.
  • the in-vehicle sensor 126 includes various sensors for detecting in-vehicle information, and supplies sensor information from each sensor to each part of the vehicle control system 111.
  • the type and number of various sensors included in the in-vehicle sensor 126 are not particularly limited as long as they can be practically installed in the vehicle 100.
  • the in-vehicle sensor 126 can include one or more of a camera, a radar, a seating sensor, a steering wheel sensor, a microphone, and a biosensor.
  • As the camera included in the in-vehicle sensor 126, for example, cameras of various shooting methods capable of distance measurement, such as a ToF camera, a stereo camera, a monocular camera, and an infrared camera, can be used. Not limited to this, the camera included in the in-vehicle sensor 126 may simply acquire captured images regardless of distance measurement.
  • the biosensor included in the in-vehicle sensor 126 is provided on, for example, a seat, a steering wheel, or the like, and detects various biometric information of a occupant such as a driver.
  • the vehicle sensor 127 includes various sensors for detecting the state of the vehicle 100, and supplies sensor information from each sensor to each part of the vehicle control system 111.
  • the types and numbers of various sensors included in the vehicle sensor 127 are not particularly limited as long as they can be practically installed in the vehicle 100.
  • the vehicle sensor 127 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU: Inertial Measurement Unit) that integrates them.
  • the vehicle sensor 127 includes a steering angle sensor for detecting the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor for detecting the operation amount of the accelerator pedal, and a brake sensor for detecting the operation amount of the brake pedal.
  • Further, for example, the vehicle sensor 127 includes a rotation speed sensor that detects the rotation speed of the engine or motor, an air pressure sensor that detects tire pressure, a slip ratio sensor that detects the tire slip ratio, and a wheel speed sensor that detects the rotation speed of the wheels.
  • the vehicle sensor 127 includes a battery sensor that detects the remaining amount and temperature of the battery, and an impact sensor that detects an impact from the outside.
  • the recording unit 128 includes at least one of a non-volatile storage medium and a volatile storage medium, and stores information and programs.
  • The recording unit 128 uses, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory) and a RAM (Random Access Memory), and as the storage medium, a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device can be applied.
  • the recording unit 128 records various programs and information used by each unit of the vehicle control system 111.
  • For example, the recording unit 128 includes an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and records information on the vehicle 100 before and after an event such as an accident, and biometric information acquired by the in-vehicle sensor 126.
  • the driving support / automatic driving control unit 129 controls the driving support and automatic driving of the vehicle 100.
  • The driving support / automatic driving control unit 129 includes an analysis unit 161, an action planning unit 162, and a motion control unit 163.
  • the analysis unit 161 analyzes the vehicle 100 and the surrounding conditions.
  • the analysis unit 161 includes a self-position estimation unit 171, a sensor fusion unit 172, and a recognition unit 173.
  • the self-position estimation unit 171 estimates the self-position of the vehicle 100 based on the sensor information from the external recognition sensor 125 and the high-precision map stored in the map information storage unit 123. For example, the self-position estimation unit 171 generates a local map based on the sensor information from the external recognition sensor 125, and estimates the self-position of the vehicle 100 by matching the local map with the high-precision map.
  • The position of the vehicle 100 is based on, for example, the center of the rear wheel axle.
  • the local map is, for example, a three-dimensional high-precision map created by using a technology such as SLAM (Simultaneous Localization and Mapping), an occupied grid map (OccupancyGridMap), or the like.
  • the three-dimensional high-precision map is, for example, the point cloud map described above.
  • The occupied grid map is a map that divides the three-dimensional or two-dimensional space around the vehicle 100 into grids of a predetermined size and shows the occupied state of objects in grid units.
  • the occupied state of an object is indicated by, for example, the presence or absence of an object and the probability of existence.
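  • A toy sketch of such a grid (cell size and probability values are arbitrary illustrations, not from the publication) stores one occupancy probability per cell:

    import numpy as np

    class OccupancyGrid:
        # 2-D grid; each cell holds an occupancy probability (0.5 = unknown).
        def __init__(self, size_m: float = 20.0, cell_m: float = 0.5):
            self.cell_m = cell_m
            n = int(size_m / cell_m)
            self.prob = np.full((n, n), 0.5)

        def mark(self, x_m: float, y_m: float, occupied: bool) -> None:
            # Convert metric coordinates to grid indices and set the cell.
            i, j = int(y_m / self.cell_m), int(x_m / self.cell_m)
            self.prob[i, j] = 0.9 if occupied else 0.1

    grid = OccupancyGrid()
    grid.mark(3.0, 4.0, occupied=True)   # obstacle observed at (3 m, 4 m)
    print(grid.prob[8, 6])               # 0.9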
  • the local map is also used, for example, in the detection process and the recognition process of the external situation of the vehicle 100 by the recognition unit 173.
  • the self-position estimation unit 171 may estimate the self-position of the vehicle 100 based on the GNSS signal and the sensor information from the vehicle sensor 127.
  • The sensor fusion unit 172 performs sensor fusion processing to obtain new information by combining a plurality of different types of sensor information (for example, image information supplied from the camera 151 and sensor information supplied from the radar 152). Methods for combining different types of sensor information include integration, fusion, and association.
  • the recognition unit 173 executes a detection process for detecting the external situation of the vehicle 100 and a recognition process for recognizing the external situation of the vehicle 100.
  • For example, the recognition unit 173 performs detection processing and recognition processing of the external situation of the vehicle 100 based on information from the external recognition sensor 125, information from the self-position estimation unit 171, and information from the sensor fusion unit 172.
  • the recognition unit 173 performs detection processing, recognition processing, and the like of objects around the vehicle 100.
  • the object detection process is, for example, a process of detecting the presence / absence, size, shape, position, movement, etc. of an object.
  • the object recognition process is, for example, a process of recognizing an attribute such as an object type or identifying a specific object.
  • the detection process and the recognition process are not always clearly separated and may overlap.
  • For example, the recognition unit 173 detects objects around the vehicle 100 by performing clustering that classifies point clouds based on sensor information from the LiDAR 153, the radar 152, or the like into blocks of points. As a result, the presence or absence, size, shape, and position of objects around the vehicle 100 are detected.
  • For example, the recognition unit 173 detects the movement of objects around the vehicle 100 by performing tracking that follows the movement of the blocks of point clouds classified by clustering. As a result, the speed and traveling direction (movement vector) of objects around the vehicle 100 are detected.
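  • As a simplified stand-in for this clustering step (a plain Euclidean region-growing rule; production systems use more elaborate methods, and the publication does not name one), points closer than a radius to any existing cluster member join that cluster:

    import numpy as np

    def euclidean_cluster(points: np.ndarray, radius: float = 0.7) -> list[list[int]]:
        # Region growing: each cluster absorbs every point within `radius`
        # of a point already in it, so one physical object becomes one blob.
        unvisited = set(range(len(points)))
        clusters = []
        while unvisited:
            seed = unvisited.pop()
            cluster, frontier = [seed], [seed]
            while frontier:
                p = frontier.pop()
                near = [q for q in unvisited
                        if np.linalg.norm(points[q] - points[p]) < radius]
                unvisited.difference_update(near)
                cluster.extend(near)
                frontier.extend(near)
            clusters.append(cluster)
        return clusters

    pts = np.array([[0.0, 0.0], [0.3, 0.1], [5.0, 5.0], [5.2, 4.9]])
    print(euclidean_cluster(pts))  # two clusters, one per object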
  • The recognition unit 173 detects or recognizes vehicles, people, bicycles, obstacles, structures, roads, traffic lights, traffic signs, road markings, and the like from the image information supplied from the camera 151. Further, the recognition unit 173 may recognize the types of objects around the vehicle 100 by performing recognition processing such as semantic segmentation.
  • For example, the recognition unit 173 can perform recognition processing of the traffic rules around the vehicle 100 based on the map stored in the map information storage unit 123, the self-position estimation result by the self-position estimation unit 171, and the recognition result of objects around the vehicle 100 by the recognition unit 173. By this processing, the recognition unit 173 can recognize the positions and states of signals, the contents of traffic signs and road markings, the contents of traffic regulations, the lanes in which the vehicle can travel, and the like.
  • the recognition unit 173 can perform recognition processing of the environment around the vehicle 100.
  • As the surrounding environment to be recognized by the recognition unit 173, weather, temperature, humidity, brightness, road surface conditions, and the like are assumed.
  • the action planning unit 162 creates an action plan for the vehicle 100.
  • the action planning unit 162 creates an action plan by performing route planning and route tracking processing.
  • route planning is a process of planning a rough route from the start to the goal.
  • This route planning also includes processing called trajectory planning: within the route planned by the route planning, a trajectory generation (local path planning) process plans a trajectory on which the vehicle 100 can travel safely and smoothly in its vicinity, taking the motion characteristics of the vehicle 100 into consideration.
  • Route planning may also be distinguished as long-term path planning, and trajectory generation as short-term path planning or local path planning; a safety-first route represents a concept similar to trajectory generation, short-term path planning, or local path planning.
  • Route tracking is a process of planning an operation for safely and accurately traveling on a route planned by route planning within a planned time.
  • the action planning unit 162 can calculate the target speed and the target angular velocity of the vehicle 100, for example, based on the result of this route tracking process.
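  • The publication does not spell out how these targets are computed; as one common illustration (a pure-pursuit-style law with invented names, not the patent's own control method), a speed and angular-velocity command toward the next waypoint could be derived as:

    import math

    def pursuit_command(x: float, y: float, yaw: float,
                        wx: float, wy: float, v: float = 2.0) -> tuple[float, float]:
        # Steer so the vehicle arcs onto the waypoint:
        # curvature = 2*sin(bearing error)/lookahead, omega = v*curvature.
        dx, dy = wx - x, wy - y
        lookahead = math.hypot(dx, dy)
        alpha = math.atan2(dy, dx) - yaw        # bearing error
        return v, v * 2.0 * math.sin(alpha) / lookahead

    v, omega = pursuit_command(0.0, 0.0, 0.0, 4.0, 1.0)
    print(f"target speed {v} m/s, target angular velocity {omega:.3f} rad/s")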
  • the motion control unit 163 controls the motion of the vehicle 100 in order to realize the action plan created by the action plan unit 162.
  • For example, the motion control unit 163 controls the steering control unit 181, the brake control unit 182, and the drive control unit 183 included in the vehicle control unit 132 described later, and performs acceleration/deceleration control and direction control so that the vehicle 100 travels along the trajectory calculated by the trajectory planning.
  • For example, the motion control unit 163 performs coordinated control for the purpose of realizing ADAS functions such as collision avoidance or impact mitigation, follow-up traveling, vehicle-speed-maintaining traveling, collision warning of the own vehicle, and lane departure warning of the own vehicle.
  • Further, for example, the motion control unit 163 performs coordinated control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without the driver's operation.
  • the DMS 130 performs driver authentication processing, driver status recognition processing, and the like based on sensor information from the in-vehicle sensor 126 and input information input to HMI 131, which will be described later.
  • As the state of the driver to be recognized by the DMS 130, for example, physical condition, arousal level, concentration level, fatigue level, line-of-sight direction, degree of drunkenness, driving operation, posture, and the like are assumed.
  • The DMS 130 may perform authentication processing for passengers other than the driver and recognition processing for the states of those passengers. Further, for example, the DMS 130 may perform recognition processing of the situation inside the vehicle based on sensor information from the in-vehicle sensor 126. As the situation inside the vehicle to be recognized, for example, temperature, humidity, brightness, odor, and the like are assumed.
  • the HMI 131 inputs various information and instructions, and presents various information to the driver and the like.
  • the input of information by the HMI 131 will be outlined.
  • the HMI 131 includes an input device for a person to input information.
  • the HMI 131 generates an input signal based on information, instructions, and the like input by the input device, and supplies the input signal to each part of the vehicle control system 111.
  • the HMI 131 includes an operator such as a touch panel, a button, a switch, and a lever as an input device.
  • the HMI 131 may further include an input device capable of inputting information by a method other than manual operation by voice, gesture, or the like.
  • the HMI 131 may use, for example, a remote control device using infrared rays or radio waves, or an externally connected device such as a mobile device or a wearable device corresponding to the operation of the vehicle control system 111 as an input device.
  • the presentation of information by the HMI 131 will be outlined.
  • the HMI 131 generates visual information, auditory information, and tactile information for the passenger or the outside of the vehicle. Further, the HMI 131 performs output control for controlling the output, output content, output timing, output method, etc. of each of the generated information.
  • As visual information, the HMI 131 generates and outputs, for example, an image such as an operation screen, a status display of the vehicle 100, a warning display, a monitor image showing the situation around the vehicle 100, or information indicated by light.
  • the HMI 131 generates and outputs as auditory information, for example, information indicated by sounds such as voice guidance, warning sounds, and warning messages. Further, the HMI 131 generates and outputs tactile information that is given to the tactile sensation of the occupant by, for example, force, vibration, movement, or the like.
  • As an output device from which the HMI 131 outputs visual information, for example, a display device that presents visual information by displaying an image itself, or a projector device that presents visual information by projecting an image, can be applied.
  • the display device may be a device that displays visual information within the passenger's field of view, such as a head-up display, a transmissive display, or a wearable device having an AR (Augmented Reality) function.
  • the HMI 131 can also use a display device included in a navigation device, an instrument panel, a CMS (Camera Monitoring System), an electronic mirror, a lamp, etc. provided in the vehicle 100 as an output device for outputting visual information.
  • As an output device from which the HMI 131 outputs auditory information, for example, audio speakers, headphones, and earphones can be applied.
  • a haptics element using haptics technology can be applied as an output device for which the HMI 131 outputs tactile information.
  • the haptic element is provided in a portion of the vehicle 100 that an occupant of the vehicle contacts, such as the steering wheel and the seats.
  • the vehicle control unit 132 controls each part of the vehicle 100.
  • the vehicle control unit 132 includes a steering control unit 181, a brake control unit 182, a drive control unit 183, a body system control unit 184, a light control unit 185, and a horn control unit 186.
  • the steering control unit 181 detects and controls the state of the steering system of the vehicle 100.
  • the steering system includes, for example, a steering mechanism including a steering wheel, electric power steering, and the like.
  • the steering control unit 181 includes, for example, a control unit such as an ECU that controls the steering system, an actuator that drives the steering system, and the like.
  • the brake control unit 182 detects and controls the state of the brake system of the vehicle 100.
  • the brake system includes, for example, a brake mechanism including a brake pedal, an ABS (Antilock Brake System), a regenerative brake mechanism, and the like.
  • the brake control unit 182 includes, for example, a control unit such as an ECU that controls the brake system.
  • the drive control unit 183 detects and controls the state of the drive system of the vehicle 100.
  • the drive system includes, for example, an accelerator pedal, a driving force generator for generating a driving force such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, and the like.
  • the drive control unit 183 includes, for example, a control unit such as an ECU that controls the drive system.
  • the body system control unit 184 detects and controls the state of the body system of the vehicle 100.
  • the body system includes, for example, a keyless entry system, a smart key system, a power window device, a power seat, an air conditioner, an airbag, a seat belt, a shift lever, and the like.
  • the body system control unit 184 includes, for example, a control unit such as an ECU that controls the body system.
  • the light control unit 185 detects and controls various light states of the vehicle 100.
  • As the light to be controlled, for example, a headlight, a backlight, a fog light, a turn signal, a brake light, a projection, a bumper display, and the like are assumed.
  • the light control unit 185 includes a control unit such as an ECU that controls the light.
  • the horn control unit 186 detects and controls the state of the car horn of the vehicle 100.
  • the horn control unit 186 includes, for example, a control unit such as an ECU that controls the car horn.
  • FIG. 7 is a diagram showing an example of a sensing region of the external recognition sensor 125 of FIG. 6 by a camera 151, a radar 152, a LiDAR 153, an ultrasonic sensor 154, and the like. Note that FIG. 7 schematically shows a view of the vehicle 100 from above, with the left end side being the front end (front) side of the vehicle 100 and the right end side being the rear end (rear) side of the vehicle 100.
  • the sensing area 201F and the sensing area 201B show an example of the sensing area of the ultrasonic sensor 154.
  • the sensing region 201F covers the vicinity of the front end of the vehicle 100 by a plurality of ultrasonic sensors 154.
  • the sensing region 201B covers the periphery of the rear end of the vehicle 100 by a plurality of ultrasonic sensors 154.
  • the sensing results in the sensing area 201F and the sensing area 201B are used, for example, for parking support of the vehicle 100 and the like.
  • the sensing area 202F to the sensing area 202B show an example of the sensing area of the radar 152 for a short distance or a medium distance.
  • the sensing area 202F covers a position farther than the sensing area 201F in front of the vehicle 100.
  • the sensing region 202B covers the rear of the vehicle 100 to a position farther than the sensing region 201B.
  • the sensing area 202L covers the rear periphery of the left side surface of the vehicle 100.
  • the sensing region 202R covers the rear periphery of the right side surface of the vehicle 100.
  • the sensing result in the sensing area 202F is used, for example, for detecting a vehicle, a pedestrian, or the like existing in front of the vehicle 100.
  • the sensing result in the sensing region 202B is used, for example, for a collision prevention function behind the vehicle 100.
  • the sensing results in the sensing area 202L and the sensing area 202R are used, for example, for detecting an object in a blind spot on the side of the vehicle 100.
  • the sensing area 203F to the sensing area 203B show an example of the sensing area by the camera 151.
  • the sensing area 203F covers a position farther than the sensing area 202F in front of the vehicle 100.
  • the sensing region 203B covers the rear of the vehicle 100 to a position farther than the sensing region 202B.
  • the sensing area 203L covers the periphery of the left side surface of the vehicle 100.
  • the sensing region 203R covers the periphery of the right side surface of the vehicle 100.
  • the sensing result in the sensing area 203F can be used, for example, for recognition of traffic lights and traffic signs, a lane departure prevention support system, and an automatic headlight control system.
  • the sensing result in the sensing area 203B can be used, for example, for parking assistance and a surround view system.
  • the sensing results in the sensing area 203L and the sensing area 203R can be used, for example, in a surround view system.
  • Sensing area 204 shows an example of the sensing area of LiDAR153.
  • the sensing region 204 covers a position farther than the sensing region 203F in front of the vehicle 100.
  • the sensing area 204 has a narrower range in the left-right direction than the sensing area 203F.
  • the sensing result in the sensing area 204 is used for detecting an object such as a peripheral vehicle, for example.
  • the sensing area 205 shows an example of the sensing area of the radar 152 for a long distance.
  • the sensing region 205 covers a position farther than the sensing region 204 in front of the vehicle 100.
  • the sensing area 205 has a narrower range in the left-right direction than the sensing area 204.
  • the sensing result in the sensing area 205 is used for, for example, ACC (Adaptive Cruise Control), emergency braking, collision avoidance, and the like.
  • the sensing areas of the cameras 151, the radar 152, the LiDAR 153, and the ultrasonic sensors 154 included in the external recognition sensor 125 may have various configurations other than those in FIG. 7. Specifically, the ultrasonic sensor 154 may also sense the side of the vehicle 100, or the LiDAR 153 may sense the rear of the vehicle 100. Further, the installation position of each sensor is not limited to each of the above-mentioned examples. Further, the number of each sensor may be one or a plurality.
  • Such a vehicle control system 111 generates movement amount information indicating the movement amount of the vehicle 100 calculated based on the rotation amount of the wheels detected by the vehicle sensor 127, and generates gradient detection information based on the acceleration and the angular velocity detected by the vehicle sensor 127. Further, the self-position estimation unit 171 corrects the self-position based on the road surface gradient information from the server 30 received by the communication unit 122, and generates corrected self-position information. Further, the vehicle control system 111 transmits the generated movement amount information, the gradient detection information, and the corrected self-position information from the communication unit 122 to the server 30.
  • the vehicle control system 111 outputs the road surface gradient information from the server 30 received by the communication unit 122 to the self-position estimation unit 171 and the route plan to the action planning unit 162, respectively.
  • the action planning unit 162 performs route tracking for planning an operation for safely and accurately traveling on the route planned by the route plan, generates control information for controlling the operation of the vehicle 100, and outputs it to the motion control unit 163.
  • Thereby, the vehicle 100 can be moved to the destination along the optimum route. Further, since the vehicle 100 can accurately calculate its own position based on its travel amount and the road surface gradient information, the current position of the vehicle 100 can be detected accurately even when the position information receiving unit 124 cannot receive position information.
  • the technique according to the present disclosure is not limited to vehicles such as automobiles, electric vehicles, and hybrid electric vehicles, and may also be realized by a mobile device such as a construction machine or an agricultural machine (tractor).
  • the series of processes described in the specification can be executed by hardware, software, or a composite configuration of both.
  • For example, the program recording the processing sequence is installed in a memory in a computer incorporated in dedicated hardware and executed.
  • the program can be recorded in advance on a hard disk, an SSD (Solid State Drive), or a ROM (Read Only Memory) as a recording medium.
  • Alternatively, the program can be stored (recorded) temporarily or permanently on a removable recording medium such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-ray Disc (registered trademark)), a magnetic disc, or a semiconductor memory card.
  • Such removable recording media can be provided as so-called package software.
  • the program may be transferred from the download site to the computer wirelessly or by wire via a network such as LAN (Local Area Network) or the Internet.
  • the computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
  • the information processing device of the present technology can have the following configurations.
  • (1) An information processing device including a map information processing unit that corrects the gradient information set in map information, based on the map information in which the gradient information is set and traveling information acquired by a moving device.
  • (2) The information processing device according to (1), wherein the map information processing unit uses gradient detection information generated based on the traveling information acquired by the moving device to correct the gradient information at the position of the map information corresponding to the moving position at which the gradient detection information was generated.
  • (3) The information processing device according to (2), wherein the map information processing unit performs filter processing using gradient detection information already generated at the same moving position as the gradient detection information generated based on the traveling information acquired by the moving device, and corrects the gradient information at the position of the map information corresponding to the moving position using the gradient information after the filter processing.
  • (4) The information processing device according to any one of (1) to (3), wherein the map information processing unit generates map information in which gradient information is set, based on map information in which gradient information is not set and the traveling information acquired by the moving device.
  • (5) The information processing device according to any one of (2) to (4), wherein the traveling information includes sensor information acquired by a sensor provided in the moving device, and the gradient detection information is generated based on the sensor information.
  • (6) The information processing device according to (5), wherein the sensor information is information that detects at least one of translational motion and rotational motion of the moving device.
  • (7) The information processing device according to (5) or (6), wherein the sensor is an inertial measurement unit.
  • (8) The information processing device according to (7), wherein the inertial measurement unit includes at least one of an acceleration sensor and an angular velocity sensor, the translational motion being detected by the acceleration sensor and the rotational motion being detected by the angular velocity sensor.
  • (9) The information processing device according to any one of (1) to (8), further comprising a route planning unit that creates a route plan to a destination based on the map information corrected by the map information processing unit.
  • (10) The information processing device according to (9), wherein the route planning unit creates the route plan using the gradient information set in the map information.
  • (11) The information processing device according to any one of (1) to (10), wherein the moving device includes a self-position estimation unit that performs self-position estimation using the gradient information set in the map information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)

Abstract

A map information processing unit 32 corrects, on the basis of map information in which gradient information is set and traveling information acquired by a mobile device, the gradient information set in the map information. The traveling information includes sensor information acquired by a sensor provided in the mobile device 20, and the gradient information is calculated on the basis of the sensor information. The map information processing unit 32 corrects, according to gradient detection information generated on the basis of the traveling information acquired by the mobile device 20 and gradient information after filtering processing using gradient detection information already generated at the same movement position, the gradient information corresponding to the movement position set in the map information. If the gradient information is not set in the map information, the map information processing unit 32 sets, in the map information, the gradient detection information generated on the basis of the traveling information acquired by the mobile device. The map information processing unit 32 can generate the map information in which correct gradient information is set.

Description

Information processing device, information processing method, and program
This technology relates to an information processing device, an information processing method, and a program, and makes it possible to obtain map information in which correct gradient information is set.
Conventionally, in order for a mobile device to move autonomously, a sensor such as LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) is used to measure the surrounding environment and generate map information. Further, in Patent Document 1, map information in which gradient information is set is used to control a cleaning robot so that it does not enter a gradient area in which it cannot travel.
Patent Document 1: Japanese Unexamined Patent Publication No. 2019-109845
However, the map information used in Patent Document 1 is in-home map information associated with gradient information indicating gradients at which cleaning operation is possible; it covers only a limited, narrow area and is not wide-area map information covering the outdoors and the like. Further, easily available map information covering a wide area generally indicates positions in a two-dimensional plane and is not provided with gradient information.
Therefore, the purpose of this technology is to provide an information processing device, an information processing method, and a program capable of obtaining map information in which correct gradient information is set.
A first aspect of this technology is an information processing device including a map information processing unit that corrects the gradient information set in map information, based on the map information in which the gradient information is set and traveling information acquired by a moving device.
In this technology, the map information processing unit corrects the gradient information set in the map information based on the map information in which the gradient information is set and the traveling information acquired by the moving device. Further, when gradient information is not set in the map information, the map information processing unit generates map information in which gradient information is set, based on the map information without gradient information and the traveling information acquired by the moving device. The traveling information includes sensor information acquired by a sensor provided in the moving device, and the gradient information is calculated based on the sensor information. The sensor information is information that detects at least one of translational motion and rotational motion of the moving device; for example, using an inertial measurement unit including at least one of an acceleration sensor and an angular velocity sensor, translational motion is detected by the acceleration sensor and rotational motion by the angular velocity sensor.
The map information processing unit uses the gradient detection information generated based on the traveling information acquired by the moving device to correct the gradient information at the position of the map information corresponding to the moving position at which the gradient detection information was generated. For example, the map information processing unit corrects the gradient information at the position of the map information corresponding to the moving position using gradient information obtained by filter processing of the gradient detection information generated based on the traveling information acquired by the moving device together with gradient detection information already generated at the same moving position.
The route planning unit creates a route plan to the destination based on the map information generated or corrected by the map information processing unit, using the gradient information set in the map information. The moving device includes a self-position estimation unit that performs self-position estimation using the gradient information set in the map information.
A second aspect of this technology is an information processing method including correcting, by a map information processing unit, the gradient information set in map information based on the map information in which the gradient information is set and traveling information acquired by a moving device.
A third aspect of this technology is a program that causes a computer to execute processing using information acquired by a moving device, the program causing the computer to execute a procedure for correcting the gradient information set in map information based on the map information in which the gradient information is set and the traveling information acquired by the moving device.
The program of the present technology is, for example, a program that can be provided to a general-purpose computer capable of executing various program codes, via a storage medium provided in a computer-readable format (for example, an optical disc, a magnetic disk, or a semiconductor memory) or via a communication medium such as a network. By providing such a program in a computer-readable format, processing according to the program is realized on the computer.
FIG. 1 is a diagram illustrating the configuration of a system using the information processing device.
FIG. 2 is a sequence diagram illustrating the operation of the system.
FIG. 3 is a diagram for explaining the operation of creating route search map information.
FIG. 4 is a diagram showing an example of creating a route plan.
FIG. 5 is a diagram for explaining corrected self-position information.
FIG. 6 is a diagram showing a configuration example of the vehicle control system.
FIG. 7 is a diagram showing an example of sensing regions.
Hereinafter, modes for implementing the present technology will be described. The description will be given in the following order.
1. System configuration
2. System operation
3. Application example
<1. System configuration>
FIG. 1 illustrates the configuration of a system using the information processing device of the present technology. The system 10 has a mobile device 20 and a server 30, and the server 30 is provided with the functions of the information processing device. The mobile device 20 performs movement operations based on the route plan supplied from the server 30. The server 30 corrects the gradient information set in the map information based on the map information in which the gradient information is set and the traveling information acquired by the mobile device 20. Further, when map information with gradient information has not been generated, the server 30 generates map information in which gradient information is set, based on map information without gradient information and the traveling information acquired by the mobile device 20. Furthermore, the server 30 creates a route plan using map information in which road surface gradient information is set (hereinafter referred to as a "route search map").
The mobile device 20 has a wheel odometry 21, a gradient detection unit 22, a self-position estimation unit 23, a communication unit 24, a route following unit 25, a drive control unit 26, and a drive unit 27.
The wheel odometry 21 measures the rotation amount of the wheels (rotation speed and rotation angle) and calculates the movement amount of the mobile device 20. The wheel odometry 21 outputs movement amount information indicating the calculated movement amount to the self-position estimation unit 23 and the communication unit 24. The movement amount is not limited to being calculated from the rotation amount of the wheels; the rotation amount of a drive gear or the like that rotates the wheels may also be used. A rough illustration of this calculation is given in the sketch below.
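As a minimal sketch of this step (not part of the patent; the encoder parameters and function name are assumptions for illustration), the movement amount can be derived from wheel encoder ticks and the wheel circumference:

```python
import math

def wheel_odometry_distance(ticks: int, ticks_per_rev: int, wheel_radius_m: float) -> float:
    """Convert encoder ticks into distance travelled along the road surface.

    ticks: encoder ticks counted since the last update
    ticks_per_rev: encoder resolution (ticks per full wheel revolution)
    wheel_radius_m: wheel radius in meters
    """
    revolutions = ticks / ticks_per_rev
    return revolutions * 2.0 * math.pi * wheel_radius_m

# Example: 512-tick encoder, 0.15 m wheel, 2048 ticks since the last update
print(wheel_odometry_distance(2048, 512, 0.15))  # ~3.77 m along the surface
```

Note that this distance is measured along the road surface, not on the 2D map plane, which is why the gradient correction described later is needed.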
The gradient detection unit 22 detects the gradient of the road surface. The gradient detection unit 22 is configured using, for example, an inertial measurement unit (IMU), and detects the gradient of the road surface based on sensor information generated by the IMU. The sensor information is information that detects at least one of the translational motion and the rotational motion of the mobile device; the IMU includes at least an acceleration sensor and an angular velocity sensor, detecting translational motion with the acceleration sensor and rotational motion with the angular velocity sensor. The gradient detection unit 22 generates gradient detection information indicating the gradient of the road surface on which the mobile device 20 is traveling, from at least one of the detected translational motion and rotational motion. For example, the gradient detection unit 22 generates gradient detection information indicating a gradient calculated using the methods described in JP 2003-097945 A, JP 2013-044562 A, and the like, and outputs it to the communication unit 24.
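The patent does not fix a particular formula for this step; as one common way to obtain a pitch (slope) angle from IMU accelerometer data, the measured acceleration vector can be compared against gravity while the vehicle is not accelerating (the axis convention and function name below are assumptions):

```python
import math

def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Estimate the pitch angle (rad) from a 3-axis accelerometer reading,
    assuming the vehicle is not accelerating so that the measured vector
    is dominated by gravity. Axes assumed: x forward, y left, z up."""
    return math.atan2(ax, math.sqrt(ay * ay + az * az))

# A vehicle at rest on a 5-degree upward slope measures roughly:
g = 9.81
theta = math.radians(5.0)
print(math.degrees(pitch_from_accel(g * math.sin(theta), 0.0, g * math.cos(theta))))  # ~5.0
```

In practice such a raw estimate is noisy, which is exactly why the server-side filter processing described below aggregates many observations per position.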
The self-position estimation unit 23 estimates the self-position based on the movement amount information calculated by the wheel odometry 21. Furthermore, the self-position estimation unit 23 corrects the estimated self-position based on the road surface gradient information from the server 30 received by the communication unit 24. The self-position estimation unit 23 outputs the result of self-position estimation using the gradient information, that is, corrected self-position information indicating the self-position corrected based on the road surface gradient information, to the communication unit 24 and the route following unit 25. Further, when road surface gradient information has not been obtained, the self-position estimation unit 23 may request or update the road surface gradient information (or the route search map information) from the server 30.
The communication unit 24 performs wireless communication with the server 30. The wireless communication may include, for example, cellular communication using any of LTE, WCDMA (registered trademark), 5G, and the like, or short-range wireless communication using Wi-Fi, Bluetooth (registered trademark), or the like. The communication unit 24 transmits, for example, the movement amount information generated by the wheel odometry 21, the gradient detection information generated by the gradient detection unit 22, and the corrected self-position information generated by the self-position estimation unit 23 to the server 30. Further, the communication unit 24 receives the road surface gradient information (or route search map information) and the route plan transmitted from the server 30, and outputs the road surface gradient information to the self-position estimation unit 23 and the route plan (or the route plan and the route search map information) to the route following unit 25.
The route following unit 25 causes the mobile device 20 to follow the route indicated by the route plan supplied from the server 30. Based on the route plan acquired from the server 30 and the corrected self-position information generated by the self-position estimation unit 23, the route following unit 25 recognizes the surrounding environment, searches for a trajectory to the destination, and plans the trajectory of the mobile device 20 so that it can move while avoiding collisions with obstacles and the like. The route following unit 25 outputs the planned trajectory to the drive control unit 26. The surrounding environment may be recognized using the route search map information acquired from the server 30, or using sensor signals from external sensors provided on the mobile device 20 (for example, a ranging sensor such as LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) or ToF (Time of Flight), a stereo camera, or the like).
The drive control unit 26 generates a drive signal so that the mobile device moves along the trajectory planned by the route following unit 25, and outputs it to the drive unit 27.
The drive unit 27 is configured using wheels, a drive source, and the like, and drives the wheels with the drive source based on the drive signal from the drive control unit 26 to move the mobile device 20.
The server 30 has a communication unit 31, a map information processing unit 32, and a route planning unit 33.
The communication unit 31 is configured to be able to perform wireless communication with the communication unit 24 of the mobile device 20. The wireless communication may include cellular communication as described above, or short-range wireless communication. The communication unit 31 wirelessly communicates with the mobile device 20 and receives the information generated by the mobile device 20, such as the movement amount information and the gradient detection information. The communication unit 31 outputs the received movement amount information, corrected self-position information, and gradient detection information to the map information processing unit 32. Further, the communication unit 31 transmits the route plan and the like created by the route planning unit 33 to the mobile device 20.
Based on map information acquired in advance and the gradient detection information generated by the mobile device 20, the map information processing unit 32 generates route search map information in which road surface gradient information is set, or corrects the road surface gradient information set in the route search map information. The map information processing unit 32 has an information storage unit 321 and a road surface gradient information generation unit 322.
The information storage unit 321 stores, for example, easily obtainable two-dimensional plane map information acquired from an external device or the like. Further, when route search map information in which road surface gradient information is set in the two-dimensional plane map information has been generated, the information storage unit 321 stores the generated route search map information. Furthermore, the information storage unit 321 stores gradient detection information generated by devices other than the mobile device 20 and gradient detection information generated in the past by the mobile device 20, together with position information indicating the positions where the gradients were detected.
The road surface gradient information generation unit 322 corrects the road surface gradient information set in the route search map information using the gradient detection information generated by the mobile device 20. Further, when route search map information has not been generated and no gradient detection information is stored in the information storage unit 321, the road surface gradient information generation unit 322 sets the gradient detection information generated by the mobile device 20 in the map information as road surface gradient information to generate route search map information.
When generating road surface gradient information, the road surface gradient information generation unit 322 could, for example, average the gradient detection information over a predetermined traveling period in order to reduce the influence of vibration of the mobile device 20, noise in the gradient detection unit 22, and the like. In that case, however, the road surface gradient information would have reduced resolution in the traveling direction. Therefore, the road surface gradient information generation unit 322 performs filter processing or statistical processing (hereinafter collectively referred to as "filter processing") using the gradient detection information generated by the mobile device 20 and the gradient detection information of the same moving position already generated and stored in the information storage unit 321, thereby generating road surface gradient information that is less affected by vibration of the mobile device 20, noise in the gradient detection unit 22, and the like, without lowering the resolution in the traveling direction. For the moving position, the movement amount information acquired from the mobile device 20 and the gradient detection information stored in the information storage unit 321 for the position indicated by the movement amount information are used to perform the same correction as in the self-position estimation unit 23 of the mobile device 20, and the filter processing is performed using the gradient detection information at the corrected self-position. Alternatively, the road surface gradient information generation unit 322 may use the gradient detection information at the position indicated by the corrected self-position information generated by the self-position estimation unit 23 of the mobile device 20.
The road surface gradient information generation unit 322 may generate the road surface gradient information by performing filter processing on multiple pieces of gradient detection information using, for example, a Kalman filter or a particle filter, or may statistically process multiple pieces of gradient detection information, calculate the mode, median, or mean, and use the calculation result as the road surface gradient information; a sketch of the statistical variant follows this paragraph. The road surface gradient information generation unit 322 sets the generated road surface gradient information in the map information acquired from an external device or the like, or corrects the road surface gradient information set in the route search map information with the filtered road surface gradient information. Furthermore, the road surface gradient information generation unit 322 updates the route search map information stored in the information storage unit 321 to the route search map information with the corrected road surface gradient information. The road surface gradient information generation unit 322 also outputs the road surface gradient information to the source of the gradient detection information.
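As a minimal sketch of the statistical variant of this filter processing (the grid resolution, data layout, and class name are illustrative assumptions, not taken from the patent), gradient observations can be grouped by quantized position and reduced to a per-cell median:

```python
from collections import defaultdict
from statistics import median

class GradientMap:
    """Accumulates gradient observations (degrees) per quantized position
    and reduces them with a median, which is robust against outliers
    caused by vehicle vibration or IMU noise."""

    def __init__(self, cell_size_m: float = 1.0):
        self.cell_size = cell_size_m
        self.observations = defaultdict(list)  # cell -> list of gradients

    def _cell(self, x: float, y: float):
        return (round(x / self.cell_size), round(y / self.cell_size))

    def add_observation(self, x: float, y: float, gradient_deg: float):
        self.observations[self._cell(x, y)].append(gradient_deg)

    def gradient_at(self, x: float, y: float):
        obs = self.observations.get(self._cell(x, y))
        return median(obs) if obs else None

gmap = GradientMap(cell_size_m=1.0)
for g in (4.8, 5.3, 5.1, 9.0):   # 9.0 is a vibration outlier
    gmap.add_observation(12.3, 40.2, g)
print(gmap.gradient_at(12.3, 40.2))  # 5.2: the outlier is suppressed
```

Because each cell keeps its own statistic, the resolution in the traveling direction is preserved, unlike simple averaging over a traveling period.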
Based on the road surface gradient information indicated in the up-to-date route search map information, the route planning unit 33 creates a route plan from the corrected self-position indicated by the corrected self-position information generated by the mobile device 20 to the destination. The route planning unit 33 creates a route plan indicating the optimum route using the road surface gradient information set in the route search map information, and outputs the created route plan from the communication unit 31 to the mobile device 20.
<2. System operation>
FIG. 2 is a sequence diagram illustrating the operation of the system. In step ST1, the server 30 acquires map information provided by an external device or the like and stores it in the information storage unit 321 of the map information processing unit 32.
In step ST2, the mobile device 20 outputs the movement amount information indicating the movement amount calculated by the wheel odometry 21 to the server 30.
In step ST3, the server 30 outputs gradient detection information to the road surface gradient information generation unit 322. The information storage unit 321 of the map information processing unit 32 extracts, from the stored gradient detection information, the gradient detection information at the position equal to the moving position based on the movement amount information supplied from the mobile device 20, and outputs it to the road surface gradient information generation unit 322. When road surface gradient information is stored in the information storage unit 321, the information storage unit 321 extracts the gradient detection information at the position equal to the self-position estimated in the same manner as the self-position estimation unit 23 of the mobile device 20 (corresponding to the self-position indicated by the corrected self-position information described later) based on the movement amount information and the road surface gradient information, and outputs it to the road surface gradient information generation unit 322.
In step ST4, the mobile device 20 outputs the gradient detection information to the server 30. The mobile device 20 outputs to the server 30 the gradient detection information generated by the gradient detection unit 22 at the position based on the movement amount information and the road surface gradient information. The gradient detection unit 22 of the mobile device 20 generates gradient detection information according to the posture of the mobile device based on, for example, the acceleration, angular acceleration, and the like generated by the IMU. Note that when the posture of the mobile device changes according to vibration or the loading state of articles, the gradient detection information contains an error with respect to the actual gradient of the road surface.
In step ST5, the server 30 outputs road surface gradient information to the mobile device 20. The road surface gradient information generation unit 322 of the map information processing unit 32 performs filter processing using the gradient detection information supplied from the information storage unit 321 in step ST3 and the gradient detection information supplied from the mobile device 20 in step ST4, generates road surface gradient information indicating the gradient of the road surface at the moving position of the mobile device 20, and outputs it to the mobile device 20. Further, the road surface gradient information generation unit 322 corrects the road surface gradient information of the route search map information stored in the information storage unit 321 using the generated road surface gradient information. Furthermore, when route search map information is not stored in the information storage unit 321, the road surface gradient information generation unit 322 sets the road surface gradient information in map information acquired from an external device or the like and stores the resulting route search map information in the information storage unit 321.
FIG. 3 is a diagram for explaining the operation of creating route search map information. FIG. 3(a) illustrates a state in which the mobile device 20 is moving on a road surface with gradient θ = θa, and FIG. 3(b) illustrates the gradient detection information generated by the gradient detection unit 22 of the mobile device 20 at the moving position Pm. FIG. 3(c) illustrates gradient detection information already acquired at the same moving position by the mobile device 20 or another mobile device.
The road surface gradient information generation unit 322 performs filter processing on the gradient detection information shown in FIG. 3(b) and the gradient detection information of the same moving position Pm shown in FIG. 3(c), and generates road surface gradient information for each position as shown in FIG. 3(d). FIG. 3(e) shows the distribution of the gradients indicated by the gradient detection information at the moving position Pm.
Further, every time new gradient detection information is supplied, the road surface gradient information generation unit 322 adds the new gradient detection information, performs the filter processing, and corrects the road surface gradient information. The information storage unit 321 sets the road surface gradient information generated by the road surface gradient information generation unit 322 at the moving position Pm to generate the route search map information.
In step ST6, the mobile device 20 outputs the movement amount information to the self-position estimation unit 23. The wheel odometry 21 of the mobile device 20 outputs the movement amount information indicating the calculated movement amount to the self-position estimation unit 23.
In step ST7, the mobile device 20 outputs the corrected self-position information to the server 30. The self-position estimation unit 23 calculates the movement amount on the map based on the road surface gradient information supplied from the server 30 in step ST5 and the movement amount information supplied from the wheel odometry 21 in step ST6, and calculates the self-position excluding the influence of the road surface gradient. Furthermore, the self-position estimation unit 23 outputs corrected self-position information indicating this self-position to the server 30.
In step ST8, the server 30 outputs the corrected self-position information and the route search map information to the route planning unit 33. The information storage unit 321 of the map information processing unit 32 outputs to the route planning unit 33 the corrected self-position information supplied from the mobile device 20 and, from the route search map information, map information of a predetermined range including the destination with the position indicated by the corrected self-position information as a reference.
In step ST9, the server 30 outputs the route plan to the mobile device 20. The route planning unit 33 of the server 30 generates a route plan indicating the route from the current position of the mobile device, indicated by the corrected self-position information, to the destination. The route planning unit 33 generates a route plan indicating the optimum route using not only the distance on the map from the current position to the destination but also the road surface gradient information included in the route search map information. For example, the route planning unit 33 calculates a cost value for each route using a distance cost Ca set to increase as the distance increases and a gradient cost Cb set to increase as the gradient becomes steeper, and selects the route with the minimum cost value.
The route planning unit 33 determines routes that can be traced to the destination using traversable paths based on the route search map information. Here, when, for example, routes R-1 to R-n have been determined, the route planning unit 33 calculates the cost values CR-1 to CR-n of the routes based on the cost function f shown in Equation (1), where the parameter x is x = 1 to n. The cost Ce indicates, unlike the distance and the road surface gradient themselves, the cost of factors related to the road surface gradient (for example, ease of driving control or the energy required for movement), and need not be included in the calculation of the cost values CR-1 to CR-n.

  CR-x = f(Ca-x, Cb-x, Ce-x)   (1)
The route planning unit 33 selects the minimum cost value from the cost values CR-1 to CR-n and creates a route plan for moving to the destination along the route corresponding to the selected cost value.
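As a minimal sketch of this selection (the linear combination and weights are illustrative assumptions; the patent only specifies that f combines the distance, gradient, and gradient-related costs):

```python
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    distance_cost: float   # Ca: grows with route length
    gradient_cost: float   # Cb: grows with gradient steepness
    extra_cost: float = 0.0  # Ce: optional gradient-related factors (energy, control)

def route_cost(r: Route, wa: float = 1.0, wb: float = 1.0, we: float = 1.0) -> float:
    # One possible instance of f(Ca, Cb, Ce): a weighted sum.
    return wa * r.distance_cost + wb * r.gradient_cost + we * r.extra_cost

routes = [
    Route("R-a (over the hill)", distance_cost=100.0, gradient_cost=80.0, extra_cost=20.0),
    Route("R-b (around the hill)", distance_cost=140.0, gradient_cost=10.0, extra_cost=5.0),
]
best = min(routes, key=route_cost)
print(best.name)  # R-b: longer on the map, but cheaper once gradients are counted
```

This mirrors the FIG. 4 example discussed next, where the shorter route over the hill loses to the flatter detour once the gradient cost is taken into account.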
FIG. 4 shows an example of creating a route plan. FIG. 4(a) illustrates, as routes from the movement start position Ps to the destination Pg, for example, a route R-a going over a hill and a route R-b avoiding the hill. FIG. 4(b) shows the relationship between the movement distance on the map from the movement start position Ps to the destination Pg and the road surface gradient when route R-a is selected, and FIG. 4(c) shows the same relationship when route R-b is selected.
If map information that does not include road surface gradient information were used, the server 30 would create a route plan following route R-a, which has the shorter distance to the destination on the map. However, by using route search map information including road surface gradient information as in the present technology, the server 30 can determine, for example, that the actual travel distance may be longer than the distance indicated by map information without road surface gradient information, or that the energy required to move to the destination becomes larger.
Therefore, the server 30 calculates the cost values based on Equation (1) and, when the cost value of route R-b is smaller than that of route R-a, creates a route plan following route R-b.
In this way, the route planning unit 33 uses the current position and destination of the mobile device 20 and the road surface gradient information included in the route search map information to generate a route plan indicating an optimum route, for example, an easily travelable route with few ups and downs, and outputs it to the mobile device 20.
In step ST10, the mobile device 20 outputs the corrected self-position information to the route following unit 25. The self-position estimation unit 23 of the mobile device 20 outputs the corrected self-position information excluding the influence of the road surface gradient to the route following unit 25.
FIG. 5 is a diagram for explaining the corrected self-position information. When the road surface SF is an inclined surface with gradient θa and the movement amount calculated by the wheel odometry 21 as the mobile device 20 moves on the road surface SF is ML, the self-position estimated based on the movement amount ML is the position P1 on the map, as shown in FIG. 5(a). However, the actual position of the mobile device 20 is the position P2 (= P1 × cos θa). Therefore, the estimated self-position contains an error (P1 - P2).
 システム10における本技術を用いたサーバ30は、路面勾配情報を移動装置20に送信することから、図5の(b)に示すように、移動装置20が移動する路面の勾配が明らかとなる。なお、図5の(c)は、地図上の走行路(位置MPa乃至位置MPb)の位置MPaを基準とした高低差と路面勾配情報を例示している。 Since the server 30 using the present technology in the system 10 transmits the road surface gradient information to the mobile device 20, the gradient of the road surface on which the mobile device 20 moves becomes clear as shown in FIG. 5 (b). Note that FIG. 5C exemplifies the height difference and the road surface gradient information based on the position MPa of the traveling path (position MPa to position MPb) on the map.
 したがって、移動装置20は、車輪オドメトリ21で生成された移動量情報と移動装置20が移動した路面の路面勾配情報に基づいて自己位置の誤差を補正できるので、移動装置20の地図上の位置を精度よく検出できる。 Therefore, since the moving device 20 can correct the self-positioning error based on the movement amount information generated by the wheel odometry 21 and the road surface gradient information of the road surface on which the moving device 20 has moved, the position on the map of the moving device 20 can be corrected. It can be detected accurately.
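A minimal sketch of this correction, assuming the gradient of each traveled stretch is known from the road surface gradient information; projecting the wheel distance by cos θ follows directly from the relation P2 = P1 × cos θa above.

    import math

    def corrected_map_distance(odometry_segments):
        # odometry_segments: list of (wheel_distance_m, gradient_rad) pairs, where
        # wheel_distance_m is the distance ML reported by the wheel odometry 21 and
        # gradient_rad is the road surface gradient for that stretch of road.
        # Each along-slope distance is projected onto the map plane so the
        # accumulated position no longer overshoots on inclines.
        return sum(d * math.cos(theta) for d, theta in odometry_segments)

    # Example: 100 m of wheel travel on a 10-degree slope maps to about 98.5 m
    # of horizontal distance on the map.
    print(corrected_map_distance([(100.0, math.radians(10.0))]))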
In step ST11, the mobile device 20 outputs control information to the drive control unit 26. Based on the current position indicated by the corrected self-position information supplied from the self-position estimation unit 23 and the route plan supplied from the server 30, the path following unit 25 of the mobile device 20 generates control information for performing movement control so that the mobile device 20 follows the route indicated by the route plan, and outputs the control information to the drive control unit 26.

According to the present technology as described above, the server 30 can accurately calculate the road surface gradient using the gradient detection information generated by the mobile device, and can easily generate route search map information that includes correct road surface gradient information. Further, by using route search map information that includes road surface gradient information, the server 30 can create an optimum route plan that takes the road surface gradient into account. Furthermore, since the road surface gradient information is supplied to the mobile device 20, the mobile device 20 can use the road surface gradient information of the traveled road surface to detect its own position more accurately than when no road surface gradient information is available.
<3. Application example>
Next, application examples of the present technology will be described. The application examples illustrate a case where the mobile device 20 is a vehicle.
FIG. 6 is a diagram showing a configuration example of a vehicle control system 111, which is an example of a mobile device control system to which the present technology is applied.

The vehicle control system 111 is provided in the vehicle 100 and performs processing related to driving support and automated driving of the vehicle 100.

The vehicle control system 111 includes a vehicle control ECU (Electronic Control Unit) 121, a communication unit 122, a map information storage unit 123, a position information receiving unit 124, an external recognition sensor 125, an in-vehicle sensor 126, a vehicle sensor 127, a recording unit 128, a driving support/automated driving control unit 129, a DMS (Driver Monitoring System) 130, an HMI (Human Machine Interface) 131, and a vehicle control unit 132.
The vehicle control ECU 121, the communication unit 122, the map information storage unit 123, the position information receiving unit 124, the external recognition sensor 125, the in-vehicle sensor 126, the vehicle sensor 127, the recording unit 128, the driving support/automated driving control unit 129, the driver monitoring system (DMS) 130, the human machine interface (HMI) 131, and the vehicle control unit 132 are communicably connected to one another via a communication network 141. The communication network 141 is composed of, for example, an in-vehicle communication network or bus conforming to a digital bidirectional communication standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), or Ethernet (registered trademark). The communication network 141 may be used selectively depending on the type of information to be communicated; for example, CAN is applied to information related to vehicle control, and Ethernet is applied to large-volume information. The units of the vehicle control system 111 may also be connected directly, without going through the communication network 141, using wireless communication intended for relatively short distances, such as near field communication (NFC) or Bluetooth (registered trademark).

Hereinafter, when the units of the vehicle control system 111 communicate via the communication network 141, the description of the communication network 141 is omitted. For example, when the vehicle control ECU 121 and the communication unit 122 communicate via the communication network 141, it is simply stated that the vehicle control ECU 121 and the communication unit 122 communicate.
The vehicle control ECU 121 is composed of various processors such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit). The vehicle control ECU 121 controls all or some of the functions of the vehicle control system 111.

The communication unit 122 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, and the like, and transmits and receives various kinds of information. The communication unit 122 can communicate using a plurality of communication schemes.
Communication with the outside of the vehicle that the communication unit 122 can execute will be described briefly. The communication unit 122 communicates with a server on an external network (hereinafter referred to as an external server) or the like via a base station or an access point, using a wireless communication scheme such as 5G (fifth-generation mobile communication system), LTE (Long Term Evolution), or DSRC (Dedicated Short Range Communications). The external network with which the communication unit 122 communicates is, for example, the Internet, a cloud network, or an operator-specific network. The communication scheme used by the communication unit 122 for the external network is not particularly limited as long as it is a wireless communication scheme capable of digital bidirectional communication at a communication speed equal to or higher than a predetermined value and over a distance equal to or longer than a predetermined value.

Further, for example, the communication unit 122 can communicate with a terminal present near the own vehicle using P2P (Peer To Peer) technology. Terminals present near the own vehicle include, for example, terminals worn by moving bodies that move at relatively low speed, such as pedestrians and bicycles, terminals installed at fixed positions in stores and the like, and MTC (Machine Type Communication) terminals. Furthermore, the communication unit 122 can also perform V2X communication. V2X communication refers to communication between the own vehicle and others, such as vehicle-to-vehicle communication with other vehicles, vehicle-to-infrastructure communication with roadside units and the like, vehicle-to-home communication, and vehicle-to-pedestrian communication with terminals carried by pedestrians.

The communication unit 122 can receive, for example, a program for updating software that controls the operation of the vehicle control system 111 from the outside (Over The Air). The communication unit 122 can further receive map information, traffic information, information on the surroundings of the vehicle 100, and the like from the outside. Further, for example, the communication unit 122 can transmit information about the vehicle 100, information on the surroundings of the vehicle 100, and the like to the outside. Information about the vehicle 100 transmitted to the outside includes, for example, information indicating the state of the vehicle 100 and recognition results from the recognition unit 173. Furthermore, for example, the communication unit 122 performs communication corresponding to a vehicle emergency call system such as eCall.
Communication with the inside of the vehicle that the communication unit 122 can execute will be described briefly. The communication unit 122 can communicate with each device in the vehicle using, for example, wireless communication. The communication unit 122 can wirelessly communicate with in-vehicle devices using a communication scheme capable of digital bidirectional communication at a communication speed equal to or higher than a predetermined value, such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB). The communication unit 122 is not limited to this and can also communicate with each device in the vehicle using wired communication. For example, the communication unit 122 can communicate with each device in the vehicle by wired communication via a cable connected to a connection terminal (not shown), using a communication scheme capable of digital bidirectional communication at a communication speed equal to or higher than a predetermined value, such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link).

Here, in-vehicle devices refer to, for example, devices in the vehicle that are not connected to the communication network 141. Examples include mobile devices and wearable devices carried by occupants such as the driver, and information devices brought into the vehicle and temporarily installed.

For example, the communication unit 122 receives electromagnetic waves transmitted by a road traffic information communication system (VICS (registered trademark) (Vehicle Information and Communication System)), such as radio wave beacons, optical beacons, and FM multiplex broadcasting.
The map information storage unit 123 stores one or both of a map acquired from the outside and a map created by the vehicle 100. For example, the map information storage unit 123 stores a three-dimensional high-precision map and a global map that is less accurate than the high-precision map but covers a wider area.

High-precision maps include, for example, dynamic maps, point cloud maps, and vector maps. The dynamic map is, for example, a map consisting of four layers of dynamic information, semi-dynamic information, semi-static information, and static information, and is provided to the vehicle 100 from an external server or the like. The point cloud map is a map composed of point clouds (point cloud information). Here, the vector map refers to a map adapted to ADAS (Advanced Driver Assistance System) in which traffic information such as lane and traffic light positions is associated with a point cloud map.

The point cloud map and the vector map may be provided from, for example, an external server, or may be created by the vehicle 100 as maps for matching with a local map described later, based on sensing results from the radar 152, the LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 153, and the like, and stored in the map information storage unit 123. When a high-precision map is provided from an external server or the like, map information of, for example, several hundred meters square concerning the planned route on which the vehicle 100 will travel is acquired from the external server in order to reduce the communication volume.
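One way such route-limited acquisition could be organized is to request only the map tiles that the planned route passes through; the tile size and the function below are assumptions for illustration, not part of the disclosure.

    TILE_SIZE_M = 200.0  # hypothetical tile edge length, "several hundred meters square"

    def tiles_for_route(route_points):
        # route_points: (x_m, y_m) positions along the planned route in a planar
        # map frame. Returns the set of tile indices to request from the external
        # server, so that only map data near the planned route is downloaded.
        return {(int(x // TILE_SIZE_M), int(y // TILE_SIZE_M)) for x, y in route_points}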
The position information receiving unit 124 receives GNSS signals from, for example, GNSS (Global Navigation Satellite System) satellites and acquires position information of the vehicle 100. The received GNSS signals are supplied to the driving support/automated driving control unit 129. The position information receiving unit 124 is not limited to a scheme using GNSS signals and may acquire position information using, for example, beacons.

The external recognition sensor 125 includes various sensors used to recognize the situation outside the vehicle 100, and supplies sensor information from each sensor to the units of the vehicle control system 111. The external recognition sensor 125 includes, for example, a camera 151, a radar 152, a LiDAR 153, and an ultrasonic sensor 154. The external recognition sensor 125 is not limited to this and may include one or more types of sensors among the camera 151, the radar 152, the LiDAR 153, and the ultrasonic sensor 154. The numbers of cameras 151, radars 152, LiDARs 153, and ultrasonic sensors 154 are not particularly limited as long as they can practically be installed in the vehicle 100. The types of sensors included in the external recognition sensor 125 are not limited to this example, and the external recognition sensor 125 may include other types of sensors. Examples of the sensing regions of the sensors included in the external recognition sensor 125 will be described later.

The imaging scheme of the camera 151 is not particularly limited as long as it allows distance measurement. For example, cameras of various imaging schemes, such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, or an infrared camera, can be applied as the camera 151 as needed. The camera 151 is not limited to this and may simply acquire captured images regardless of distance measurement.

Further, for example, the external recognition sensor 125 can include an environment sensor for detecting the environment around the vehicle 100. The environment sensor is a sensor for detecting the environment such as weather, meteorological conditions, and brightness, and can include various sensors such as a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and an illuminance sensor.

Furthermore, for example, the external recognition sensor 125 includes a microphone used for detecting sounds around the vehicle 100, the position of a sound source, and the like.
The in-vehicle sensor 126 includes various sensors for detecting information inside the vehicle, and supplies sensor information from each sensor to the units of the vehicle control system 111. The types and numbers of the various sensors included in the in-vehicle sensor 126 are not particularly limited as long as they can practically be installed in the vehicle 100.

For example, the in-vehicle sensor 126 can include one or more types of sensors among a camera, a radar, a seating sensor, a steering wheel sensor, a microphone, and a biometric sensor. As the camera included in the in-vehicle sensor 126, cameras of various imaging schemes capable of distance measurement, such as a ToF camera, a stereo camera, a monocular camera, or an infrared camera, can be used. The camera included in the in-vehicle sensor 126 is not limited to this and may simply acquire captured images regardless of distance measurement. The biometric sensor included in the in-vehicle sensor 126 is provided on, for example, a seat or the steering wheel, and detects various kinds of biometric information of an occupant such as the driver.

The vehicle sensor 127 includes various sensors for detecting the state of the vehicle 100, and supplies sensor information from each sensor to the units of the vehicle control system 111. The types and numbers of the various sensors included in the vehicle sensor 127 are not particularly limited as long as they can practically be installed in the vehicle 100.

For example, the vehicle sensor 127 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU) integrating them. For example, the vehicle sensor 127 includes a steering angle sensor that detects the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor that detects the amount of operation of the accelerator pedal, and a brake sensor that detects the amount of operation of the brake pedal. For example, the vehicle sensor 127 includes a rotation sensor that detects the rotational speed of the engine or motor, an air pressure sensor that detects tire pressure, a slip rate sensor that detects the tire slip rate, and a wheel sensor that detects the rotational speed of the wheels. For example, the vehicle sensor 127 includes a battery sensor that detects the remaining charge and temperature of the battery, and an impact sensor that detects impacts from the outside.
The recording unit 128 includes at least one of a nonvolatile storage medium and a volatile storage medium, and stores information and programs. The recording unit 128 is used, for example, as an EEPROM (Electrically Erasable Programmable Read Only Memory) and a RAM (Random Access Memory), and a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device can be applied as the storage medium. The recording unit 128 records various programs and information used by the units of the vehicle control system 111. For example, the recording unit 128 includes an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and records information on the vehicle 100 before and after an event such as an accident, as well as biometric information acquired by the in-vehicle sensor 126.

The driving support/automated driving control unit 129 controls the driving support and automated driving of the vehicle 100. For example, the driving support/automated driving control unit 129 includes an analysis unit 161, an action planning unit 162, and an operation control unit 163.

The analysis unit 161 analyzes the situation of the vehicle 100 and its surroundings. The analysis unit 161 includes a self-position estimation unit 171, a sensor fusion unit 172, and a recognition unit 173.
The self-position estimation unit 171 estimates the self-position of the vehicle 100 based on sensor information from the external recognition sensor 125 and the high-precision map stored in the map information storage unit 123. For example, the self-position estimation unit 171 generates a local map based on sensor information from the external recognition sensor 125 and estimates the self-position of the vehicle 100 by matching the local map against the high-precision map. The position of the vehicle 100 is referenced, for example, to the center of the rear wheel axle.

The local map is, for example, a three-dimensional high-precision map created using a technique such as SLAM (Simultaneous Localization and Mapping), or an occupancy grid map. The three-dimensional high-precision map is, for example, the point cloud map described above. The occupancy grid map is a map that divides the three-dimensional or two-dimensional space around the vehicle 100 into grid cells of a predetermined size and indicates the occupancy state of objects in units of cells. The occupancy state of an object is indicated by, for example, the presence or absence of the object or its existence probability. The local map is also used by the recognition unit 173, for example, for detection processing and recognition processing of the situation outside the vehicle 100.
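A minimal occupancy grid sketch, assuming a fixed cell size and a log-odds update; the cell size, update step, and class name are illustrative assumptions, since the text only specifies that occupancy is expressed as presence/absence or an existence probability.

    import numpy as np

    class OccupancyGrid:
        # Cells of a fixed size around the vehicle hold a log-odds occupancy
        # estimate; 0 log-odds corresponds to probability 0.5 (unknown).
        def __init__(self, size_cells=200, cell_m=0.25):
            self.cell_m = cell_m
            self.log_odds = np.zeros((size_cells, size_cells))

        def update(self, x_m, y_m, occupied, step=0.4):
            # Fold one observation at metric position (x_m, y_m) into the grid.
            i = int(x_m / self.cell_m) + self.log_odds.shape[0] // 2
            j = int(y_m / self.cell_m) + self.log_odds.shape[1] // 2
            if 0 <= i < self.log_odds.shape[0] and 0 <= j < self.log_odds.shape[1]:
                self.log_odds[i, j] += step if occupied else -step

        def probability(self, i, j):
            # Convert log-odds back to an existence probability for cell (i, j).
            return 1.0 / (1.0 + np.exp(-self.log_odds[i, j]))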
The self-position estimation unit 171 may also estimate the self-position of the vehicle 100 based on the GNSS signal and sensor information from the vehicle sensor 127.

The sensor fusion unit 172 performs sensor fusion processing that combines a plurality of different types of sensor information (for example, image information supplied from the camera 151 and sensor information supplied from the radar 152) to obtain new information. Methods for combining different types of sensor information include integration, fusion, and association.

The recognition unit 173 executes detection processing for detecting the situation outside the vehicle 100 and recognition processing for recognizing the situation outside the vehicle 100.

For example, the recognition unit 173 performs detection processing and recognition processing of the situation outside the vehicle 100 based on information from the external recognition sensor 125, information from the self-position estimation unit 171, information from the sensor fusion unit 172, and the like.

Specifically, for example, the recognition unit 173 performs detection processing and recognition processing of objects around the vehicle 100. Object detection processing is, for example, processing that detects the presence or absence, size, shape, position, and movement of objects. Object recognition processing is, for example, processing that recognizes attributes such as the type of an object or identifies a specific object. However, detection processing and recognition processing are not always clearly separated and may overlap.
For example, the recognition unit 173 detects objects around the vehicle 100 by performing clustering, which classifies the point cloud based on sensor information from the LiDAR 153, the radar 152, or the like into clusters of points. This detects the presence or absence, size, shape, and position of objects around the vehicle 100.

For example, the recognition unit 173 detects the movement of objects around the vehicle 100 by performing tracking, which follows the movement of the point clusters classified by the clustering. This detects the speed and traveling direction (movement vector) of objects around the vehicle 100, as in the sketch below.
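A naive sketch of clustering one scan into point clusters and deriving a movement vector from successive cluster centroids; the distance threshold and greedy strategy are illustrative assumptions, not the actual algorithm of the recognition unit 173.

    import numpy as np

    def cluster_points(points, radius=0.5):
        # Greedy Euclidean clustering: points closer than `radius` to any point
        # already in a cluster join that cluster. points: (N, 2) array from a scan.
        clusters, unused = [], list(range(len(points)))
        while unused:
            seed = unused.pop()
            cluster, frontier = [seed], [seed]
            while frontier:
                i = frontier.pop()
                near = [j for j in unused if np.linalg.norm(points[i] - points[j]) < radius]
                for j in near:
                    unused.remove(j)
                    cluster.append(j)
                    frontier.append(j)
            clusters.append(points[cluster])
        return clusters

    def track_velocity(prev_centroid, curr_centroid, dt):
        # Movement vector (speed and direction) of one tracked cluster between
        # two scans separated by dt seconds.
        return (curr_centroid - prev_centroid) / dt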
For example, the recognition unit 173 detects or recognizes vehicles, people, bicycles, obstacles, structures, roads, traffic lights, traffic signs, road markings, and the like in the image information supplied from the camera 151. The types of objects around the vehicle 100 may also be recognized by performing recognition processing such as semantic segmentation.

For example, the recognition unit 173 can perform recognition processing of the traffic rules around the vehicle 100 based on the map stored in the map information storage unit 123, the self-position estimation result from the self-position estimation unit 171, and the recognition result of objects around the vehicle 100 from the recognition unit 173. Through this processing, the recognition unit 173 can recognize the position and state of traffic lights, the contents of traffic signs and road markings, the contents of traffic regulations, drivable lanes, and the like.

For example, the recognition unit 173 can perform recognition processing of the environment around the vehicle 100. The surrounding environment to be recognized by the recognition unit 173 includes weather, temperature, humidity, brightness, road surface conditions, and the like.

The action planning unit 162 creates an action plan for the vehicle 100. For example, the action planning unit 162 creates the action plan by performing route planning and path following processing.
Route planning (global path planning) is processing that plans a rough route from the start to the goal. This route planning also includes processing called trajectory planning: trajectory generation (local path planning) that enables the vehicle 100 to proceed safely and smoothly in its immediate vicinity along the planned route, taking the motion characteristics of the vehicle 100 into account. Route planning may be distinguished as long-term path planning, and trajectory generation as short-term path planning or local path planning. A safety-priority path represents a concept similar to trajectory generation, short-term path planning, or local path planning.

Path following is processing that plans operations for traveling safely and accurately, within the planned time, along the route planned by the route planning. The action planning unit 162 can calculate, for example, the target speed and target angular velocity of the vehicle 100 based on the result of this path following processing.
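As one common example of such path following (not necessarily the method used here), a pure-pursuit style controller converts the planned path into a target speed and target angular velocity by steering toward a lookahead waypoint:

    import math

    def pure_pursuit_command(pose, waypoint, target_speed=2.0):
        # pose = (x, y, heading_rad) of the vehicle; waypoint = (x, y) of a
        # lookahead point on the planned path. Returns (target_speed, target
        # angular velocity); names and gains are illustrative assumptions.
        x, y, heading = pose
        dx, dy = waypoint[0] - x, waypoint[1] - y
        lookahead = math.hypot(dx, dy)
        # Heading error toward the waypoint, wrapped to [-pi, pi].
        alpha = math.atan2(dy, dx) - heading
        alpha = math.atan2(math.sin(alpha), math.cos(alpha))
        # Pure-pursuit curvature and the corresponding angular velocity.
        curvature = 2.0 * math.sin(alpha) / max(lookahead, 1e-6)
        return target_speed, target_speed * curvature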
The operation control unit 163 controls the operation of the vehicle 100 in order to realize the action plan created by the action planning unit 162.

For example, the operation control unit 163 controls the steering control unit 181, the brake control unit 182, and the drive control unit 183 included in the vehicle control unit 132 described later, and performs acceleration/deceleration control and direction control so that the vehicle 100 follows the trajectory calculated by the trajectory planning. For example, the operation control unit 163 performs cooperative control aimed at realizing ADAS functions such as collision avoidance or impact mitigation, car following, constant-speed driving, collision warning for the own vehicle, and lane departure warning for the own vehicle. For example, the operation control unit 163 performs cooperative control aimed at automated driving or the like in which the vehicle travels autonomously without the driver's operation.

The DMS 130 performs driver authentication processing, driver state recognition processing, and the like based on sensor information from the in-vehicle sensor 126, input information entered into the HMI 131 described later, and the like. The driver states to be recognized by the DMS 130 in this case include, for example, physical condition, alertness, concentration, fatigue, gaze direction, degree of intoxication, driving operation, and posture.

The DMS 130 may also perform authentication processing of occupants other than the driver and recognition processing of the states of those occupants. Further, for example, the DMS 130 may perform recognition processing of the situation inside the vehicle based on sensor information from the in-vehicle sensor 126. The in-vehicle situations to be recognized include, for example, temperature, humidity, brightness, and odor.
The HMI 131 receives input of various kinds of information and instructions, and presents various kinds of information to the driver and others.

The input of information via the HMI 131 will be described briefly. The HMI 131 includes input devices for a person to input information. The HMI 131 generates input signals based on information and instructions entered via the input devices and supplies them to the units of the vehicle control system 111. The HMI 131 includes, as input devices, operators such as a touch panel, buttons, switches, and levers. The HMI 131 is not limited to this and may further include input devices through which information can be entered by methods other than manual operation, such as voice or gestures. Furthermore, the HMI 131 may use as an input device, for example, a remote control device using infrared rays or radio waves, or an externally connected device such as a mobile device or wearable device compatible with the operation of the vehicle control system 111.

The presentation of information via the HMI 131 will be described briefly. The HMI 131 generates visual information, auditory information, and tactile information for the occupants or the outside of the vehicle. The HMI 131 also performs output control that controls the output, output content, output timing, output method, and the like of each piece of generated information. As visual information, the HMI 131 generates and outputs information indicated by images or light, such as operation screens, state displays of the vehicle 100, warning displays, and monitor images showing the situation around the vehicle 100. As auditory information, the HMI 131 generates and outputs information indicated by sounds, such as voice guidance, warning tones, and warning messages. As tactile information, the HMI 131 generates and outputs information conveyed to the occupant's sense of touch by, for example, force, vibration, or movement.

As an output device with which the HMI 131 outputs visual information, for example, a display device that presents visual information by displaying an image itself, or a projector device that presents visual information by projecting an image, can be applied. Besides a display device having a conventional display, the display device may be a device that displays visual information within the occupant's field of view, such as a head-up display, a transmissive display, or a wearable device with an AR (Augmented Reality) function. The HMI 131 can also use display devices included in a navigation device, instrument panel, CMS (Camera Monitoring System), electronic mirror, lamps, or the like provided in the vehicle 100 as output devices for outputting visual information.

As output devices with which the HMI 131 outputs auditory information, for example, audio speakers, headphones, or earphones can be applied.

As an output device with which the HMI 131 outputs tactile information, for example, a haptics element using haptics technology can be applied. The haptics element is provided in a part that an occupant of the vehicle 100 touches, such as the steering wheel or a seat.
The vehicle control unit 132 controls the units of the vehicle 100. The vehicle control unit 132 includes a steering control unit 181, a brake control unit 182, a drive control unit 183, a body system control unit 184, a light control unit 185, and a horn control unit 186.

The steering control unit 181 detects and controls the state of the steering system of the vehicle 100. The steering system includes, for example, a steering mechanism including a steering wheel, electric power steering, and the like. The steering control unit 181 includes, for example, a control unit such as an ECU that controls the steering system, and an actuator that drives the steering system.

The brake control unit 182 detects and controls the state of the brake system of the vehicle 100. The brake system includes, for example, a brake mechanism including a brake pedal, an ABS (Antilock Brake System), a regenerative brake mechanism, and the like. The brake control unit 182 includes, for example, a control unit such as an ECU that controls the brake system.

The drive control unit 183 detects and controls the state of the drive system of the vehicle 100. The drive system includes, for example, an accelerator pedal, a driving force generation device for generating a driving force, such as an internal combustion engine or a drive motor, and a driving force transmission mechanism for transmitting the driving force to the wheels. The drive control unit 183 includes, for example, a control unit such as an ECU that controls the drive system.

The body system control unit 184 detects and controls the state of the body system of the vehicle 100. The body system includes, for example, a keyless entry system, a smart key system, power window devices, power seats, an air conditioner, airbags, seat belts, and a shift lever. The body system control unit 184 includes, for example, a control unit such as an ECU that controls the body system.

The light control unit 185 detects and controls the states of the various lights of the vehicle 100. The lights to be controlled include, for example, headlights, backlights, fog lights, turn signals, brake lights, projections, and bumper displays. The light control unit 185 includes a control unit such as an ECU that controls the lights.

The horn control unit 186 detects and controls the state of the car horn of the vehicle 100. The horn control unit 186 includes, for example, a control unit such as an ECU that controls the car horn.
FIG. 7 is a diagram showing examples of the sensing regions of the camera 151, the radar 152, the LiDAR 153, the ultrasonic sensor 154, and the like of the external recognition sensor 125 in FIG. 6. FIG. 7 schematically shows the vehicle 100 viewed from above, with the left end corresponding to the front end of the vehicle 100 and the right end corresponding to the rear end.

The sensing region 201F and the sensing region 201B show examples of the sensing regions of the ultrasonic sensors 154. The sensing region 201F covers the area around the front end of the vehicle 100 with a plurality of ultrasonic sensors 154. The sensing region 201B covers the area around the rear end of the vehicle 100 with a plurality of ultrasonic sensors 154.

The sensing results in the sensing regions 201F and 201B are used, for example, for parking support of the vehicle 100.

The sensing regions 202F to 202B show examples of the sensing regions of the radar 152 for short or medium range. The sensing region 202F covers the area in front of the vehicle 100 to a position farther than the sensing region 201F. The sensing region 202B covers the area behind the vehicle 100 to a position farther than the sensing region 201B. The sensing region 202L covers the rear periphery of the left side of the vehicle 100. The sensing region 202R covers the rear periphery of the right side of the vehicle 100.

The sensing result in the sensing region 202F is used, for example, for detecting vehicles, pedestrians, and the like in front of the vehicle 100. The sensing result in the sensing region 202B is used, for example, for a rear collision prevention function of the vehicle 100. The sensing results in the sensing regions 202L and 202R are used, for example, for detecting objects in blind spots to the sides of the vehicle 100.

The sensing regions 203F to 203B show examples of the sensing regions of the camera 151. The sensing region 203F covers the area in front of the vehicle 100 to a position farther than the sensing region 202F. The sensing region 203B covers the area behind the vehicle 100 to a position farther than the sensing region 202B. The sensing region 203L covers the periphery of the left side of the vehicle 100. The sensing region 203R covers the periphery of the right side of the vehicle 100.

The sensing result in the sensing region 203F can be used, for example, for recognition of traffic lights and traffic signs, a lane departure prevention support system, and an automatic headlight control system. The sensing result in the sensing region 203B can be used, for example, for parking support and surround view systems. The sensing results in the sensing regions 203L and 203R can be used, for example, for surround view systems.

The sensing region 204 shows an example of the sensing region of the LiDAR 153. The sensing region 204 covers the area in front of the vehicle 100 to a position farther than the sensing region 203F. On the other hand, the sensing region 204 has a narrower lateral range than the sensing region 203F.

The sensing result in the sensing region 204 is used, for example, for detecting objects such as surrounding vehicles.
The sensing region 205 shows an example of the sensing region of the long-range radar 152. The sensing region 205 covers the area in front of the vehicle 100 to a position farther than the sensing region 204. On the other hand, the sensing region 205 has a narrower lateral range than the sensing region 204.
The sensing result in the sensing region 205 is used, for example, for ACC (Adaptive Cruise Control), emergency braking, and collision avoidance.

The sensing regions of the camera 151, the radar 152, the LiDAR 153, and the ultrasonic sensor 154 included in the external recognition sensor 125 may have various configurations other than those shown in FIG. 7. Specifically, the ultrasonic sensor 154 may also sense the sides of the vehicle 100, and the LiDAR 153 may sense the area behind the vehicle 100. The installation positions of the sensors are not limited to the examples described above. The number of each sensor may be one or more.
Such a vehicle control system 111 generates movement amount information indicating the movement amount of the vehicle 100, calculated based on the wheel rotation amount detected by the vehicle sensor 127, and generates gradient detection information based on the acceleration and angular velocity detected by the vehicle sensor 127. Further, the self-position estimation unit 171 corrects the self-position based on the road surface gradient information received from the server 30 via the communication unit 122 and generates corrected self-position information. Furthermore, the vehicle control system 111 transmits the generated movement amount information, gradient detection information, and corrected self-position information from the communication unit 122 to the server 30.
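As an illustration of deriving gradient detection information from acceleration and angular velocity, a complementary filter can estimate the road pitch; the exact computation is not specified in this section, so the filter below, including its axis convention and blending factor, is an assumption.

    import math

    def pitch_from_imu(ax, ay, az, gyro_pitch_rate, prev_pitch, dt, alpha=0.98):
        # ax, ay, az: accelerometer specific force in the vehicle frame (x forward,
        # z up); gyro_pitch_rate: pitch rate from the gyro in rad/s. The
        # accelerometer gives an absolute but noisy pitch from the gravity
        # direction; the gyro gives a smooth but drifting integral. Blending the
        # two yields a usable road gradient estimate.
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))
        return alpha * (prev_pitch + gyro_pitch_rate * dt) + (1.0 - alpha) * accel_pitch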
Further, the vehicle control system 111 outputs the road surface gradient information received from the server 30 via the communication unit 122 to the self-position estimation unit 171, and the route plan to the action planning unit 162.

The action planning unit 162 performs path following, which plans operations for traveling safely and accurately, within the planned time, along the route planned by the route planning, generates control information for controlling the operation of the vehicle 100, and outputs it to the operation control unit 163.

In this way, by supplying the road surface gradient information and the route plan generated by the information processing device of the present technology to the vehicle 100, the vehicle 100 can be moved to the destination along the optimum route. Further, since the vehicle 100 can accurately calculate its own position from the travel amount of the vehicle and the road surface gradient information, the current position of the vehicle 100 can be detected accurately even when the position information receiving unit 124 cannot receive position information. The technology according to the present disclosure may be realized not only in vehicles such as automobiles, electric vehicles, and hybrid electric vehicles, but also in mobile devices such as construction machinery and agricultural machinery (tractors).
The series of processes described in the specification can be executed by hardware, software, or a combined configuration of both. When processing is executed by software, a program recording the processing sequence is installed in memory in a computer incorporated in dedicated hardware and executed. Alternatively, the program can be installed and executed on a general-purpose computer capable of executing various kinds of processing.

For example, the program can be recorded in advance on a hard disk, an SSD (Solid State Drive), or a ROM (Read Only Memory) as a recording medium. Alternatively, the program can be stored (recorded) temporarily or permanently on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-ray Disc (registered trademark)), a magnetic disk, or a semiconductor memory card. Such removable recording media can be provided as so-called package software.

In addition to being installed on a computer from a removable recording medium, the program may be transferred from a download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet. The computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.

The effects described in this specification are merely examples and are not limiting, and there may be additional effects not described. Further, the present technology should not be construed as being limited to the embodiments of the technology described above. The embodiments of this technology disclose the present technology in the form of examples, and it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present technology. In other words, the claims should be taken into consideration in determining the gist of the present technology.
 In addition, the information processing device of the present technology can also have the following configurations (a minimal code sketch illustrating configurations (2), (3), and (5) to (8) follows the list).
 (1) An information processing device including a map information processing unit that corrects gradient information set in map information on the basis of the map information in which the gradient information is set and traveling information acquired by a moving device.
 (2) The information processing device according to (1), in which the map information processing unit uses gradient detection information generated on the basis of the traveling information acquired by the moving device to correct the gradient information at the position of the map information corresponding to the moving position at which the gradient detection information was generated.
 (3) The information processing device according to (2), in which the map information processing unit performs filter processing using gradient detection information already generated at the same moving position as the gradient detection information generated on the basis of the traveling information acquired by the moving device, and corrects the gradient information at the position of the map information corresponding to the moving position using the filtered gradient information.
 (4) The information processing device according to any one of (1) to (3), in which the map information processing unit generates map information in which gradient information is set on the basis of map information in which gradient information is not set and the traveling information acquired by the moving device.
 (5) The information processing device according to any one of (2) to (4), in which the traveling information includes sensor information acquired by a sensor provided in the moving device, and the gradient detection information is generated on the basis of the sensor information.
 (6) The information processing device according to (5), in which the sensor information is information obtained by detecting at least one of translational motion and rotational motion of the moving device.
 (7) The information processing device according to (5) or (6), in which the sensor is an inertial measurement unit.
 (8) The information processing device according to (7), in which the inertial measurement unit includes at least one of an acceleration sensor and an angular velocity sensor, detects the translational motion with the acceleration sensor, and detects the rotational motion with the angular velocity sensor.
 (9) The information processing device according to any one of (1) to (8), further including a route planning unit that creates a route plan to a destination on the basis of the map information corrected by the map information processing unit.
 (10) The information processing device according to (9), in which the route planning unit creates the route plan using the gradient information set in the map information.
 (11) The information processing device according to any one of (1) to (10), in which the moving device includes a self-position estimation unit that performs self-position estimation using the gradient information set in the map information.
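 Configurations (2), (3), and (5) to (8) describe detecting the road gradient from inertial sensor information and blending repeated detections at the same moving position into the map. The publication does not disclose concrete algorithms for this, so the following Python sketch only illustrates one common realization under stated assumptions: the pitch of the moving device is recovered with a complementary filter that fuses the acceleration sensor (translational motion) and the angular velocity sensor (rotational motion), and the per-position filter processing is a simple running average. Every class, method, and parameter name here is hypothetical and does not appear in the publication.

```python
import math


class GradientDetector:
    """Sketch of a gradient detection unit (configurations (5) to (8)).

    Fuses the acceleration sensor (translational motion) and the angular
    velocity sensor (rotational motion) of an inertial measurement unit
    with a complementary filter to estimate the road pitch.
    """

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha  # trust placed in the gyro-integrated estimate
        self.pitch = 0.0    # estimated pitch of the moving device, radians

    def update(self, accel_xyz, gyro_pitch_rate, dt):
        ax, ay, az = accel_xyz
        # Pitch implied by the direction of gravity; reliable only while
        # the moving device is not accelerating or braking strongly.
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))
        # Complementary filter: integrate the angular rate, then pull the
        # result toward the accelerometer estimate to cancel gyro drift.
        self.pitch = (self.alpha * (self.pitch + gyro_pitch_rate * dt)
                      + (1.0 - self.alpha) * accel_pitch)
        return self.pitch


class GradientMap:
    """Sketch of the filter processing in configurations (2) and (3):
    a new gradient detection is blended with the detections already
    generated at the same moving position (here, the same grid cell).
    """

    def __init__(self):
        self.cells = {}  # (grid_x, grid_y) -> (mean gradient, sample count)

    def correct(self, cell, detected_gradient):
        mean, n = self.cells.get(cell, (0.0, 0))
        # Running-average filter; a Kalman or median filter could be
        # substituted without changing this interface.
        mean = (mean * n + detected_gradient) / (n + 1)
        self.cells[cell] = (mean, n + 1)
        return mean


# Illustrative use: IMU samples at 100 Hz on a roughly 5 % slope.
detector = GradientDetector()
for _ in range(300):  # ~3 s of samples lets the filter converge
    pitch = detector.update((-0.49, 0.0, 9.79), gyro_pitch_rate=0.0, dt=0.01)
grid = GradientMap()
grid.correct(cell=(12, 34), detected_gradient=math.tan(pitch))
```

 The running average is only one possible filter; the point of the interface is that each map cell accumulates detections over repeated traversals, so later, better-conditioned measurements gradually correct gradient values that were set incorrectly at first.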
 10 ... System
 20 ... Moving device
 21 ... Wheel odometry
 22 ... Gradient detection unit
 23 ... Self-position estimation unit
 24 ... Communication unit
 25 ... Route following unit
 26 ... Drive control unit
 27 ... Drive unit
 30 ... Server
 31 ... Communication unit
 32 ... Map information processing unit
 33 ... Route planning unit
 321 ... Information storage unit
 322 ... Road surface gradient information generation unit

Claims (13)

  1.  An information processing device comprising a map information processing unit that corrects gradient information set in map information on the basis of the map information in which the gradient information is set and traveling information acquired by a moving device.
  2.  The information processing device according to claim 1, wherein the map information processing unit uses gradient detection information generated on the basis of the traveling information acquired by the moving device to correct the gradient information at the position of the map information corresponding to the moving position at which the gradient detection information was generated.
  3.  The information processing device according to claim 2, wherein the map information processing unit performs filter processing using gradient detection information already generated at the same moving position as the gradient detection information generated on the basis of the traveling information acquired by the moving device, and corrects the gradient information at the position of the map information corresponding to the moving position using the filtered gradient information.
  4.  The information processing device according to claim 1, wherein the map information processing unit generates map information in which gradient information is set on the basis of map information in which gradient information is not set and the traveling information acquired by the moving device.
  5.  The information processing device according to claim 2, wherein the traveling information includes sensor information acquired by a sensor provided in the moving device, and the gradient detection information is generated on the basis of the sensor information.
  6.  The information processing device according to claim 5, wherein the sensor information is information obtained by detecting at least one of translational motion and rotational motion of the moving device.
  7.  The information processing device according to claim 5, wherein the sensor is an inertial measurement unit.
  8.  The information processing device according to claim 7, wherein the inertial measurement unit includes at least one of an acceleration sensor and an angular velocity sensor, detects the translational motion with the acceleration sensor, and detects the rotational motion with the angular velocity sensor.
  9.  The information processing device according to claim 1, further comprising a route planning unit that creates a route plan to a destination on the basis of the map information corrected by the map information processing unit.
  10.  The information processing device according to claim 9, wherein the route planning unit creates the route plan using the gradient information set in the map information.
  11.  The information processing device according to claim 1, wherein the moving device includes a self-position estimation unit that performs self-position estimation using the gradient information set in the map information.
  12.  An information processing method comprising correcting, with a map information processing unit, gradient information set in map information on the basis of the map information in which the gradient information is set and traveling information acquired by a moving device.
  13.  A program that causes a computer to execute processing using information acquired by a moving device, the program causing the computer to execute a procedure of correcting gradient information set in map information on the basis of the map information in which the gradient information is set and the traveling information acquired by the moving device.
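Claims 9 and 10 recite a route planning unit that creates a route plan using the gradient information set in the map information, but the publication does not specify a planning algorithm. The sketch below is therefore only one plausible realization, assuming a grid map whose cells carry the corrected gradient: Dijkstra's algorithm excludes cells steeper than a traversability limit and penalizes the remaining slopes in the edge cost. The function name, the slope weighting, and the 0.15 gradient limit are assumptions for illustration, not part of the claims.

```python
import heapq


def plan_route(grid_gradients, start, goal, max_gradient=0.15, slope_weight=5.0):
    """Hypothetical gradient-aware route planner (claims 9 and 10).

    grid_gradients maps (x, y) cells to a road gradient (rise over run).
    Cells steeper than max_gradient are treated as untraversable; the
    remaining cells cost more the steeper they are. Dijkstra's algorithm
    returns the cheapest cell sequence from start to goal, or None.
    """
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in visited or nxt not in grid_gradients:
                continue
            g = abs(grid_gradients[nxt])
            if g > max_gradient:
                continue  # too steep for the moving device
            heapq.heappush(frontier,
                           (cost + 1.0 + slope_weight * g, nxt, path + [nxt]))
    return None  # no traversable route


# Illustrative use: prefer the flat detour over the steep direct cell.
grads = {(0, 0): 0.0, (1, 0): 0.12, (2, 0): 0.0,
         (0, 1): 0.0, (1, 1): 0.0, (2, 1): 0.0}
print(plan_route(grads, start=(0, 0), goal=(2, 0)))
```

The hard gradient limit keeps the moving device off slopes it cannot traverse, while the slope_weight term lets gentler gradients trade off against detour length; both thresholds would in practice come from the capabilities of the specific moving device.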
PCT/JP2021/022629 2020-07-28 2021-06-15 Information processing device, information processing method, and program WO2022024569A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-127114 2020-07-28
JP2020127114A JP2022024493A (en) 2020-07-28 2020-07-28 Information processing device, information processing method and program

Publications (1)

Publication Number Publication Date
WO2022024569A1 true

Family

ID=80035427

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/022629 WO2022024569A1 (en) 2020-07-28 2021-06-15 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2022024493A (en)
WO (1) WO2022024569A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007225911A (en) * 2006-02-23 2007-09-06 Hitachi Ltd Road map information collection method, road map information collection system, and road map information processing device
US20130035849A1 (en) * 2011-08-01 2013-02-07 Mitac Research (Shanghai) Ltd. Navigation Apparatus Having Three-Dimensional Gravity Sensor and Navigation Method Thereof
JP2018106017A (en) * 2016-12-27 2018-07-05 株式会社オゼットクリエイティブ Device, program, and method for map information creation
JP2019158852A (en) * 2018-03-16 2019-09-19 三菱電機株式会社 Speed creation device, speed creation program, and speed creation method

Also Published As

Publication number Publication date
JP2022024493A (en) 2022-02-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21850821; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21850821; Country of ref document: EP; Kind code of ref document: A1)