WO2024101104A1 - Dispositif de traitement d'informations, procédé de traitement d'informations et programme - Google Patents

Dispositif de traitement d'informations, procédé de traitement d'informations et programme Download PDF

Info

Publication number
WO2024101104A1
WO2024101104A1 (PCT/JP2023/037819)
Authority
WO
WIPO (PCT)
Prior art keywords
information
unit
landmark
landmarks
map
Application number
PCT/JP2023/037819
Other languages
English (en)
Japanese (ja)
Inventor
佑允 高橋
康平 小島
祐輝 遠藤
遼 高橋
Original Assignee
ソニーグループ株式会社
Application filed by ソニーグループ株式会社
Publication of WO2024101104A1

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 - Control within particular dimensions
    • G05D1/43 - Control of position or course in two dimensions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles

Definitions

  • This technology relates to an information processing device, an information processing method, and a program, and more specifically to an information processing device etc. that can be applied to enable good self-position calculation in a moving body such as an autonomous mobile robot (AMR).
  • Patent Document 1 describes a technology for automatically embedding a recognition target on a map.
  • The purpose of this technology is to enable self-position calculation to be performed satisfactorily.
  • the concept of this technology is as follows: an information storage unit that stores information on landmarks that are used as indicators for self-position calculation; an information generating unit that generates associated information according to a type of landmark based on the landmark information stored in the information storing unit;
  • the information processing device includes a display unit that displays landmarks on a map based on the landmark information stored in the information storage unit, and displays the associated information in association with the landmarks.
  • an information storage unit that stores information about landmarks that serve as indicators for self-position calculation.
  • An information generation unit generates associated information according to the type of landmark based on the landmark information stored in the information storage unit.
  • a display unit then displays the landmarks on a map based on the landmark information stored in the information storage unit, and also displays the associated information in association with the landmarks.
  • this technology displays landmarks and associated information according to the type of landmark on a map, and users can refer to this display to effectively set or change the route of a moving body such as a robot and the placement positions of landmarks, making it easy to ensure that self-position calculations using landmark information are performed satisfactorily.
  • the present technology may further include, for example, a recognition unit that recognizes a specific marker, and a landmark registration unit that converts the information of the specific marker recognized by the recognition unit into information in a map coordinate system and then writes it as landmark information in the information storage unit. This makes it possible to register markers placed in the environment as landmarks on the map.
  • a display unit may be further provided that displays a specific marker on a map based on information in a map coordinate system of the specific marker recognized by the recognition unit
  • the landmark registration unit may be configured to write the map coordinate system information of the specific marker as landmark information in the information storage unit when an instruction to register the specific marker displayed on the map is received.
  • the system may further include a display unit that displays landmarks on the map based on the landmark information written to the information storage unit by the landmark registration unit, and when an instruction to delete a specific landmark displayed on the map is received, the landmark registration unit may delete the information of the specific landmark from the information storage unit. This makes it possible to easily delete landmarks that are not in the appropriate position on the map.
  • the cross-sectional shape of the front surface of the specific marker may be a shape corresponding to a composite waveform formed by superimposing multiple sine waveforms with different frequencies. This makes it possible to simply and easily obtain identifier information for the specific marker by observing multiple peak frequencies in a frequency spectrum obtained by performing a Fourier transform process on a 2D point cloud corresponding to the specific marker obtained by 2D LiDAR, for example.
  • the combination of frequencies of multiple sine waveforms may be a combination of frequencies where the higher frequency is not an integer multiple of the lower frequency for any two frequencies. This makes it easier to observe multiple peak frequencies related to identifier information in the frequency spectrum.
  • the recognition unit may be configured to recognize a specific marker based on a 2D point cloud corresponding to the specific marker obtained by 2D LiDAR, and to obtain identifier information, position information, and attitude information of the specific marker.
  • the recognition unit may be configured to extract the 2D point cloud corresponding to the specific marker, which is made of a retroreflective material, from the 2D point cloud obtained by 2D LiDAR according to the level of reflected light.
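  • As a rough illustration of this decoding idea, the following sketch (in Python, with an assumed marker width and assumed ID frequencies; the document does not specify concrete values) synthesizes a depth profile sampled along the marker front and recovers the peak frequencies from its Fourier transform.

```python
import numpy as np

# Hypothetical example: the marker front is shaped as the sum of two sine
# waves whose frequencies encode the marker ID (values below are assumptions).
MARKER_WIDTH_M = 0.60            # assumed marker width
ID_FREQS = (3.0, 7.0)            # cycles per marker width; 7 is not an integer multiple of 3

def synth_marker_profile(n=256):
    """Depth profile of the marker front (distance offsets along its width)."""
    x = np.linspace(0.0, MARKER_WIDTH_M, n, endpoint=False)
    depth = sum(0.01 * np.sin(2 * np.pi * f * x / MARKER_WIDTH_M) for f in ID_FREQS)
    return x, depth

def decode_peak_frequencies(x, depth, n_peaks=2):
    """Fourier-transform the 1D depth profile sampled from the 2D point cloud
    and return the strongest peak frequencies (in cycles per marker width)."""
    spectrum = np.abs(np.fft.rfft(depth - depth.mean()))
    freqs = np.fft.rfftfreq(len(depth), d=(x[1] - x[0])) * MARKER_WIDTH_M
    peak_idx = np.argsort(spectrum)[-n_peaks:]
    return sorted(freqs[peak_idx])

x, depth = synth_marker_profile()
print(decode_peak_frequencies(x, depth))   # approximately [3.0, 7.0] -> maps to a marker ID
```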
  • AR markers may be used as landmarks.
  • the associated information may be information on an area suitable for calculating the self-position and orientation. By displaying this associated information, the user can easily grasp an area suitable for calculating the self-position, and it becomes easy to set a route so that the self-position can be calculated satisfactorily.
  • retroreflective markers may be used as landmarks.
  • the associated information may be information on an area where three or more landmarks can be recognized.
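  • The following is a minimal sketch of how such an area could be computed for display; the landmark positions, grid resolution, and maximum recognition range are assumptions for illustration, not values from the document.

```python
import numpy as np

# Mark the cells of a 2D map grid from which at least three landmarks fall
# within an assumed maximum recognition range.
LANDMARKS = np.array([[1.0, 1.0], [4.0, 1.5], [2.5, 4.0]])  # map-frame positions (m), illustrative
MAX_RANGE_M = 3.5                                            # assumed recognition range

def three_landmark_area(xmin, xmax, ymin, ymax, resolution=0.1):
    xs = np.arange(xmin, xmax, resolution)
    ys = np.arange(ymin, ymax, resolution)
    gx, gy = np.meshgrid(xs, ys)
    cells = np.stack([gx.ravel(), gy.ravel()], axis=1)
    # distance from every cell to every landmark
    dists = np.linalg.norm(cells[:, None, :] - LANDMARKS[None, :, :], axis=2)
    visible = (dists <= MAX_RANGE_M).sum(axis=1)
    return cells[visible >= 3]          # cells from which three or more landmarks are recognizable

area = three_landmark_area(0.0, 5.0, 0.0, 5.0)
print(len(area), "grid cells can see three or more landmarks")
```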
  • the associated information may be information on the accuracy of the self-location calculated at a position specified by the user.
  • the user can easily understand the accuracy of the self-location calculated at the specified position, and can easily set a route so that the self-location is calculated with high accuracy.
  • the associated information may be information on a position where an additional landmark should be placed to improve the accuracy of the self-location calculated at a position specified by the user.
  • the user can easily understand the position where an additional landmark should be placed to improve the accuracy of the self-location calculated at the specified position, and can easily place an additional landmark so that the self-location is calculated with high accuracy at the specified position.
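  • The document does not specify how this accuracy is computed; as one hedged illustration, a linearized least-squares (dilution-of-precision style) model could estimate the position error at a user-specified point from the visible landmarks, and the same score could be reused to rank candidate positions for an additional landmark. The noise parameters below are assumptions.

```python
import numpy as np

# Illustrative sketch only: approximate the 1-sigma self-position error at a
# user-specified point from range measurements to visible landmarks, assuming
# range noise that grows with distance (sigma0 and k are assumed values).
def position_error_estimate(query, landmarks, sigma0=0.02, k=0.01):
    query = np.asarray(query, dtype=float)
    diffs = landmarks - query
    dists = np.linalg.norm(diffs, axis=1)
    H = diffs / dists[:, None]                 # unit direction to each landmark
    W = np.diag(1.0 / (sigma0 + k * dists) ** 2)
    cov = np.linalg.inv(H.T @ W @ H)           # covariance of the position estimate
    return float(np.sqrt(np.trace(cov)))       # combined 1-sigma error in metres

landmarks = np.array([[0.0, 0.0], [5.0, 0.0], [2.5, 4.0]])
print(position_error_estimate([2.5, 1.5], landmarks))
```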
  • the associated information may be warning information in the case where the landmark placement may cause a malfunction.
  • the user can easily understand that the landmark placement may cause a malfunction, and can take measures such as changing the landmark placement.
  • the associated information may be information on recommended landmark placements in cases where there is a possibility of malfunction due to the placement of the landmarks.
  • the user can easily understand the recommended placement of the landmarks, and can appropriately change the placement of the landmarks.
  • the associated information may be information about areas on the route set on the map where there is a possibility of erroneous distance measurement to landmarks.
  • the user can easily grasp areas on the route where there is a possibility of erroneous distance measurement to landmarks, and can easily change the route to avoid erroneous distance measurement to landmarks.
  • the associated information may be warning information when a location on the route set on the map has a curvature greater than or equal to a set value within an area where three or more landmarks can be recognized.
  • the user can easily understand that a location on the route set on the map has a curvature greater than or equal to a set value within an area where three or more landmarks can be recognized, and can take measures such as changing the route.
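  • A sketch of such a curvature check is shown below; it compares the discrete (Menger) curvature of consecutive route points against a set value, with the threshold and the route points purely illustrative.

```python
import numpy as np

# Flag route points whose local curvature exceeds a set value, so that a
# warning can be attached when such a point lies inside an area where three
# or more landmarks are recognizable (threshold assumed).
def menger_curvature(p0, p1, p2):
    """Curvature (1/m) of the circle through three consecutive route points."""
    a = np.linalg.norm(p1 - p0)
    b = np.linalg.norm(p2 - p1)
    c = np.linalg.norm(p2 - p0)
    v1, v2 = p1 - p0, p2 - p0
    area2 = abs(v1[0] * v2[1] - v1[1] * v2[0])   # twice the triangle area
    return 0.0 if area2 == 0.0 else 2.0 * area2 / (a * b * c)

def high_curvature_points(route, threshold=1.0):
    route = np.asarray(route, dtype=float)
    return [i for i in range(1, len(route) - 1)
            if menger_curvature(route[i - 1], route[i], route[i + 1]) >= threshold]

route = [[0, 0], [1, 0], [2, 0.1], [2.5, 1.0], [2.6, 2.0]]
print(high_curvature_points(route, threshold=0.8))
```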
  • Another concept of the present technology is an information processing method including a step of generating associated information according to the type of landmark based on the landmark information stored in an information storage unit that stores information on landmarks serving as indicators for self-position calculation, and a step of displaying the landmarks on a map based on the landmark information stored in the information storage unit and displaying the associated information in association with the landmarks.
  • A further concept of the present technology is a program that causes a computer to execute an information processing method having a procedure of generating the associated information according to the type of landmark based on the landmark information stored in the information storage unit, and a procedure of displaying the landmarks on a map based on the landmark information stored in the information storage unit and displaying the associated information in association with the landmarks.
  • FIG. 1 is a block diagram showing an example of the configuration of a vehicle control system that is an example of a mobility device control system to which the present technology is applied.
  • FIG. 2 is a diagram showing an example of sensing areas of an external recognition sensor.
  • FIG. 3 is a diagram illustrating an example of a relationship between a robot and a marker.
  • FIG. 4 is a block diagram showing an example of the configuration of a system for implementing a first embodiment.
  • FIG. 5 is a diagram showing an example of a display on a display unit of a UI unit of a robot operation application.
  • FIG. 6 is a flowchart illustrating an example of a processing procedure of a coordinate conversion unit included in the robot system.
  • FIG. 7 is a flowchart showing an example of a processing procedure of an application unit and a UI unit of the robot operation application.
  • FIG. 8 is a flowchart illustrating an example of a processing procedure of a landmark registration unit of the robot system.
  • FIG. 9 is a diagram for explaining the waveform of the cross-sectional shape of the front surface of a three-dimensional marker.
  • FIG. 10 is a diagram for explaining the structure of a three-dimensional marker and how the front of the three-dimensional marker is scanned horizontally with a 2D LiDAR to obtain a 2D point cloud.
  • FIG. 11 is a diagram for explaining how a frequency spectrum in which multiple peak frequencies appear is obtained by performing a Fourier transform process on the 2D point cloud.
  • FIG. 12 is a diagram for explaining how rotation angle information of the three-dimensional marker is obtained from the compression factor of a peak frequency appearing in the frequency spectrum.
  • FIG. 13 is a diagram for explaining how rotation direction information of the three-dimensional marker is obtained from the positive or negative slope of an approximate line calculated from the 2D point cloud.
  • FIG. 14 is a flowchart showing an example of a processing procedure of an ID decoding unit constituting the recognition unit.
  • FIG. 15 is a flowchart illustrating an example of a processing procedure of a marker position and orientation information acquisition unit constituting the recognition unit.
  • FIG. 16 is a block diagram showing an example of the configuration of a system for implementing a second embodiment.
  • FIG. 17 is a diagram for explaining an automatic landmark registration operation and a landmark deletion process by a user operation.
  • FIG. 18 is a block diagram showing an example of the configuration of a system for implementing a third embodiment.
  • FIG. 19 is a diagram for explaining a relational expression between distance and recognition accuracy and a relational expression between angle and recognition accuracy.
  • FIG. 20 is a block diagram showing an example of the configuration of a system for implementing a fourth embodiment.
  • FIG. 21 is a diagram showing an example of an area in which three or more landmarks displayed on a map can be recognized.
  • FIG. 22 is a block diagram showing an example of the configuration of a system for implementing a fifth embodiment.
  • FIG. 23 is a diagram showing an example of the accuracy (distance measurement error) of the self-position calculated at a position specified by the user, displayed on a map.
  • FIG. 24 is a diagram showing another example of the accuracy (distance measurement error) of the self-position calculated at a position specified by the user, displayed on a map.
  • FIG. 25 is a diagram showing yet another example of the accuracy (distance measurement error) of the self-position calculated at a position specified by the user, displayed on a map.
  • FIG. 26 is a block diagram showing an example of the configuration of a system for implementing a sixth embodiment.
  • FIG. 27 is a diagram showing an example of landmarks (retroreflective markers) and warnings and recommended positions displayed on a map.
  • FIG. 28 is a diagram showing another example of landmarks (retroreflective markers) and warnings and recommended positions displayed on a map.
  • FIG. 29 is a diagram for explaining the range of erroneous distance measurement when the distance measurement sensor is a LiDAR or a depth camera.
  • FIG. 30 is a block diagram showing an example of the configuration of a system for implementing a seventh embodiment.
  • FIG. 31 is a diagram showing an example of a travel route set by a user operation and an area in which erroneous distance measurement of a landmark may occur, displayed on a map.
  • FIG. 32 is a block diagram showing an example of the configuration of a system for implementing an eighth embodiment.
  • FIG. 33 is a diagram showing an example of a travel route set by a user operation and an area in which three or more landmarks can be recognized, displayed on a map.
  • FIG. 34 is a block diagram illustrating an example of a hardware configuration of a computer.
  • FIG. 1 is a block diagram showing an example of the configuration of a vehicle control system 11, which is an example of a mobility device control system to which the present technology is applied.
  • the vehicle control system 11 is installed in the vehicle 1 and performs processing related to driving assistance and autonomous driving of the vehicle 1.
  • the vehicle control system 11 includes a vehicle control ECU (Electronic Control Unit) 21, a communication unit 22, a map information storage unit 23, a location information acquisition unit 24, an external recognition sensor 25, an in-vehicle sensor 26, a vehicle sensor 27, a memory unit 28, a driving assistance/automated driving control unit 29, a DMS (Driver Monitoring System) 30, an HMI (Human Machine Interface) 31, and a vehicle control unit 32.
  • the vehicle control ECU 21, communication unit 22, map information storage unit 23, position information acquisition unit 24, external recognition sensor 25, in-vehicle sensor 26, vehicle sensor 27, memory unit 28, driving assistance/automatic driving control unit 29, driver monitoring system (DMS) 30, human machine interface (HMI) 31, and vehicle control unit 32 are connected to each other so as to be able to communicate with each other via a communication network 41.
  • the communication network 41 is composed of an in-vehicle communication network or bus that complies with a digital two-way communication standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), or Ethernet (registered trademark).
  • the communication network 41 may be used differently depending on the type of data being transmitted.
  • CAN may be applied to data related to vehicle control
  • Ethernet may be applied to large-volume data.
  • each part of the vehicle control system 11 may be directly connected without going through the communication network 41, using wireless communication intended for communication over relatively short distances, such as near field communication (NFC) or Bluetooth (registered trademark).
  • the vehicle control ECU 21 is composed of various processors, such as a CPU (Central Processing Unit) and an MPU (Micro Processing Unit).
  • the vehicle control ECU 21 controls all or part of the functions of the vehicle control system 11.
  • the communication unit 22 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, etc., and transmits and receives various types of data. At this time, the communication unit 22 can communicate using multiple communication methods.
  • the communication unit 22 communicates with servers (hereinafter referred to as external servers) on an external network via base stations or access points using wireless communication methods such as 5G (fifth generation mobile communication system), LTE (Long Term Evolution), and DSRC (Dedicated Short Range Communications).
  • the external network with which the communication unit 22 communicates is, for example, the Internet, a cloud network, or an operator-specific network.
  • the communication method that the communication unit 22 uses with the external network is not particularly limited as long as it is a wireless communication method that allows digital two-way communication at a communication speed equal to or higher than a predetermined speed and over a distance equal to or longer than a predetermined distance.
  • the communication unit 22 can communicate with a terminal present in the vicinity of the vehicle using P2P (Peer To Peer) technology.
  • the terminal present in the vicinity of the vehicle can be, for example, a terminal attached to a mobile object moving at a relatively slow speed, such as a pedestrian or a bicycle, a terminal installed at a fixed position in a store, or an MTC (Machine Type Communication) terminal.
  • the communication unit 22 can also perform V2X communication.
  • V2X communication refers to communication between the vehicle and others, such as vehicle-to-vehicle communication with other vehicles, vehicle-to-infrastructure communication with roadside devices, vehicle-to-home communication with a home, and vehicle-to-pedestrian communication with a terminal carried by a pedestrian, etc.
  • the communication unit 22 can, for example, receive from the outside a program for updating the software that controls the operation of the vehicle control system 11 (Over the Air).
  • the communication unit 22 can further receive map information, traffic information, information about the surroundings of the vehicle 1, etc. from the outside.
  • the communication unit 22 can transmit information about the vehicle 1 and information about the surroundings of the vehicle 1 to the outside.
  • Information about the vehicle 1 that the communication unit 22 transmits to the outside includes, for example, data indicating the state of the vehicle 1, the recognition results by the recognition unit 73, etc.
  • the communication unit 22 performs communication corresponding to a vehicle emergency notification system such as e-Call.
  • the communication unit 22 receives electromagnetic waves transmitted by a road traffic information and communication system (VICS (Vehicle Information and Communication System) (registered trademark)) such as a radio beacon, optical beacon, or FM multiplex broadcasting.
  • the communication unit 22 can communicate with each device in the vehicle using, for example, wireless communication.
  • the communication unit 22 can perform wireless communication with each device in the vehicle using a communication method that allows digital two-way communication at a communication speed equal to or higher than a predetermined speed via wireless communication, such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB).
  • the communication unit 22 can also communicate with each device in the vehicle using wired communication.
  • the communication unit 22 can communicate with each device in the vehicle using wired communication via a cable connected to a connection terminal (not shown).
  • the communication unit 22 can communicate with each device in the vehicle using a communication method that allows digital two-way communication at a communication speed equal to or higher than a predetermined speed via wired communication, such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface) (registered trademark), or MHL (Mobile High-definition Link).
  • devices in the vehicle refers to devices that are not connected to the communication network 41 in the vehicle.
  • Examples of devices in the vehicle include mobile devices and wearable devices carried by passengers such as the driver, and information devices that are brought into the vehicle and temporarily installed.
  • the map information storage unit 23 stores one or both of a map acquired from an external source and a map created by the vehicle 1.
  • the map information storage unit 23 stores a three-dimensional high-precision map, a global map that is less accurate than a high-precision map and covers a wide area, etc.
  • High-precision maps include, for example, dynamic maps, point cloud maps, and vector maps.
  • a dynamic map is, for example, a map consisting of four layers of dynamic information, semi-dynamic information, semi-static information, and static information, and is provided to the vehicle 1 from an external server or the like.
  • a point cloud map is a map composed of a point cloud (point group data).
  • a vector map is, for example, a map that associates traffic information such as the positions of lanes and traffic lights with a point cloud map, and is adapted for ADAS (Advanced Driver Assistance System) and AD (Autonomous Driving).
  • the point cloud map and vector map may be provided, for example, from an external server, or may be created by the vehicle 1 based on sensing results from the camera 51, radar 52, LiDAR 53, etc. as a map for matching with a local map described below, and stored in the map information storage unit 23.
  • map data of, for example, an area of several hundred meters square regarding the planned route along which the vehicle 1 will travel is acquired from the external server, etc., in order to reduce communication capacity.
  • the location information acquisition unit 24 receives GNSS signals from Global Navigation Satellite System (GNSS) satellites and acquires location information of the vehicle 1.
  • the acquired location information is supplied to the driving assistance/automated driving control unit 29.
  • the location information acquisition unit 24 is not limited to a method using GNSS signals, and may acquire location information using a beacon, for example.
  • the external recognition sensor 25 includes various sensors used to recognize the situation outside the vehicle 1, and supplies sensor data from each sensor to each part of the vehicle control system 11.
  • the type and number of sensors included in the external recognition sensor 25 are arbitrary.
  • the external recognition sensor 25 includes a camera 51, a radar 52, a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 53, and an ultrasonic sensor 54.
  • the external recognition sensor 25 may be configured to include one or more types of sensors among the camera 51, the radar 52, the LiDAR 53, and the ultrasonic sensor 54.
  • the number of cameras 51, radars 52, LiDAR 53, and ultrasonic sensors 54 is not particularly limited as long as it is a number that can be realistically installed on the vehicle 1.
  • the types of sensors included in the external recognition sensor 25 are not limited to this example, and the external recognition sensor 25 may include other types of sensors. Examples of the sensing areas of each sensor included in the external recognition sensor 25 will be described later.
  • the imaging method of camera 51 is not particularly limited.
  • cameras of various imaging methods such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, and an infrared camera, which are imaging methods capable of distance measurement, can be applied to camera 51 as necessary.
  • the present invention is not limited to this, and camera 51 may simply be used to obtain a photographed image, without being related to distance measurement.
  • the external recognition sensor 25 can be equipped with an environmental sensor for detecting the environment relative to the vehicle 1.
  • the environmental sensor is a sensor for detecting the environment such as the weather, climate, brightness, etc., and can include various sensors such as a raindrop sensor, a fog sensor, a sunlight sensor, a snow sensor, an illuminance sensor, etc.
  • the external recognition sensor 25 includes a microphone that is used to detect sounds around the vehicle 1 and the location of sound sources.
  • the in-vehicle sensor 26 includes various sensors for detecting information inside the vehicle, and supplies sensor data from each sensor to each part of the vehicle control system 11. There are no particular limitations on the types and number of the various sensors included in the in-vehicle sensor 26, so long as they are of the types and number that can be realistically installed in the vehicle 1.
  • the in-vehicle sensor 26 may be equipped with one or more types of sensors including a camera, radar, a seating sensor, a steering wheel sensor, a microphone, and a biometric sensor.
  • the camera equipped in the in-vehicle sensor 26 may be a camera using various imaging methods capable of measuring distances, such as a ToF camera, a stereo camera, a monocular camera, or an infrared camera. Without being limited to this, the camera equipped in the in-vehicle sensor 26 may be a camera simply for acquiring captured images, regardless of distance measurement.
  • the biometric sensor equipped in the in-vehicle sensor 26 is provided, for example, on a seat, steering wheel, etc., and detects various types of biometric information of passengers such as the driver.
  • the vehicle sensor 27 includes various sensors for detecting the state of the vehicle 1, and supplies sensor data from each sensor to each part of the vehicle control system 11. There are no particular limitations on the types and number of the various sensors included in the vehicle sensor 27, so long as they are of the types and number that can be realistically installed on the vehicle 1.
  • the vehicle sensor 27 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU) that integrates these.
  • the vehicle sensor 27 includes a steering angle sensor that detects the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor that detects the amount of accelerator pedal operation, and a brake sensor that detects the amount of brake pedal operation.
  • the vehicle sensor 27 includes a rotation sensor that detects the number of rotations of the engine or motor, an air pressure sensor that detects the air pressure of the tires, a slip ratio sensor that detects the slip ratio of the tires, and a wheel speed sensor that detects the rotation speed of the wheels.
  • the vehicle sensor 27 includes a battery sensor that detects the remaining charge and temperature of the battery, and an impact sensor that detects external impacts.
  • the memory unit 28 includes at least one of a non-volatile storage medium and a volatile storage medium, and stores data and programs.
  • the memory unit 28 uses, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory) and a RAM (Random Access Memory), and as a storage medium, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device can be applied.
  • the memory unit 28 stores various programs and data used by each part of the vehicle control system 11.
  • the memory unit 28 includes an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and stores information about the vehicle 1 before and after an event such as an accident, and information acquired by the in-vehicle sensor 26.
  • the driving assistance/automated driving control unit 29 controls driving assistance and automatic driving of the vehicle 1.
  • the driving assistance/automated driving control unit 29 includes an analysis unit 61, an action planning unit 62, and an operation control unit 63.
  • the analysis unit 61 performs analysis processing of the vehicle 1 and the surrounding conditions.
  • the analysis unit 61 includes a self-position estimation unit 71, a sensor fusion unit 72, and a recognition unit 73.
  • the self-position estimation unit 71 estimates the self-position of the vehicle 1 based on the sensor data from the external recognition sensor 25 and the high-precision map stored in the map information storage unit 23. For example, the self-position estimation unit 71 generates a local map based on the sensor data from the external recognition sensor 25, and estimates the self-position of the vehicle 1 by matching the local map with the high-precision map.
  • the position of the vehicle 1 is based on, for example, the center of the rear axle.
  • the local map is, for example, a three-dimensional high-precision map or an occupancy grid map created using technology such as SLAM (Simultaneous Localization and Mapping).
  • the three-dimensional high-precision map is, for example, the point cloud map described above.
  • the occupancy grid map is a map in which the three-dimensional or two-dimensional space around the vehicle 1 is divided into grids of a predetermined size, and the occupancy state of objects is shown on a grid-by-grid basis.
  • the occupancy state of objects is indicated, for example, by the presence or absence of an object and the probability of its existence.
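  • As a minimal illustration of this occupancy grid idea (grid size, resolution, and log-odds increments are assumed values), each cell can hold a log-odds occupancy value that is updated per observation:

```python
import numpy as np

# Minimal sketch of an occupancy grid: space around the vehicle divided into
# fixed-size cells, each holding the probability that an object occupies it.
class OccupancyGrid:
    def __init__(self, size_m=40.0, resolution=0.2):
        n = int(size_m / resolution)
        self.resolution = resolution
        self.origin = size_m / 2.0                 # vehicle at the grid centre
        self.log_odds = np.zeros((n, n))

    def _cell(self, x, y):
        return (int((y + self.origin) / self.resolution),
                int((x + self.origin) / self.resolution))

    def update(self, x, y, occupied, l_occ=0.85, l_free=-0.4):
        """Fuse one observation of the point (x, y) in the vehicle frame."""
        r, c = self._cell(x, y)
        self.log_odds[r, c] += l_occ if occupied else l_free

    def probability(self, x, y):
        r, c = self._cell(x, y)
        return 1.0 / (1.0 + np.exp(-self.log_odds[r, c]))

grid = OccupancyGrid()
grid.update(3.0, 1.0, occupied=True)
print(round(grid.probability(3.0, 1.0), 2))
```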
  • the local map is also used, for example, in detection processing and recognition processing of the situation outside the vehicle 1 by the recognition unit 73.
  • the self-position estimation unit 71 may estimate the self-position of the vehicle 1 based on the position information acquired by the position information acquisition unit 24 and the sensor data from the vehicle sensor 27.
  • the sensor fusion unit 72 performs sensor fusion processing to combine multiple different types of sensor data (e.g., image data supplied from the camera 51 and sensor data supplied from the radar 52) to obtain new information.
  • Methods for combining different types of sensor data include integration, fusion, and association.
  • the recognition unit 73 executes a detection process to detect the situation outside the vehicle 1, and a recognition process to recognize the situation outside the vehicle 1.
  • the recognition unit 73 performs detection and recognition processing of the situation outside the vehicle 1 based on information from the external recognition sensor 25, information from the self-position estimation unit 71, information from the sensor fusion unit 72, etc.
  • the recognition unit 73 performs detection processing and recognition processing of objects around the vehicle 1.
  • Object detection processing is, for example, processing to detect the presence or absence, size, shape, position, movement, etc. of an object.
  • Object recognition processing is, for example, processing to recognize attributes such as the type of object, and to identify a specific object.
  • detection processing and recognition processing are not necessarily clearly separated, and there may be overlap.
  • the recognition unit 73 detects objects around the vehicle 1 by performing clustering to classify a point cloud based on sensor data from the radar 52, the LiDAR 53, or the like into clusters of points. This allows the presence or absence, size, shape, and position of objects around the vehicle 1 to be detected.
  • the recognition unit 73 detects the movement of objects around the vehicle 1 by performing tracking to follow the movement of the clusters of point clouds classified by clustering. This allows the speed and direction of travel (movement vector) of the objects around the vehicle 1 to be detected.
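  • A simple Euclidean clustering sketch of the kind described above is shown below; the gap threshold and the 2D points are illustrative, and this is not necessarily the clustering method used by the recognition unit 73.

```python
import numpy as np

# Group 2D points from the radar or LiDAR into clusters of nearby points,
# then report each cluster's centroid and extent as a stand-in for the
# detected object's position and size.
def euclidean_clusters(points, max_gap=0.5):
    points = np.asarray(points, dtype=float)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in list(unvisited)
                    if np.linalg.norm(points[i] - points[j]) <= max_gap]
            for j in near:
                unvisited.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(points[cluster])
    return clusters

points = [[0.0, 0.0], [0.2, 0.1], [0.3, -0.1], [5.0, 5.0], [5.2, 5.1]]
for c in euclidean_clusters(points):
    print("centroid", c.mean(axis=0), "extent", np.ptp(c, axis=0))
```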
  • the recognition unit 73 detects or recognizes vehicles, people, bicycles, obstacles, structures, roads, traffic lights, traffic signs, road markings, etc. based on image data supplied from the camera 51.
  • the recognition unit 73 may also recognize the types of objects around the vehicle 1 by performing recognition processing such as semantic segmentation.
  • the recognition unit 73 can perform recognition processing of traffic rules around the vehicle 1 based on the map stored in the map information storage unit 23, the result of self-location estimation by the self-location estimation unit 71, and the result of recognition of objects around the vehicle 1 by the recognition unit 73. Through this processing, the recognition unit 73 can recognize the positions and states of traffic signals, the contents of traffic signs and road markings, the contents of traffic regulations, and lanes on which travel is possible, etc.
  • the recognition unit 73 can perform recognition processing of the environment around the vehicle 1.
  • the surrounding environment that the recognition unit 73 recognizes may include weather, temperature, humidity, brightness, and road surface conditions.
  • the behavior planning unit 62 creates a behavior plan for the vehicle 1. For example, the behavior planning unit 62 creates the behavior plan by performing route planning and route following processing.
  • Route planning (global path planning) is a process of planning a rough route from the start to the goal. This route planning also includes trajectory planning (local path planning), which generates a trajectory along the planned route that allows the vehicle 1 to proceed safely and smoothly in its vicinity, taking the motion characteristics of the vehicle 1 into account.
  • Path following is a process of planning operations for safely and accurately traveling along a route planned by a route plan within a planned time.
  • the action planning unit 62 can, for example, calculate the target speed and target angular velocity of the vehicle 1 based on the results of this path following process.
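  • One common way to derive a target speed and target angular velocity from a planned path is pure pursuit; the document does not state which method is used, so the following is only an illustrative sketch with an assumed look-ahead point and target speed.

```python
import math

# Pure pursuit sketch: aim at a look-ahead point on the path and command the
# curvature that reaches it, giving a target speed and angular velocity.
def pure_pursuit_command(pose, lookahead_point, v_target=1.0):
    """pose = (x, y, yaw) in the map frame; returns (v, omega)."""
    x, y, yaw = pose
    lx, ly = lookahead_point
    dx, dy = lx - x, ly - y
    # look-ahead point expressed in the vehicle frame
    local_x = math.cos(yaw) * dx + math.sin(yaw) * dy
    local_y = -math.sin(yaw) * dx + math.cos(yaw) * dy
    l2 = local_x ** 2 + local_y ** 2
    curvature = 0.0 if l2 == 0.0 else 2.0 * local_y / l2
    return v_target, v_target * curvature        # omega = v * curvature

print(pure_pursuit_command((0.0, 0.0, 0.0), (2.0, 1.0)))
```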
  • the operation control unit 63 controls the operation of the vehicle 1 to realize the action plan created by the action planning unit 62.
  • the operation control unit 63 controls the steering control unit 81, the brake control unit 82, and the drive control unit 83 included in the vehicle control unit 32 described below, and performs acceleration/deceleration control and directional control so that the vehicle 1 proceeds along the trajectory calculated by the trajectory plan.
  • the operation control unit 63 performs cooperative control aimed at realizing ADAS functions such as collision avoidance or impact mitigation, following driving, maintaining vehicle speed, collision warning for the vehicle itself, and lane departure warning for the vehicle itself.
  • the operation control unit 63 performs cooperative control aimed at automatic driving, which drives autonomously without the driver's operation.
  • the DMS 30 performs processes such as authenticating the driver and recognizing the driver's state based on the sensor data from the in-vehicle sensors 26 and the input data input to the HMI 31 (described later).
  • Examples of the driver's state to be recognized include physical condition, alertness, concentration, fatigue, line of sight, level of intoxication, driving operation, posture, etc.
  • the DMS 30 may also perform authentication processing for passengers other than the driver and recognition processing for the status of the passengers.
  • the DMS 30 may also perform recognition processing for the situation inside the vehicle based on sensor data from the in-vehicle sensor 26. Examples of the situation inside the vehicle that may be recognized include temperature, humidity, brightness, odor, etc.
  • the HMI 31 inputs various data, instructions, and the like, and presents various data to the driver and other occupants.
  • the HMI 31 is equipped with an input device that allows a person to input data.
  • the HMI 31 generates input signals based on data and instructions input via the input device, and supplies the signals to each part of the vehicle control system 11.
  • the HMI 31 is equipped with input devices such as a touch panel, buttons, switches, and levers. Without being limited to these, the HMI 31 may further be equipped with an input device that allows information to be input by a method other than manual operation, such as voice or gestures.
  • the HMI 31 may use, as an input device, an externally connected device such as a remote control device that uses infrared or radio waves, or a mobile device or wearable device that supports the operation of the vehicle control system 11.
  • the HMI 31 generates visual information, auditory information, and tactile information for the occupants or the outside of the vehicle.
  • the HMI 31 also performs output control to control the output, output content, output timing, output method, etc. of each piece of generated information.
  • the HMI 31 generates and outputs, as visual information, information indicated by images or light, such as an operation screen, a status display of the vehicle 1, a warning display, and a monitor image showing the situation around the vehicle 1.
  • the HMI 31 also generates and outputs, as auditory information, information indicated by sounds, such as voice guidance, warning sounds, and warning messages.
  • the HMI 31 also generates and outputs, as tactile information, information that is imparted to the occupants' sense of touch by, for example, force, vibration, movement, etc.
  • the output device from which the HMI 31 outputs visual information may be, for example, a display device that presents visual information by displaying an image itself, or a projector device that presents visual information by projecting an image.
  • the display device may be a device that displays visual information within the field of vision of the passenger, such as a head-up display, a transmissive display, or a wearable device with an AR (Augmented Reality) function, in addition to a display device having a normal display.
  • the HMI 31 may also use display devices such as a navigation device, instrument panel, CMS (Camera Monitoring System), electronic mirror, lamp, etc., provided in the vehicle 1 as output devices that output visual information.
  • the output device through which the HMI 31 outputs auditory information can be, for example, an audio speaker, headphones, or earphones.
  • Haptic elements using haptic technology can be used as an output device for the HMI 31 to output tactile information.
  • the haptic elements are provided on parts of the vehicle 1 that are in contact with passengers, such as the steering wheel and the seat.
  • the vehicle control unit 32 controls each part of the vehicle 1.
  • the vehicle control unit 32 includes a steering control unit 81, a brake control unit 82, a drive control unit 83, a body control unit 84, a light control unit 85, and a horn control unit 86.
  • the steering control unit 81 detects and controls the state of the steering system of the vehicle 1.
  • the steering system includes, for example, a steering mechanism including a steering wheel, an electric power steering, etc.
  • the steering control unit 81 includes, for example, a steering ECU that controls the steering system, an actuator that drives the steering system, etc.
  • the brake control unit 82 detects and controls the state of the brake system of the vehicle 1.
  • the brake system includes, for example, a brake mechanism including a brake pedal, an ABS (Antilock Brake System), a regenerative brake mechanism, etc.
  • the brake control unit 82 includes, for example, a brake ECU that controls the brake system, and an actuator that drives the brake system.
  • the drive control unit 83 detects and controls the state of the drive system of the vehicle 1.
  • the drive system includes, for example, an accelerator pedal, a drive force generating device for generating drive force such as an internal combustion engine or a drive motor, and a drive force transmission mechanism for transmitting the drive force to the wheels.
  • the drive control unit 83 includes, for example, a drive ECU for controlling the drive system, and an actuator for driving the drive system.
  • the body system control unit 84 detects and controls the state of the body system of the vehicle 1.
  • the body system includes, for example, a keyless entry system, a smart key system, a power window device, a power seat, an air conditioning system, an airbag, a seat belt, a shift lever, etc.
  • the body system control unit 84 includes, for example, a body system ECU that controls the body system, an actuator that drives the body system, etc.
  • the light control unit 85 detects and controls the state of various lights of the vehicle 1. Examples of lights to be controlled include headlights, backlights, fog lights, turn signals, brake lights, projection, and bumper displays.
  • the light control unit 85 includes a light ECU that controls the lights, an actuator that drives the lights, and the like.
  • the horn control unit 86 detects and controls the state of the car horn of the vehicle 1.
  • the horn control unit 86 includes, for example, a horn ECU that controls the car horn, an actuator that drives the car horn, etc.
  • FIG. 2 is a diagram showing an example of a sensing area by the camera 51, radar 52, LiDAR 53, ultrasonic sensor 54, etc. of the external recognition sensor 25 in FIG. 1. Note that FIG. 2 shows a schematic view of the vehicle 1 as seen from above, with the left end side being the front end of the vehicle 1 and the right end side being the rear end of the vehicle 1.
  • Sensing area 101F and sensing area 101B show examples of sensing areas of ultrasonic sensors 54. Sensing area 101F covers the periphery of the front end of vehicle 1 with multiple ultrasonic sensors 54. Sensing area 101B covers the periphery of the rear end of vehicle 1 with multiple ultrasonic sensors 54.
  • sensing results in sensing area 101F and sensing area 101B are used, for example, for parking assistance for vehicle 1.
  • Sensing area 102F to sensing area 102B show examples of sensing areas of a short-range or medium-range radar 52. Sensing area 102F covers a position farther in front of the vehicle 1 than sensing area 101F. Sensing area 102B covers a position farther in the rear of the vehicle 1 than sensing area 101B. Sensing area 102L covers the rear periphery of the left side of the vehicle 1. Sensing area 102R covers the rear periphery of the right side of the vehicle 1.
  • the sensing results in sensing area 102F are used, for example, to detect vehicles, pedestrians, etc., that are in front of vehicle 1.
  • the sensing results in sensing area 102B are used, for example, for collision prevention functions behind vehicle 1.
  • the sensing results in sensing area 102L and sensing area 102R are used, for example, to detect objects in blind spots to the sides of vehicle 1.
  • Sensing area 103F to sensing area 103B show examples of sensing areas by camera 51. Sensing area 103F covers a position farther in front of vehicle 1 than sensing area 102F. Sensing area 103B covers a position farther in the rear of vehicle 1 than sensing area 102B. Sensing area 103L covers the periphery of the left side of vehicle 1. Sensing area 103R covers the periphery of the right side of vehicle 1.
  • the sensing results in sensing area 103F can be used, for example, for recognizing traffic signals and traffic signs, lane departure prevention support systems, and automatic headlight control systems.
  • the sensing results in sensing area 103B can be used, for example, for parking assistance and surround view systems.
  • the sensing results in sensing area 103L and sensing area 103R can be used, for example, for surround view systems.
  • Sensing area 104 shows an example of the sensing area of LiDAR 53. Sensing area 104 covers a position farther in front of vehicle 1 than sensing area 103F. On the other hand, sensing area 104 has a narrower range in the left-right direction than sensing area 103F.
  • the sensing results in the sensing area 104 are used, for example, to detect objects such as surrounding vehicles.
  • a sensing area 105 shows an example of a sensing area of a long-range radar 52 .
  • the sensing area 105 covers a position farther in front of the vehicle 1 than the sensing area 104.
  • the sensing area 105 has a narrower range in the left-right direction than the sensing area 104.
  • the sensing results in the sensing area 105 are used, for example, for ACC (Adaptive Cruise Control), emergency braking, collision avoidance, etc.
  • the sensing areas of the cameras 51, radar 52, LiDAR 53, and ultrasonic sensors 54 included in the external recognition sensor 25 may have various configurations other than those shown in FIG. 2. Specifically, the ultrasonic sensor 54 may also sense the sides of the vehicle 1, and the LiDAR 53 may sense the rear of the vehicle 1.
  • the installation positions of the sensors are not limited to the examples described above. The number of sensors may be one or more.
  • Example 1 is an example in which a moving body, in this case an autonomous traveling transport robot (hereinafter referred to as “robot” as appropriate), recognizes a marker (recognition target) placed in the environment, converts the position and attitude information of this marker from the body coordinate system to a map coordinate system, and registers it as position and attitude information of a landmark that serves as an index for calculating the self-position.
  • Figure 3 shows a schematic example of the relationship between a robot and a marker.
  • the robot determines its own position in a map coordinate system (SLAM coordinate system), converts the marker information visible in the robot's body coordinates to the map coordinate system, and registers it as landmark information.
  • the marker can be of any type, such as an AR marker or a retroreflective marker.
  • FIG. 4 shows an example of the configuration of a system 10A for implementing the first embodiment.
  • This system 10A has a robot system 11A and a robot operation application 12A.
  • the robot system 11A has a sensor group 111, a self-position estimation unit 112, a recognition unit 113, a coordinate conversion unit 114, an information storage unit 115, and a landmark registration unit 116.
  • the sensor group 111 is mounted on the machine body, i.e., the robot, and includes sensors necessary for estimating the robot's own position and sensors for recognizing markers placed in the environment.
  • the sensor group 111 includes, for example, a Global Navigation Satellite System (GNSS) sensor, a gyro sensor, a geomagnetic sensor, an angular velocity sensor, an Inertial Measurement Unit (IMU), a Light Detection And Ranging (LiDAR), a Time Of Flight (ToF) sensor, an image sensor, and a wheel odometry sensor.
  • the self-position estimation unit 112 estimates its own position and attitude using the necessary sensor outputs from the sensor group 111, and obtains information on the position and attitude of the robot in the map coordinate system.
  • the attitude of the robot means the orientation of the robot.
  • the recognition unit 113 recognizes markers placed in the environment using the necessary sensor outputs from the sensor group 111, and obtains information on the position and attitude of the marker in the robot coordinate system.
  • the attitude of the marker means the direction of the marker.
  • the coordinate conversion unit 114 converts the information on the position and attitude of the marker in the body coordinate system obtained by the recognition unit 113 into information on the position and attitude of the marker in the map coordinate system based on the information on the position and attitude of the machine body in the map coordinate system obtained by the self-position estimation unit 112.
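  • In two dimensions this conversion is a pose composition; the following minimal sketch (function and variable names illustrative) composes the robot's map-frame pose with the marker's body-frame pose to obtain the marker's map-frame pose.

```python
import math

# Compose the robot's pose in the map frame with the marker's pose observed
# in the robot's body frame to obtain the marker's pose in the map frame.
def body_to_map(robot_pose_map, marker_pose_body):
    """Each pose is (x, y, yaw); yaw in radians."""
    rx, ry, ryaw = robot_pose_map
    mx, my, myaw = marker_pose_body
    map_x = rx + math.cos(ryaw) * mx - math.sin(ryaw) * my
    map_y = ry + math.sin(ryaw) * mx + math.cos(ryaw) * my
    map_yaw = math.atan2(math.sin(ryaw + myaw), math.cos(ryaw + myaw))  # normalized
    return map_x, map_y, map_yaw

# Robot at (2, 1) facing 90 degrees; marker seen 1 m straight ahead of the robot.
print(body_to_map((2.0, 1.0, math.pi / 2), (1.0, 0.0, 0.0)))  # -> (2.0, 2.0, 1.57...)
```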
  • the information storage unit 115 holds information on the position and attitude of the machine body in the map coordinate system obtained by the self-position estimation unit 112, and also holds information on the position and attitude of the marker in the map coordinate system obtained by the coordinate conversion unit 114. In this case, every time the self-position estimation unit 112 obtains information on the position and attitude of the machine body in the map coordinate system, the corresponding information held in the information storage unit 115 is updated. Note that if the recognition unit 113 also obtains ID information (identifier information) of the marker, the information storage unit 115 holds the ID information together with the position and attitude information in the map coordinate system as information on the marker. In the following explanation, the description of the ID information will be omitted as appropriate.
  • information on the position and orientation in the map coordinate system of a marker recognized by the recognition unit 113 is stored in the information storage unit 115. However, if the marker is no longer recognized by the recognition unit 113 due to the movement of the machine (robot), the information on the position and orientation of the marker in the map coordinate system is deleted from the information storage unit 115.
  • the landmark registration unit 116 registers a landmark based on a landmark registration instruction from the robot operation application 12A.
  • the landmark registration instruction includes information on the position and orientation in the map coordinate system of the marker to be registered as a landmark
  • the landmark registration unit 116 writes the information on the position and orientation in the map coordinate system of the marker to be registered in the information storage unit 115 as information on the position and orientation of the landmark, and stores it.
  • the robot operation application 12A has an application unit 121 and a UI unit 122.
  • the UI unit 122 has a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 121 generates display information for displaying information stored in the information storage unit 115 of the robot system 11A and user operation information, and sends it to the UI unit 122.
  • the application unit 121 also receives user operation information from the UI unit 122 and performs processing corresponding to it.
  • the application unit 121 sends a landmark registration instruction to the landmark registration unit 116 of the robot system 11A, causing information about the position and orientation of the marker in the map coordinate system to be written to and stored in the information storage unit 115 as information about the position and orientation of the landmark.
  • the information storage unit 115 also stores map information, and the map displayed on the display unit of the UI unit 122 is based on this map information.
  • This map information may be provided to the robot system 11A in advance from an external source, or may be created within the robot system 11A.
  • Figure 5(a) shows an example of a display on the display unit of the UI unit 122 of the robot operation app 12A when a specific marker placed in the environment is recognized by the recognition unit 113 of the robot system 11A.
  • the machine (robot) and the marker (marker 1) are displayed as icons at the corresponding position on the map.
  • an operation screen for registering the marker on the map is displayed as a pop-up screen, as shown in Figure 5 (b).
  • "Type" indicates the type of marker, which in this case indicates that it is a two-dimensional barcode.
  • the recognition unit 113 may be configured to obtain type information of this marker in addition to information on the position and attitude of the marker in the body coordinate system.
  • the "Register map” button is a button that allows the user to perform map registration operations. When the user operates the “Register map” button, map registration operation information is sent from the UI unit 122 to the application unit 121.
  • Figure 5 (c) shows an example of the display on the display unit of the UI unit 122 of the robot operation application 12A after information on the position and orientation of the marker (marker 1) in the map coordinate system has been written to the information storage unit 115 as information on the position and orientation of the landmark (Landmark 1) based on the user's map registration operation, i.e., after the landmark (Landmark 1) has been registered on the map.
  • an icon indicating landmark 1 (Landmark 1) is displayed superimposed on the display position of the icon indicating marker 1 (marker 1) on the map.
  • the flowchart in FIG. 6 shows an example of the processing procedure of the coordinate conversion unit 114.
  • the coordinate conversion unit 114 starts processing in step ST1.
  • In step ST2, the coordinate conversion unit 114 acquires information on the position and attitude of the marker in the body coordinate system from the recognition unit 113.
  • In step ST3, the coordinate conversion unit 114 acquires, from the self-position estimation unit 112, information on the position and attitude of the machine body in the map coordinate system at the same point in time as the marker information acquired in step ST2.
  • In step ST4, the coordinate conversion unit 114 converts the information on the position and attitude of the marker from the body coordinate system to the map coordinate system based on the information on the position and attitude of the machine body in the map coordinate system.
  • In step ST5, the coordinate conversion unit 114 writes the information on the position and attitude of the marker in the map coordinate system to the information storage unit 115.
  • After the processing in step ST5, the coordinate conversion unit 114 returns to step ST2 and repeats the same processing as described above.
  • the information on the position and orientation of the marker in the map coordinate system written in the information storage unit 115 is automatically deleted after a certain period of time has passed. Therefore, for a specific marker placed in the environment, the information on the position and orientation of the specific marker in the map coordinate system will continue to be stored in the information storage unit 115 as long as the specific marker is recognized by the recognition unit 113, but once it is no longer recognized by the recognition unit 113, the information on the position and orientation of the specific marker in the map coordinate system will no longer be stored in the information storage unit 115.
  • the information on the position and orientation in the map coordinate system of the marker that is no longer recognized by the recognition unit 113 is deleted from the information storage unit 115, so that the markers displayed on the map in the UI unit 122 of the robot operation application 12A are only those that correspond to the markers currently recognized by the recognition unit 113.
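  • The conversion performed in step ST4 composes the pose of the aircraft (the machine itself) in the map coordinate system with the pose of the marker in the aircraft coordinate system. A minimal 2D (x, y, yaw) sketch of that composition is given below; the function and variable names are illustrative, not taken from the specification.

```python
import numpy as np

def to_map_frame(robot_pose_map, marker_pose_robot):
    """Compose 2D poses: marker pose in the map coordinate system from the
    robot (aircraft) pose in the map coordinate system and the marker pose
    in the aircraft coordinate system. Poses are (x, y, yaw), yaw in radians."""
    rx, ry, ryaw = robot_pose_map
    mx, my, myaw = marker_pose_robot
    c, s = np.cos(ryaw), np.sin(ryaw)
    # Rotate the marker position into the map frame, then translate by the robot position.
    x = rx + c * mx - s * my
    y = ry + s * mx + c * my
    # Yaw angles add; wrap the result to [-pi, pi).
    yaw = (ryaw + myaw + np.pi) % (2 * np.pi) - np.pi
    return x, y, yaw

# Example: robot at (2, 1) facing +90 deg, marker seen 1 m straight ahead.
print(to_map_frame((2.0, 1.0, np.pi / 2), (1.0, 0.0, 0.0)))  # ~(2.0, 2.0, 1.571)
```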
  • The flowchart in FIG. 7 shows an example of the processing procedure of the application unit 121/UI unit 122.
  • The application unit 121/UI unit 122 starts processing in step ST11.
  • In step ST12, the application unit 121/UI unit 122 acquires information on the position and orientation of the aircraft, markers, and landmarks in the map coordinate system from the information storage unit 115.
  • In step ST13, the application unit 121/UI unit 122 displays the aircraft, markers, and landmarks on the map based on the information on the position and orientation of the aircraft, markers, and landmarks in the map coordinate system.
  • In step ST14, the application unit 121/UI unit 122 determines whether the user has performed a map registration operation. If a map registration operation has been performed, then in step ST15 the application unit 121/UI unit 122 sends a landmark registration instruction to the landmark registration unit 116.
  • After the processing of step ST15, the application unit 121/UI unit 122 returns to the processing of step ST12 and repeats the same processing as described above. Note that if no map registration operation is performed in step ST14, the application unit 121/UI unit 122 immediately returns to the processing of step ST12.
  • the flowchart in FIG. 8 shows an example of the processing procedure of the landmark registration unit 116.
  • the landmark registration unit 116 starts processing in step ST21.
  • In step ST22, the landmark registration unit 116 determines whether or not a landmark registration instruction has been received from the application unit 121.
  • If a landmark registration instruction has been received, then in step ST23 the landmark registration unit 116 writes the information about the position and orientation of the marker in the map coordinate system related to the landmark registration instruction into the information storage unit 115 as information about the position and orientation of the landmark. After the processing of step ST23, the landmark registration unit 116 returns to the processing of step ST22 and repeats the same processing as described above.
  • markers arranged in the environment can be registered on the map as landmarks. Furthermore, in the system 10A of the first embodiment, when a registration instruction is given for a specific marker displayed on the map, the landmark registration unit 116 writes the position and orientation information of the specific marker in the map coordinate system as the position and orientation information of the landmark in the information storage unit 115, and the specific marker displayed on the map can be registered as a landmark on the map only if the specific marker is present in an appropriate position. In other words, when a specific marker displayed on the map is present in an unexpected position, the specific marker can be prevented from being registered on the map as a landmark, and a landmark with appropriate position and orientation information can be registered. In this case, if the ID information of the specific marker is also recognized by the recognition unit 113 as described above, the information of the registered landmark includes the ID information in addition to the position and orientation information.
  • the system 10A in Example 1 is an example in which the robot operation app 12A exists outside the robot system 11A.
  • the robot system 11A is operated remotely by the robot operation app 12A.
  • However, a configuration in which the robot operation app 12A exists inside the robot system 11A is also conceivable.
  • In that case, the user performs various operations on a UI section provided in the robot itself.
  • The possibility that the robot operation app may exist inside the robot system in this way also applies to each of the following examples.
  • the marker placed in the environment may be of any type, such as an AR marker or a retroreflective marker.
  • Next, a three-dimensional marker from which ID information (identifier information) can be obtained numerically is described.
  • the cross-sectional shape of the front surface of this three-dimensional marker corresponds to a superimposed waveform formed by superimposing multiple sine waveforms with different frequencies.
  • FIG. 9(a) shows sine waveform 1 with amplitude a of 1 and frequency f of 1.
  • Figure 9(b) shows sine waveform 2 with amplitude a of 1 and frequency f of 2.
  • Figure 9(c) shows sine waveform 3 with amplitude a of 1 and frequency f of 3.
  • Figure 9(d) shows a composite waveform formed by superimposing the above-mentioned sine waveforms 1, 2, and 3.
  • FIG. 10(a) shows a three-dimensional marker 20 whose front cross-sectional shape corresponds to the composite waveform shown in FIG. 9(d).
  • the three-dimensional marker 20 only needs to have a shape that corresponds to at least one period of the composite waveform.
  • a 2D point cloud can be obtained as planar distance information by scanning the 3D marker 20 horizontally with a 2D LiDAR.
  • This 2D point cloud corresponds to the cross-sectional shape of the front surface of the 3D marker 20, and therefore corresponds to the sampled values obtained by sampling the composite waveform shown in FIG. 9(d).
  • the 2D LiDAR is assumed to have a resolution that is sufficient to reproduce the cross-sectional shape of the front surface of the three-dimensional marker 20, that is, to obtain a 2D point cloud at intervals that satisfy the sampling theorem.
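  • As a numerical illustration of such a front-surface profile (a minimal sketch, not the patent's own procedure), the three unit-amplitude sine waveforms of Figure 9 can be superimposed and scaled as follows; the 512-sample resolution and the 5 cm peak depth are assumed values.

```python
import numpy as np

# One period of the composite profile of Figure 9(d): three sine waveforms
# with amplitude 1 and frequencies 1, 2 and 3 (the frequency ratio encodes "123").
x = np.linspace(0.0, 1.0, 512, endpoint=False)    # normalised position along the front
profile = sum(np.sin(2 * np.pi * f * x) for f in (1, 2, 3))

# A physical marker would scale the profile so that its depth stays small while the
# minimum amplitude d still clearly exceeds the 2D LiDAR ranging error; the 5 cm
# peak depth used here is an assumed value, not one from the specification.
front_depth = 0.05 * profile / np.max(np.abs(profile))
```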
  • If the amplitude is increased, the depth of the three-dimensional marker 20 will increase and it will take up a large installation space, which is not desirable in terms of saving space.
  • the minimum amplitude d (see Figure 10(a)) of the waveform shape of the cross section of the front surface of the three-dimensional marker 20 needs to correspond to the ranging accuracy of the 2D LiDAR. For example, if the 2D LiDAR has a ranging error of about 1 cm, an amplitude d of at least about 5 cm is considered sufficient to obtain a 2D point cloud with adequate accuracy.
  • the solid line L1 indicates the center line of the waveform
  • the dashed line L2 indicates the position of the minimum amplitude.
  • Figure 11(a) shows a 2D point cloud (same as the 2D point cloud shown in Figure 10(b)) obtained by scanning the three-dimensional marker 20 horizontally with a 2D LiDAR.
  • Figure 11(b) shows the frequency spectrum obtained by performing Fourier transform processing, for example FFT (Fast Fourier Transform) processing, on the 2D point cloud.
  • the frequency ratio of the three peak frequencies f1, f2, and f3 that appear in this frequency spectrum matches the frequency ratio of the sine waveforms 1, 2, and 3 mentioned above (see Figures 9(a) to (c)).
  • the frequency ratio of the three peak frequencies f1, f2, and f3 is 1:2:3, and therefore it is possible to obtain the three-digit numerical ID information "123" from this frequency ratio.
  • the waveform shape of the cross section of the front surface of the three-dimensional marker 20 corresponds to a composite waveform obtained by superimposing three sine waveforms with different frequencies, but it is also possible to make the waveform shape of the cross section of the front surface of the three-dimensional marker 20 correspond to a composite waveform obtained by superimposing two or four or more sine waveforms with different frequencies, in which case it is possible to obtain numerical ID information of two or more digits corresponding to the frequency ratio of the multiple superimposed sine waveforms.
  • In this way, in the present method, the cross-sectional shape of the front surface of the three-dimensional marker is shaped to correspond to a superimposed waveform formed by superimposing multiple sine waveforms with different frequencies, and numerical ID information is obtained from the frequency ratio of multiple peak frequencies contained in a frequency spectrum obtained by performing a Fourier transform on a 2D point cloud obtained by scanning the three-dimensional marker horizontally with a 2D LiDAR.
  • the computational load can be reduced compared to a method in which the cross-sectional shape of the front surface of the three-dimensional marker is shaped to a specific shape corresponding to ID information, and ID information is obtained by performing a shape matching process on a 2D point cloud obtained by scanning the three-dimensional marker horizontally with a 2D LiDAR.
  • the 2D point cloud obtained by actually scanning the three-dimensional marker horizontally with a 2D LiDAR contains ranging errors, but these errors appear as high-frequency noise in the frequency spectrum obtained by performing a Fourier transform on the 2D point cloud, so the peak frequencies used to obtain ID information can be identified well without being affected by the ranging errors of the 2D point cloud.
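  • A hedged sketch of this decoding idea is shown below; a synthetic profile stands in for a real 2D LiDAR scan, and the peak-picking and digit-rounding details are illustrative rather than the exact procedure described here.

```python
import numpy as np

def decode_id(depths, num_digits=3):
    """Recover a digit string from the ratio of the strongest spectral peaks.

    The normalisation to the lowest peak and the rounding to integers are
    assumptions made for this sketch, not steps spelled out in the text."""
    spectrum = np.abs(np.fft.rfft(depths - np.mean(depths)))
    spectrum[0] = 0.0                               # ignore the DC component here
    peak_bins = np.sort(np.argsort(spectrum)[-num_digits:])
    # Rotation of the marker compresses all peaks equally, so the ratio is preserved.
    ratios = peak_bins / peak_bins[0]
    return "".join(str(int(round(r))) for r in ratios)

# Synthetic stand-in for a 2D LiDAR scan of the marker of Figure 10(a).
x = np.linspace(0.0, 1.0, 512, endpoint=False)
depths = sum(np.sin(2 * np.pi * f * x) for f in (1, 2, 3))
print(decode_id(depths))  # "123"
```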
  • Figure 12(a) shows a frequency spectrum (the same as the frequency spectrum shown in Figure 11(b)) obtained by performing Fourier transform processing, for example FFT processing, on the 2D point cloud obtained by horizontally scanning the three-dimensional marker 20 (see Figure 10(a)) with a 2D LiDAR.
  • the distance D from the aircraft (2D LiDAR) to the three-dimensional marker 20 is the distance from the aircraft (2D LiDAR) to the average line of the waveform on the front of the three-dimensional marker 20.
  • This average line represents a wave with an infinite period, and when a Fourier transform is performed on this average line, a frequency peak is obtained only at the zero frequency. From these facts, the distance D from the aircraft (2D LiDAR) to the three-dimensional marker 20 can be obtained by extracting the DC component from the frequency spectrum, and performing an inverse Fourier transform process on this DC component to reconstruct it.
  • the distance D from the aircraft (2D LiDAR) to the three-dimensional marker 20 can also be obtained by taking the average of the distances of all points in the 2D point cloud.
  • the position of the three-dimensional marker 20 in the coordinate system of the aircraft (2D LiDAR) can be obtained based on the distance D obtained as described above and the directional information of the three-dimensional marker 20 relative to the aircraft (2D LiDAR).
  • the directional information of the three-dimensional marker 20 relative to the aircraft (2D LiDAR) can be obtained based on the directional information of the laser light irradiated from the 2D LiDAR to the three-dimensional marker 20.
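  • A minimal sketch of this position calculation in the aircraft (2D LiDAR) coordinate system, using the simple range-averaging route mentioned above and assuming the extracted point cloud is available as per-beam ranges and bearing angles (names are illustrative):

```python
import numpy as np

def marker_position_in_robot_frame(ranges, bearings):
    """Estimate the marker position in the aircraft (2D LiDAR) coordinate system
    from the 2D point cloud covering the three-dimensional marker.

    ranges   : measured distances (m) of the points returned from the marker
    bearings : corresponding laser beam angles (rad) in the sensor frame
    """
    ranges = np.asarray(ranges, dtype=float)
    bearings = np.asarray(bearings, dtype=float)
    # Averaging the ranges approximates the distance D to the average line of the
    # front-surface waveform (equivalent to keeping only the DC component).
    distance = float(ranges.mean())
    # The central beam direction gives the bearing of the marker.
    bearing = float(bearings.mean())
    return distance * np.cos(bearing), distance * np.sin(bearing)

# Example: a marker roughly 2 m ahead, slightly to the left of the sensor axis.
print(marker_position_in_robot_frame([2.02, 1.98, 2.01], [0.09, 0.10, 0.11]))
```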
  • the attitude (orientation) of the three-dimensional marker 20 in the coordinate system of the aircraft (2D LiDAR) can be calculated as follows.
  • Figure 13(a) shows the state in which the front of the three-dimensional marker 20 faces in a direction rotated 45° counterclockwise with respect to the direction of the aircraft (2D LiDAR).
  • Figure 13(b) shows the frequency spectrum obtained by performing a Fourier transform on the 2D point cloud obtained by the 2D LiDAR in this state.
  • Figure 13(c) shows the frequency spectrum obtained by performing a Fourier transform on the 2D point cloud obtained by the 2D LiDAR in a state in which the front of the three-dimensional marker 20 is facing the aircraft (2D LiDAR).
  • the three peak frequencies f1_r, f2_r, and f3_r are compressed to 1/√2 times, that is, cos 45° times, the three peak frequencies f1, f2, and f3 (see FIG. 13(c)) that appear in the frequency spectrum when the front of the three-dimensional marker 20 is facing the direction of the aircraft (2D LiDAR).
  • this relationship is also the same when the front of the three-dimensional marker 20 faces in a direction rotated 45° clockwise from the direction of the aircraft (2D LiDAR).
  • the peak frequency that appears in the frequency spectrum when the front of the three-dimensional marker 20 is facing in a direction rotated counterclockwise or clockwise by θ relative to the direction of the aircraft (2D LiDAR) is cos θ times the peak frequency that appears in the frequency spectrum when the front of the three-dimensional marker 20 is facing in the direction of the aircraft (2D LiDAR).
  • the three peak frequencies f1_r, f2_r, and f3_r when the front of the three-dimensional marker 20 is facing in a direction rotated relative to the direction of the aircraft (2D LiDAR) are compressed relative to the three peak frequencies f1, f2, and f3 when the front of the three-dimensional marker 20 is facing in the direction of the aircraft (2D LiDAR), but the frequency ratio of the three peak frequencies does not change. Therefore, even if the front of the three-dimensional marker 20 is facing in a direction rotated relative to the direction of the aircraft (2D LiDAR), it is possible to numerically obtain ID information (identifier information) in the same manner as described above.
  • whether the front of the three-dimensional marker 20 is rotating counterclockwise or clockwise relative to the direction of the aircraft (2D LiDAR) can be determined by calculating an approximate straight line, for example by the least squares method, from the 2D point cloud obtained by scanning the three-dimensional marker 20 horizontally with the 2D LiDAR, and determining whether the inclination is positive or negative, thereby obtaining information on the rotation direction of the three-dimensional marker 20.
  • Figure 14(a) shows a state in which the front of the three-dimensional marker 20 is facing in a direction rotated 45° (+45°) counterclockwise with respect to the direction of the aircraft (2D LiDAR).
  • the approximation line calculated from the 2D point cloud slopes upward to the right, and its slope is recognized as positive.
  • Figure 14(b) shows a state in which the front of the three-dimensional marker 20 is facing in a direction rotated 45° (-45°) clockwise with respect to the direction of the aircraft (2D LiDAR).
  • the approximation line calculated from the 2D point cloud slopes upward to the left, and its slope is recognized as negative.
  • the attitude (orientation) of the three-dimensional marker 20 in the coordinate system of the aircraft (2D LiDAR) can be obtained based on the rotation angle information and rotation direction information of the three-dimensional marker 20 obtained as described above, and the directional information of the three-dimensional marker 20 relative to the aircraft (2D LiDAR).
  • the directional information of the three-dimensional marker 20 relative to the aircraft (2D LiDAR) can be obtained based on the directional information of the laser light irradiated from the 2D LiDAR to the three-dimensional marker 20, as described above.
  • In this embodiment, the rotation angle information of the three-dimensional marker 20 is obtained by calculating the magnification M of the peak frequency that appears in the frequency spectrum relative to the peak frequency that appears in the frequency spectrum when the front of the three-dimensional marker 20 is facing the aircraft (2D LiDAR), and finding arccos M.
  • If the rotation angle information of the three-dimensional marker 20 were obtained based on the approximate straight line calculated to obtain the rotation direction information of the three-dimensional marker 20 as described above, the error would be large and it would be difficult to obtain a stable angle. Therefore, it is desirable to obtain the rotation angle information of the three-dimensional marker 20 by calculating the magnification M of the peak frequency that appears in the frequency spectrum relative to the peak frequency that appears in the frequency spectrum when the front of the three-dimensional marker 20 is facing the aircraft (2D LiDAR), and finding arccos M.
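  • Putting the rotation angle and the rotation direction together, a hedged sketch of the attitude estimate might look like the following; the frontal peak frequency is assumed to be known from the marker design, and the function and variable names are illustrative.

```python
import numpy as np

def marker_yaw_in_robot_frame(points_xy, peak_freq_observed, peak_freq_frontal):
    """Estimate how far the marker front is rotated away from the sensor direction.

    points_xy          : Nx2 marker points in the aircraft (2D LiDAR) coordinate system
    peak_freq_observed : a peak frequency measured in the current spectrum
    peak_freq_frontal  : the corresponding peak frequency for a front-facing marker
                         (assumed known from the marker design)"""
    # Frequency compression gives the magnitude of the rotation: M = cos(theta).
    magnification = np.clip(peak_freq_observed / peak_freq_frontal, -1.0, 1.0)
    angle = float(np.arccos(magnification))
    # The sign comes from the least-squares line through the points:
    # positive slope -> rotated counterclockwise, negative slope -> clockwise.
    slope = np.polyfit(points_xy[:, 0], points_xy[:, 1], 1)[0]
    return angle if slope >= 0 else -angle

# Example: peaks compressed to about cos 45 deg of their frontal values, points sloping up-right.
pts = np.array([[1.9, -0.10], [2.0, 0.00], [2.1, 0.10]])
print(np.degrees(marker_yaw_in_robot_frame(pts, 2.12, 3.0)))  # ~ +45
```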
  • Figure 15 shows an example configuration of the recognition unit 113 (see Figure 4) when obtaining ID information (identifier information) and information on the position and attitude of the marker in the aircraft coordinate system based on a 2D point cloud obtained by horizontally scanning a three-dimensional marker (see three-dimensional marker 20 shown in Figure 10(a)) from which ID information (identifier information) can be obtained numerically using a 2D LiDAR.
  • the recognition unit 113 has a point cloud extraction filter unit 131, an FFT (Fast Fourier Transform) unit 132, an ID (identification) decoding unit 133, a marker position and orientation information acquisition unit 134, and a marker information integration unit 135.
  • the point cloud extraction filter unit 131 extracts a 2D point cloud obtained by scanning a three-dimensional marker from the 2D point cloud obtained by 2D LiDAR.
  • Each point that constitutes the 2D point cloud obtained by 2D LiDAR contains not only ranging information (information on direction and distance) but also intensity information of the returned laser light, that is, reflection intensity information.
  • the three-dimensional marker is made of a retroreflective material, and the point cloud extraction filter unit 131 extracts each point associated with a reflection intensity exceeding a threshold as a 2D point cloud obtained by scanning the three-dimensional marker.
  • retroreflective materials are materials that have the property of retroreflection, and examples of such materials include glass bead types and prism types. Retroreflective properties are optical properties that cause light from a light source to be reflected directly back towards the light source, regardless of the direction from which it is incident.
  • each point constituting the 2D point cloud obtained by 2D LiDAR contains not only ranging information (information on direction and distance) but also reflection intensity information, and an example has been shown in which the point cloud extraction filter unit 131 uses reflection intensity information to extract (filter) a 2D point cloud obtained by scanning a three-dimensional marker from the 2D point cloud obtained by 2D LiDAR.
  • If the 2D LiDAR can obtain, in addition to reflection intensity information, other information such as color information that can identify that a 2D point cloud comes from the three-dimensional marker, the point cloud extraction filter unit 131 can also use that information for filtering.
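  • A minimal sketch of the intensity-based filtering described above, assuming the scan is available as arrays of point coordinates and per-point reflection intensities (names and threshold value are illustrative):

```python
import numpy as np

def extract_marker_points(points_xy, intensities, threshold):
    """Keep only returns whose reflection intensity exceeds the threshold.

    Because the three-dimensional marker is retroreflective, its returns are far
    stronger than those from ordinary surfaces, so a simple threshold isolates
    the marker's 2D point cloud from the rest of the scan."""
    points_xy = np.asarray(points_xy, dtype=float)
    intensities = np.asarray(intensities, dtype=float)
    return points_xy[intensities > threshold]

# Example: two strong retroreflective returns among a weaker background return.
pts = [(1.0, 0.1), (1.0, 0.2), (4.0, 2.0)]
print(extract_marker_points(pts, [220.0, 210.0, 35.0], threshold=100.0))
```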
  • the FFT unit 132 performs FFT processing as a Fourier transform on the 2D point cloud obtained by scanning the three-dimensional marker extracted by the point cloud extraction filter unit 131 to obtain a frequency spectrum.
  • the ID decode unit 133 recognizes a predetermined number of peak frequencies corresponding to the number of digits of the ID present in the frequency spectrum obtained by the FFT unit 132, finds the ratio relationship of the predetermined number of peak frequencies, and obtains numerical ID information (identifier information).
  • the marker position and orientation information acquisition unit 134 obtains the position and orientation information of the three-dimensional marker in the aircraft coordinate system based on the frequency spectrum obtained by the FFT unit 132 and the 2D point cloud obtained by scanning the three-dimensional marker extracted by the point cloud extraction filter unit 131.
  • the position of the three-dimensional marker in the coordinate system of the aircraft (2D LiDAR) is found based on the distance D from the aircraft (2D LiDAR) to the three-dimensional marker and the directional information of the three-dimensional marker relative to the aircraft (2D LiDAR).
  • the distance D from the aircraft (2D LiDAR) to the three-dimensional marker can be obtained by extracting the DC component from the frequency spectrum and reconstructing it by performing an inverse Fourier transform process on that DC component.
  • the directional information of the three-dimensional marker relative to the aircraft (2D LiDAR) can be obtained based on the directional information of the laser light irradiated from the 2D LiDAR to the three-dimensional marker.
  • the attitude (orientation) of the 3D marker in the coordinate system of the aircraft (2D LiDAR) is found based on the rotation angle information and rotation direction information of the 3D marker, and the direction information of the 3D marker relative to the aircraft (2D LiDAR).
  • the rotation angle information of the 3D marker can be obtained by calculating the magnification M of the peak frequency that appears in the frequency spectrum to the peak frequency that appears in the frequency spectrum when the front of the 3D marker is facing the aircraft (2D LiDAR), and finding the Arccos M.
  • The rotation direction information of the three-dimensional marker can be obtained by calculating an approximate straight line, for example by the least squares method, from the 2D point cloud obtained by scanning the three-dimensional marker horizontally with the 2D LiDAR, and recognizing whether its slope is positive or negative.
  • the marker information integration unit 135 associates the ID information (identifier information) obtained by the ID decoding unit 133 with the position and orientation information of the three-dimensional marker obtained by the marker position and orientation information acquisition unit 134, and generates the output information of the recognition unit 113.
  • the flowchart in FIG. 16 shows an example of the processing procedure of the ID decoding unit 133.
  • the ID decoding unit 133 starts processing in step ST31.
  • In step ST32, the ID decoding unit 133 receives a frequency spectrum corresponding to the three-dimensional marker waveform from the FFT unit 132.
  • In step ST33, the ID decoding unit 133 recognizes a predetermined number of peak frequencies corresponding to the number of ID digits present in the frequency spectrum.
  • In step ST34, the ID decoding unit 133 obtains numerical ID information by calculating the ratio relationship (frequency ratio) of the predetermined number of peak frequencies.
  • In step ST35, the ID decoding unit 133 sends the ID information to the marker information integration unit 135.
  • Then, in step ST36, the ID decoding unit 133 ends the series of processes.
  • the flowchart in FIG. 17 shows an example of the processing procedure of the marker position and orientation information acquisition unit 134.
  • the marker position and orientation information acquisition unit 134 starts processing in step ST41.
  • In step ST42, the marker position and orientation information acquisition unit 134 receives a 2D point cloud corresponding to the three-dimensional marker waveform from the point cloud extraction filter unit 131, and also receives a frequency spectrum corresponding to the three-dimensional marker waveform from the FFT unit 132.
  • In step ST43, the marker position and attitude information acquisition unit 134 extracts the DC component of the frequency spectrum and performs an inverse Fourier transform process on the DC component to obtain distance information from the aircraft (2D LiDAR) to the three-dimensional marker.
  • In step ST44, the marker position and attitude information acquisition unit 134 obtains position information of the three-dimensional marker in the aircraft coordinate system based on the distance information from the aircraft (2D LiDAR) to the three-dimensional marker and the direction information of the three-dimensional marker relative to the aircraft (2D LiDAR).
  • In step ST45, the marker position and attitude information acquisition unit 134 obtains rotation angle information of the three-dimensional marker, which indicates how far the front of the three-dimensional marker is rotated counterclockwise or clockwise relative to the direction of the aircraft (2D LiDAR), based on the magnification of the peak frequency that appears in the frequency spectrum relative to the peak frequency that appears in the frequency spectrum when the front of the three-dimensional marker is facing the direction of the aircraft.
  • In step ST46, the marker position and orientation information acquisition unit 134 obtains an approximate straight line from the 2D point cloud, and obtains rotation direction information of the three-dimensional marker, indicating whether the front of the three-dimensional marker is rotated counterclockwise or clockwise relative to the direction of the aircraft (2D LiDAR), based on the positive or negative slope of the approximate straight line.
  • In step ST47, the marker position and orientation information acquisition unit 134 obtains attitude (orientation) information of the three-dimensional marker in the aircraft coordinate system based on the rotation angle information and rotation direction information of the three-dimensional marker and the direction information of the three-dimensional marker relative to the aircraft.
  • In step ST48, the marker position and attitude information acquisition unit 134 sends the position and attitude information of the three-dimensional marker in the aircraft coordinate system to the marker information integration unit 135. Then, in step ST49, the marker position and attitude information acquisition unit 134 ends the series of processes.
  • the marker information in the aircraft coordinate system obtained by the recognition unit 113 is converted by the coordinate conversion unit 114 into marker information in the map coordinate system, and the landmark registration unit 116 writes this marker information in the map coordinate system into the information storage unit 115 as landmark information, making it possible to easily register markers placed in the environment as landmarks on the map.
  • a marker is displayed on the map based on the map coordinate system information of the marker obtained by the recognition unit 113, and when an instruction to register a specific marker displayed on the map is given from the application unit 121, the landmark registration unit 116 writes the information of the specific marker in the map coordinate system as landmark information to the information storage unit 115.
  • a specific marker displayed on the map can be registered as a landmark on the map only if the specific marker is located in an appropriate position, making it possible to selectively register landmarks with appropriate position information.
  • Example 2 is an example in which a moving body, in this case an autonomous traveling transport robot (hereinafter referred to as “robot” as appropriate), recognizes a marker (recognition target) placed in the environment, converts the position and attitude information of this marker from the aircraft coordinate system to a map coordinate system, and registers it as position and attitude information of a landmark.
  • FIG. 18 shows an example of the configuration of a system 10B for implementing the second embodiment.
  • This system 10B has a robot system 11B and a robot operation application 12B.
  • the robot system 11B has a sensor group 111, a self-position estimation unit 112, a recognition unit 113, a coordinate conversion unit 114, an information storage unit 115, and a landmark registration unit 116.
  • the information storage unit 115 stores information on the position and orientation of the machine (robot) in the map coordinate system obtained by the self-position estimation unit 112, and also stores information on the position and orientation of the marker in the map coordinate system obtained by the coordinate conversion unit 114.
  • Every time the self-position estimation unit 112 obtains information on the position and orientation of the machine in the map coordinate system, the information on the position and orientation of the machine in the map coordinate system stored in the information storage unit 115 is updated.
  • If the recognition unit 113 also obtains ID information (identifier information) of the marker, the information storage unit 115 stores the ID information as well as the position and orientation information in the map coordinate system as information on the marker. In the following explanation, the description of the ID information will be omitted as appropriate.
  • When the information storage unit 115 stores the information on the position and orientation of the marker in the map coordinate system obtained by the coordinate conversion unit 114, it sends this information on the position and orientation of the marker in the map coordinate system to the landmark registration unit 116.
  • When the landmark registration unit 116 receives information on the position and orientation of the marker in the map coordinate system from the information storage unit 115, it registers the landmark.
  • That is, the landmark registration unit 116 writes the information on the position and orientation of the marker in the map coordinate system sent from the information storage unit 115 into the information storage unit 115 as information on the position and orientation of the landmark, and causes the information storage unit 115 to store it.
  • information on the position and orientation in the map coordinate system of a marker recognized by the recognition unit 113 is stored in the information storage unit 115. However, if the marker is no longer recognized by the recognition unit 113 due to the movement of the machine (robot), the information on the position and orientation of the marker in the map coordinate system is deleted from the information storage unit 115.
  • the robot operation application 12B has an application unit 121 and a UI unit 122.
  • the UI unit 122 has a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 121 generates display information based on information stored in the information storage unit 115 of the robot system 11B and on user operation information, and sends it to the UI unit 122.
  • the application unit 121 also receives user operation information from the UI unit 122 and performs processing corresponding to it.
  • the automatic landmark registration operation as described above is started, for example, when the user designates landmark search mode and issues a movement instruction to the machine (robot) based on the UI display displayed on the display unit of the UI unit 122.
  • the machine (robot) moves along a preset route, and information on the position and orientation in the map coordinate system of the markers recognized sequentially during the movement is written to the information storage unit 115 as information on the position and orientation of the landmark.
  • the route along which the machine (robot) moves can be set by the user through operation from the operation unit, for example, through a touch panel operation, based on the map displayed on the display unit of the UI unit 122.
  • the information storage unit 115 also stores map information, and the map displayed on the display unit of the UI unit 122 is based on this map information.
  • This map information may be provided to the robot system 11B in advance from an external source, or may be created within the robot system 11B.
  • When the landmark registration operation is performed automatically as described above, the application section 121 of the robot operation application 12B periodically acquires position and orientation information of the machine (robot), markers, and landmarks in the map coordinate system from the information storage section 115 of the robot system 11B, and displays the machine (robot), markers, and registered landmarks as icons at the corresponding positions on the map displayed on the display section of the UI section 122.
  • the icon of the machine (robot) displayed on the map moves along the route as the machine (robot) moves. Also, in this case, an icon corresponding to the currently recognized marker is displayed on the map. Also, in this case, icons corresponding to all registered landmarks are displayed on the map.
  • When the user performs an operation to delete a landmark registered on the map, the application unit 121 of the robot operation application 12B sends a landmark deletion instruction to the landmark registration unit 116 of the robot system 11B. Based on this, the landmark registration unit 116 deletes the information on the position and orientation of the landmark in the map coordinate system from the information storage unit 115.
  • FIG. 19(a) shows an example of the display on the display unit of the UI unit 122 in a state where the user has set a movement route by operating the operation unit before the automatic landmark registration operation is performed.
  • an icon indicating the location of the machine (robot) is displayed, and the set movement route from this location as the start location to the destination location (end location) indicated by a "flag" is displayed.
  • Figure 19 (b) shows an example of the display on the display unit of the UI unit 122 when the machine (robot) has moved partway along its travel path.
  • icons indicating Landmark 1, Landmark 2, and Landmark 3, which were registered while the machine (robot) was moving, are displayed in the corresponding positions.
  • an icon indicating Marker 1, which the machine (robot) recognizes at its current position is displayed in the corresponding position. Note that this Marker 1 will later be registered as Landmark 4.
  • Figure 19 (c) shows an example of the display on the display unit of the UI unit 122 when the machine (robot) has moved to the destination position (end position) of the movement path.
  • icons indicating Landmark 1, Landmark 2, Landmark 3, and Landmark 4, which were registered during the movement of the machine (robot) are displayed in the corresponding positions.
  • Figure 19 (d) shows an example of the display on the display unit of the UI unit 122 when the user performs an operation to select Landmark 1 after the machine (robot) has moved to the destination position (end position) of the movement route and the automatic landmark registration operation has ended.
  • an operation screen for deleting Landmark 1 from the map is displayed as a pop-up screen.
  • “Type” indicates the type of Landmark 1, which in this case indicates that it is a two-dimensional barcode.
  • “Pose” indicates the position and orientation information of Landmark 1 in the map coordinate system, "(x, y)” indicates the position, and "yaw” indicates the orientation.
  • the "Delete Map” button is a button that allows the user to perform the map deletion operation. When the user operates the "Delete Map” button, operation information for deleting the map is sent from the UI unit 122 to the application unit 121.
  • the landmark map deletion function allows users to delete landmarks that have been inappropriately registered on the map, such as when two landmarks are registered close to each other even though only one marker has been placed, or when a landmark is registered in an unexpected location that does not correspond to the placement of the marker.
  • the user can select a specific landmark from among the registered landmarks and perform the map deletion operation after the machine (robot) has moved to the destination position (end position) of the travel route and the automatic landmark registration operation has been completed, but even before the machine (robot) has moved to the destination position (end position) of the travel route, it is possible for the user to select a specific landmark from among the registered landmarks and perform the map deletion operation in a similar manner.
  • the flowchart in FIG. 20 shows an example of the processing procedure of the coordinate conversion unit 114.
  • the coordinate conversion unit 114 starts processing in step ST51.
  • In step ST52, the coordinate conversion unit 114 acquires information on the position and attitude of the marker in the aircraft coordinate system from the recognition unit 113.
  • In step ST53, the coordinate conversion unit 114 acquires, from the self-position estimation unit 112, information on the position and attitude of the aircraft in the map coordinate system at the same time point as the marker information acquired in step ST52.
  • In step ST54, the coordinate conversion unit 114 converts the information on the position and attitude of the marker from the aircraft coordinate system to the map coordinate system based on the information on the position and attitude of the aircraft in the map coordinate system.
  • In step ST55, the coordinate conversion unit 114 writes the information on the position and attitude of the marker in the map coordinate system to the information storage unit 115. After the processing of step ST55, the coordinate conversion unit 114 returns to the processing of step ST52 and repeats the same processing as described above.
  • the information on the position and orientation of the marker in the map coordinate system written in the information storage unit 115 is automatically deleted after a certain period of time has passed. Therefore, for a specific marker placed in the environment, the information on the position and orientation of the specific marker in the map coordinate system will continue to be stored in the information storage unit 115 as long as the specific marker is recognized by the recognition unit 113, but once it is no longer recognized by the recognition unit 113, the information on the position and orientation of the specific marker in the map coordinate system will no longer be stored in the information storage unit 115.
  • the information on the position and orientation in the map coordinate system of the marker that is no longer recognized by the recognition unit 113 is deleted from the information storage unit 115, so that the markers displayed on the map in the UI unit 122 of the robot operation application 12B are only those that correspond to the markers currently recognized by the recognition unit 113.
  • the flowchart in FIG. 21 shows an example of the processing procedure of the application unit 121/UI unit 122.
  • the application unit 121/UI unit 122 starts processing in step ST61.
  • In step ST62, the application unit 121/UI unit 122 acquires information on the position and orientation of the aircraft, markers, and landmarks in the map coordinate system from the information storage unit 115.
  • In step ST63, the application unit 121/UI unit 122 displays the aircraft, markers, and landmarks on the map based on the information on the position and orientation of the aircraft, markers, and landmarks in the map coordinate system.
  • In step ST64, the application unit 121/UI unit 122 determines whether the user has performed a map deletion operation. If a map deletion operation has been performed, then in step ST65 the application unit 121/UI unit 122 sends a landmark deletion instruction to the landmark registration unit 116.
  • After the processing of step ST65, the application unit 121/UI unit 122 returns to the processing of step ST62 and repeats the same processing as described above. Note that if there is no map deletion operation in step ST64, the application unit 121/UI unit 122 immediately returns to the processing of step ST62.
  • the flowchart in FIG. 22 shows an example of the processing procedure of the landmark registration unit 116.
  • the landmark registration unit 116 starts processing in step ST71.
  • In step ST72, the landmark registration unit 116 acquires information on the position and orientation of the marker in the map coordinate system from the information storage unit 115.
  • In step ST73, the landmark registration unit 116 determines whether the information on the position and orientation of the marker in the map coordinate system acquired in step ST72 has already been registered as information on the position and orientation of a landmark. If it has not yet been registered, then in step ST74 the landmark registration unit 116 writes and registers the information on the position and orientation of the marker in the map coordinate system acquired in step ST72 in the information storage unit 115 as information on the position and orientation of the landmark.
  • After the processing of step ST74, the landmark registration unit 116 proceeds to the processing of step ST75. If the landmark has already been registered in step ST73, the landmark registration unit 116 immediately proceeds to the processing of step ST75. In step ST75, the landmark registration unit 116 determines whether or not a landmark deletion instruction has been received from the application unit 121.
  • If a landmark deletion instruction has been received, then in step ST76 the landmark registration unit 116 deletes the information on the position and orientation of the landmark in the map coordinate system related to the landmark deletion instruction from the information storage unit 115. After the processing of step ST76, the landmark registration unit 116 returns to the processing of step ST72 and repeats the same processing as described above. Note that if there is no landmark deletion instruction in step ST75, the landmark registration unit 116 immediately returns to the processing of step ST72.
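  • The loop of FIG. 22 can be pictured with the small sketch below; the in-memory storage class and its field names are hypothetical stand-ins for the information storage unit 115, not the patent's implementation.

```python
class InformationStorage:
    """Toy stand-in for the information storage unit 115; field names are hypothetical."""
    def __init__(self):
        self.marker_poses = {}     # marker ID -> (x, y, yaw) in the map coordinate system
        self.landmark_poses = {}   # landmark ID -> (x, y, yaw) in the map coordinate system

def registration_pass(storage, pending_delete_ids):
    """One pass of the automatic registration/deletion loop of FIG. 22 (illustrative)."""
    # ST72-ST74: register every recognised marker that is not yet stored as a landmark.
    for marker_id, pose in storage.marker_poses.items():
        if marker_id not in storage.landmark_poses:
            storage.landmark_poses[marker_id] = pose
    # ST75-ST76: delete landmarks for which a deletion instruction has been received.
    for landmark_id in pending_delete_ids:
        storage.landmark_poses.pop(landmark_id, None)

storage = InformationStorage()
storage.marker_poses["marker1"] = (3.0, 2.0, 0.0)
registration_pass(storage, pending_delete_ids=[])
print(storage.landmark_poses)  # {'marker1': (3.0, 2.0, 0.0)}
```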
  • markers placed in the environment can be automatically registered as landmarks on the map. Furthermore, in system 10B of Example 2, when a deletion instruction is given for a specific landmark displayed on the map, landmark registration unit 116 deletes the position and orientation information of the specific landmark from information storage unit 115, making it possible to easily delete landmarks that have been inappropriately registered on the map from the map, such as when two landmarks are registered close to each other even though only one marker has been placed, or when a landmark is registered in an unexpected position that does not correspond to the placement position of the marker. In this case, if ID information of the specific marker is also recognized by recognition unit 113 as described above, the information of the registered landmark includes ID information in addition to position and orientation information.
  • Example 3 is an example in which AR (Augmented Reality) markers are placed in the environment as landmarks, and the machine (robot) recognizes the landmarks to calculate its own position and attitude with high accuracy, and presents accompanying information associated with the landmarks to the user.
  • the machine recognizes landmarks (AR markers) placed in the environment, acquires the position and ID information (identifier information) of the landmark in the machine's coordinate system, and identifies the relevant landmark from among the multiple landmarks registered on the map, i.e., stored in the memory, from this ID information, and acquires information on the position of the recognized landmark in the map coordinate system.
  • the machine (robot) then calculates its own position in the map coordinate system with high accuracy from the information on the position of the recognized landmark in the map coordinate system and the information on the position of that landmark in the machine's coordinate system.
  • the accompanying information presented to the user here is information about areas suitable for calculating the user's own position information in the map coordinate system, and more specifically, information about areas where the recognition accuracy of the landmark (AR marker) position is above a certain level.
  • the accuracy of the calculated self-position depends on the accuracy with which the machine (robot) recognizes the position of the AR marker.
  • the distance at which recognition accuracy is good changes depending on the size of the AR marker. For example, if the AR marker is small, high recognition accuracy cannot be achieved unless the distance is close, and if the AR marker is large, high recognition accuracy can be achieved even if the distance is far.
  • recognition accuracy changes depending on the angle at which the AR marker is viewed. For example, recognition accuracy is high when viewed from the front, and low when viewed from an angle.
  • FIG. 23 shows an example of the configuration of a system 10C for implementing the third embodiment.
  • This system 10C has a robot system 11C and a robot operation application 12C.
  • the robot system 11C has an information storage unit 141.
  • This information storage unit 141 stores ID information and information on the position and orientation in the map coordinate system of each landmark (AR marker) registered on a map.
  • This information storage unit 141 also stores information on the position and orientation of the machine (robot) in the map coordinate system calculated by a self-position estimation unit (not shown). In this case, the information on the position and orientation of the machine in the map coordinate system stored in the information storage unit 141 is updated every time the self-position estimation unit obtains information on the position and orientation of the machine in the map coordinate system.
  • the robot operation application 12C has an application section 151, a UI section 152, and a prior knowledge storage section 153.
  • the prior knowledge storage unit 153 stores the relationship equation between distance and recognition accuracy for each size of the landmark (AR marker), the relationship equation between angle and recognition accuracy, and observable range parameters (minimum observable distance, maximum observable distance, minimum observable angle, maximum observable angle).
  • the UI unit 152 includes a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 151 generates display information based on information stored in the information storage unit 141 of the robot system 11C, user operation information, information stored in the prior knowledge storage unit 153, and the like, and sends the display information to the UI unit 152.
  • This display information also includes display information for areas where the recognition accuracy of the position and orientation of the landmark (AR marker) is above a certain level.
  • Figure 24(a) shows the correspondence between the distance d and the standard deviation σ in formula (1).
  • the minimum observable distance corresponds to the minimum distance at which the aircraft (robot) can recognize the landmark (AR marker).
  • the maximum observable distance corresponds to the maximum distance that satisfies the required recognition accuracy.
  • Figure 24(b) shows the correspondence between the angle θ and the standard deviation σ in formula (2).
  • the minimum observable angle corresponds to the angle in the negative direction that satisfies the required recognition accuracy.
  • the maximum observable angle corresponds to the maximum angle in the positive direction that satisfies the required recognition accuracy.
  • the sector-shaped area that satisfies the inequalities of minimum observable distance ≤ d ≤ maximum observable distance and minimum observable angle ≤ θ ≤ maximum observable angle can be obtained as the area where the recognition accuracy of the position and orientation of the landmark (AR marker) is above a certain level.
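  • A hedged sketch of how a point on the map could be tested against such a sector is shown below; sigma_of_distance and sigma_of_angle stand in for the relational equations (1) and (2) held in the prior knowledge storage unit 153, and the parameter names, angle convention, and example values are assumptions.

```python
import numpy as np

def accuracy_sector_test(point_xy, landmark_pose, params, sigma_of_distance, sigma_of_angle):
    """Test whether a map point lies inside the sector where the landmark
    (AR marker) can be recognised with the required accuracy, and return the
    predicted standard deviation there (e.g. for gradation display)."""
    lx, ly, lyaw = landmark_pose
    dx, dy = point_xy[0] - lx, point_xy[1] - ly
    d = float(np.hypot(dx, dy))
    # Viewing angle, measured from the marker's front (normal) direction;
    # treating yaw as that front direction is an assumption of this sketch.
    theta = (np.arctan2(dy, dx) - lyaw + np.pi) % (2 * np.pi) - np.pi
    inside = (params["min_distance"] <= d <= params["max_distance"]
              and params["min_angle"] <= theta <= params["max_angle"])
    sigma = max(sigma_of_distance(d), sigma_of_angle(theta)) if inside else None
    return inside, sigma

# Example with assumed stand-ins for the relational equations (1) and (2).
print(accuracy_sector_test(
    (1.0, 0.2), (0.0, 0.0, 0.0),
    {"min_distance": 0.3, "max_distance": 3.0,
     "min_angle": -np.pi / 3, "max_angle": np.pi / 3},
    lambda d: 0.01 * d * d,
    lambda t: 0.01 * (1.0 + abs(t))))  # (True, ~0.012)
```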
  • the application unit 151 acquires from the prior knowledge storage unit 153 the relational equation between distance and recognition accuracy corresponding to the size of the landmark (AR marker), the relational equation between angle and recognition accuracy, and information on observable range parameters, and determines an area (sector area) associated with the landmark in which the recognition accuracy of the position and orientation of the landmark (AR marker) is at or above a certain level, and sends display information for displaying that area to the UI unit 152 together with display information for displaying the landmark (AR marker).
  • the application unit 151 acquires the size information of the landmark (AR marker), for example, from the UI unit 152 as user operation information, or from the information storage unit 141 of the robot system 11C, or from other sources.
  • the UI unit 152 displays a landmark (AR marker) at a corresponding position on the map based on the display information supplied from the application unit 151, and also displays a sector-shaped area associated with the landmark where the recognition accuracy of the position and orientation of the landmark (AR marker) is equal to or higher than a certain level.
  • the information storage unit 141 also stores map information, and the map displayed on the display unit of the UI unit 152 is based on this map information.
  • This map information may be supplied to the robot system 11C in advance from an external source, or may be created within the robot system 11C.
  • Figure 25 (a) shows an example of landmarks (AR marks) displayed on a map, and sector-shaped areas where the recognition accuracy of the landmark (AR marker) position is above a certain level.
  • three registered landmarks (AR marks), Landmark 1, Landmark 2, and Landmark 3 are displayed as icons, and for each, a sector-shaped area where the recognition accuracy of the landmark (AR marker) position is above a certain level is displayed.
  • the sectorial area is divided into three stages, with the innermost area representing the area with higher recognition accuracy.
  • the sectorial area may not be divided into multiple stages in this way, and recognition accuracy may be represented by a gradation.
  • FIG. 25(b) shows an example of the display on the display unit of the UI unit 152 when the user has set a movement route by operating the operation unit.
  • the set movement route is displayed from the position where the machine (robot) is located as the start position to the destination position (end position) indicated by a "flag.”
  • the application unit 151 may display a warning, for example as shown in the figure, based on the positional relationship between the set travel route and the sector-shaped area of each landmark.
  • This warning display can prompt the user to change the travel route or the installation position of the landmark (AR mark).
  • In the above description, the application unit 151 acquires from the prior knowledge storage unit 153 the relational equation between distance and recognition accuracy corresponding to the size of the landmark (AR marker), the relational equation between angle and recognition accuracy, and information on the observable range parameters, and obtains an area (sector area) associated with the landmark in which the recognition accuracy of the landmark (AR marker) position is equal to or higher than a certain level; no particular mention has been made of the type, performance, etc. of the sensor mounted on the robot system 11C for recognizing the landmark (AR marker).
  • the prior knowledge holding unit 153 may be configured to hold the relational equation between distance and recognition accuracy, the relational equation between angle and recognition accuracy, and the observable range parameters for each size of the landmark (AR marker) and for each sensor of different type or performance
  • the application unit 151 may acquire information on the relational equation between distance and recognition accuracy, the relational equation between angle and recognition accuracy, and the observable range parameters that correspond to the size of the landmark (AR marker) and the sensor mounted on the robot system 11C from the prior knowledge holding unit 153, acquire an area (sector area) associated with the landmark where the recognition accuracy of the position of the landmark (AR marker) is equal to or higher than a certain level, and send display information for displaying the area to the UI unit 152 for display.
  • the application unit 151 obtains information about the sensors installed in the robot system 11C, for example, from the UI unit 152 as user operation information, or from the information storage unit 141 of the robot system 11C, or from other sources.
  • the display unit of the UI unit 152 of the robot operation app 12C displays on a map information on areas suitable for calculating the position information of the self, i.e., the machine (robot), in the map coordinate system using landmarks (AR markers), specifically, information on areas where the recognition accuracy of the landmark (AR marker) position is at least a certain level.
  • Example 4 is an example in which retroreflective markers are placed in the environment as landmarks, and a machine (robot) recognizes the landmarks to calculate its own position with high accuracy, and presents accompanying information to a user in association with the landmarks.
  • the retroreflective markers are markers made of retroreflective material.
  • the machine recognizes landmarks (retroreflective markers) placed in the environment and determines the triangle specified by the three landmarks from the distances and angles to the three landmarks.
  • the machine identifies three landmarks that are located in a position that forms a triangle that is congruent with the identified triangle from among the multiple landmarks registered on the map, i.e., stored in the memory unit.
  • the machine (robot) then calculates its own position in the map coordinate system with high accuracy using the well-known principles of triangulation from the distances and angles to the three identified landmarks and information on the positions of the three identified landmarks in the map coordinate system.
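  • Once the three landmarks have been identified, one standard way to realise this self-position calculation is a rigid 2D alignment of the observed landmark positions to their registered map positions; the sketch below shows that idea with illustrative names and is not necessarily the exact computation intended here.

```python
import numpy as np

def robot_pose_from_landmarks(ranges, bearings, landmark_xy_map):
    """Estimate the machine (robot) pose (x, y, yaw) in the map coordinate system
    from the measured distances/angles to three identified landmarks and their
    registered map positions, via a rigid 2D alignment of the two point sets."""
    ranges, bearings = np.asarray(ranges, dtype=float), np.asarray(bearings, dtype=float)
    # Landmark positions as observed in the robot coordinate system.
    pts_robot = np.column_stack((ranges * np.cos(bearings), ranges * np.sin(bearings)))
    pts_map = np.asarray(landmark_xy_map, dtype=float)
    # Rotation + translation that best maps the observed points onto the map points.
    pr, pm = pts_robot - pts_robot.mean(0), pts_map - pts_map.mean(0)
    u, _, vt = np.linalg.svd(pr.T @ pm)
    rot = (u @ vt).T
    if np.linalg.det(rot) < 0:          # guard against a reflection solution
        vt[-1] *= -1
        rot = (u @ vt).T
    yaw = float(np.arctan2(rot[1, 0], rot[0, 0]))
    tx, ty = pts_map.mean(0) - rot @ pts_robot.mean(0)
    return float(tx), float(ty), yaw

# Example: robot actually at (1, 2) with yaw +90 deg.
print(robot_pose_from_landmarks([2.0, 2.0, 2.0], [0.0, -np.pi / 2, np.pi],
                                [(1, 4), (3, 2), (1, 0)]))  # ~(1.0, 2.0, 1.571)
```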
  • FIG. 26 shows a configuration example of a system 10D for implementing the fourth embodiment.
  • This system 10D has a robot system 11D and a robot operation application 12D.
  • the robot system 11D has an information storage unit 161.
  • This information storage unit 161 stores information on the position and orientation in the map coordinate system of each landmark (retroreflective marker) registered on a map.
  • This information storage unit 161 also stores information on the position and orientation in the map coordinate system of the machine (robot) calculated by a self-position estimation unit (not shown). In this case, every time the self-position estimation unit obtains information on the position and orientation of the machine in the map coordinate system, the information on the position and orientation of the machine in the map coordinate system stored in the information storage unit 161 is updated.
  • the robot operation application 12D has an application section 171, a UI section 172, and a prior knowledge holding section 173.
  • the prior knowledge holding section 173 holds information such as the recognizable angle information of landmarks (retroreflective markers).
  • the UI unit 172 includes a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 171 generates display information based on information stored in the information storage unit 161 of the robot system 11D, user operation information, information stored in the prior knowledge storage unit 173, and the like, and sends the display information to the UI unit 172.
  • This display information also includes information for displaying an area in which three or more landmarks (retroreflective markers) can be recognized.
  • the application unit 171 obtains information on the recognizable angle of the landmark (retroreflective marker) from the prior knowledge storage unit 173, determines an area in which three or more landmarks (retroreflective markers) can be recognized, and sends display information for displaying that area to the UI unit 172 together with display information for displaying the landmark (AR marker).
  • the UI unit 172 displays landmarks (retroreflective markers) at corresponding positions on the map based on the display information supplied from the application unit 171, and also displays areas in which three or more landmarks can be recognized in association with the landmarks, that is, areas in which the vehicle's position and orientation can be calculated with high accuracy by recognizing the landmarks (retroreflective markers).
  • the information storage unit 161 also stores map information, and the map displayed on the display unit of the UI unit 172 is based on this map information.
  • This map information may be provided to the robot system 11D in advance from an external source, or may be created within the robot system 11D.
  • Figure 27 shows an example of landmarks (retroreflective markers) displayed on a map, and areas in which three or more landmarks can be recognized.
  • Landmark 1, Landmark 2, and Landmark 3 are displayed as icons, and the areas in which each landmark can be recognized are displayed.
  • the overlapping area of the areas in which each landmark can be recognized is shown as the area in which three or more landmarks can be recognized.
  • the areas in which the three landmarks (retroreflective markers), Landmark 1, Landmark 2, and Landmark 3, can each be recognized are displayed, and the overlapping area of these areas is shown to the user as an area in which three or more landmarks can be recognized; however, a configuration in which only the overlapping area (the area in which three or more landmarks can be recognized) is displayed is also conceivable.
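  • As a rough sketch of how such an overlap area could be computed (the recognizable range, viewing half-angle, grid step, and function names below are assumptions; the text only states that the recognizable-angle information is obtained from the prior knowledge holding unit 173), one could sample the map on a grid and count the landmarks recognizable from each cell:

```python
import math
import numpy as np

def recognizable(robot_xy, lm_xy, lm_yaw, max_range=10.0, half_angle=math.radians(60)):
    """Assume a retroreflective marker is recognizable within a range and viewing angle."""
    dx, dy = robot_xy[0] - lm_xy[0], robot_xy[1] - lm_xy[1]
    dist = math.hypot(dx, dy)
    viewing = abs((math.atan2(dy, dx) - lm_yaw + math.pi) % (2 * math.pi) - math.pi)
    return dist <= max_range and viewing <= half_angle

def three_landmark_area(landmarks, x_range, y_range, step=0.25):
    """Grid cells of the map from which at least three landmarks are recognizable."""
    cells = []
    for x in np.arange(*x_range, step):
        for y in np.arange(*y_range, step):
            count = sum(recognizable((x, y), (lx, ly), yaw) for lx, ly, yaw in landmarks)
            if count >= 3:
                cells.append((x, y))
    return cells

# usage: landmarks given as (x, y, facing_yaw) in map coordinates
area = three_landmark_area([(0, 0, 0.0), (8, 0, math.pi), (4, 6, -math.pi / 2)],
                           x_range=(-2, 10), y_range=(-2, 8))
```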
  • the display unit of the UI unit 172 of the robot operation app 12D displays, on a map, areas where three or more landmarks can be recognized, that is, areas where the machine's own position and orientation can be calculated with high accuracy by recognizing the landmarks (retroreflective markers).
  • the user can thus easily grasp the areas where the self-position can be calculated with high accuracy, and it becomes easy, for example, to set the movement path of the machine (robot) so that the self-position can be calculated well, or to change the placement of landmarks to an appropriate position on the movement path of the machine (robot).
  • Example 5 is an example in which retroreflective markers are placed in the environment as landmarks, and the machine (robot) recognizes the landmarks to calculate its own position with high accuracy, and in which accompanying information associated with the landmarks is presented to the user.
  • the additional information presented to the user is information on the accuracy of the self-location calculated at the position specified by the user, or information on the position where additional landmarks (retroreflective markers) should be placed to improve the accuracy of the self-location calculated at the position specified by the user.
  • FIG. 28 shows an example of the configuration of a system 10E for implementing Example 5.
  • parts corresponding to those in FIG. 26 are given the same reference numerals, and detailed descriptions thereof will be omitted as appropriate.
  • This system 10E has a robot system 11E and a robot operation application 12E.
  • the robot system 11E has an information holding unit 161.
  • This information holding unit 161 holds information on the position and orientation in the map coordinate system of each landmark (retroreflective marker) registered on a map.
  • This information holding unit 161 also holds information on the position and orientation in the map coordinate system of the machine (robot) calculated by a self-position estimation unit (not shown). In this case, every time information on the position and orientation of the machine in the map coordinate system is obtained by the self-position estimation unit, the information on the position and orientation of the machine in the map coordinate system held in the information holding unit 161 is updated.
  • the robot operation application 12E has an application section 171, a UI section 172, and a prior knowledge holding section 173.
  • the prior knowledge holding section 173 holds a distance measurement error model for landmarks (retroreflective markers), i.e., the relational equation between distance and distance measurement error.
  • the UI unit 172 includes a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 171 generates display information based on information stored in the information storage unit 161 of the robot system 11E, user operation information, information stored in the prior knowledge storage unit 173, and the like, and sends the display information to the UI unit 172.
  • This display information also includes information that displays the accuracy of the self-position calculated at a position specified by the user, and information that displays the position where an additional landmark (retroreflective marker) should be placed to improve the accuracy of the self-position calculated at the position specified by the user.
  • the application unit 171 obtains a landmark (retroreflective marker) ranging error model, i.e., the relational equation between distance and ranging error, from the prior knowledge storage unit 173, determines the accuracy (ranging error) of the self-location calculated at the position specified by the user and the positions where additional landmarks (retroreflective markers) should be placed to improve that accuracy, and sends display information for displaying these to the UI unit 172, together with display information for displaying the landmarks.
  • the UI unit 172 displays landmarks (retroreflective markers) at corresponding positions on the map based on the display information supplied from the application unit 171, and, in association with the landmarks, displays the accuracy (ranging error) of the self-position calculated at the position specified by the user, as well as positions where additional landmarks (retroreflective markers) should be placed to improve that accuracy.
  • Figure 29 shows an example of landmarks (retroreflective markers) displayed on a map, and the accuracy of self-location (distance measurement error) calculated at a position specified by the user.
  • three registered landmarks, Landmark 1, Landmark 2, and Landmark 3 are displayed as icons, and the distance measurement error of each landmark at the position specified by the user (see arrow cursor) is displayed as a donut circle.
  • the larger the distance measurement error, the thicker the donut circle.
  • the accuracy (ranging error) of the self-position calculated at the position specified by the user, derived from the overlapping shape of the donut circles for each landmark, is further displayed as a circle or ellipse.
  • the larger the circle or ellipse, the worse the accuracy of the self-position, i.e., the larger the ranging error.
  • when the accuracy (ranging error) of the self-position is shown as an ellipse, the major axis indicates the direction of greater uncertainty, i.e., of larger ranging error.
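  • One standard way to turn a distance-vs-ranging-error model into such an uncertainty circle or ellipse (sketched below in Python; the linear error model, its coefficients, and the function names are assumptions, since the actual relational equation held by the prior knowledge holding unit 173 is not specified) is to propagate the per-landmark ranging error through the range Jacobian and take the eigen-decomposition of the resulting covariance:

```python
import numpy as np

def range_error_sigma(d, a=0.02, b=0.01):
    """Hypothetical ranging-error model: error grows linearly with distance."""
    return a + b * d

def position_covariance(p, landmarks):
    """Covariance of a range-only position fix at point p from the given landmarks."""
    p = np.asarray(p, dtype=float)
    H, W = [], []
    for lm in landmarks:
        diff = p - np.asarray(lm, dtype=float)
        d = np.linalg.norm(diff)
        H.append(diff / d)                       # unit direction = range Jacobian row
        W.append(1.0 / range_error_sigma(d) ** 2)
    H, W = np.array(H), np.diag(W)
    return np.linalg.inv(H.T @ W @ H)            # 2x2 covariance of the position estimate

def error_ellipse(cov):
    """Semi-axes and orientation of the 1-sigma uncertainty ellipse."""
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]
    major, minor = np.sqrt(vals[order])
    angle = np.arctan2(vecs[1, order[0]], vecs[0, order[0]])
    return major, minor, angle

# usage: a point facing three landmarks that are lined up along the x axis
cov = position_covariance((4.0, 12.0), [(0, 0), (4, 0), (8, 0)])
print(error_ellipse(cov))
```

  • For a query point facing three landmarks lined up along one axis, the ellipse computed this way tends to stretch along that axis, consistent with the large horizontal uncertainty described for FIG. 30.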
  • Figure 30 shows another example of landmarks (retroreflective markers) displayed on a map and the accuracy of self-location (distance measurement error) calculated at a position specified by the user.
  • the accuracy of self-location (distance measurement error) calculated at the position specified by the user is displayed as an ellipse.
  • three landmarks are lined up horizontally (left and right in the figure), which results in a large uncertainty in the horizontal direction, i.e., a large distance measurement error.
  • in such a case, a warning message may be displayed to the user (not shown).
  • FIG. 31 shows yet another example of landmarks (retroreflective markers) displayed on a map and the accuracy (ranging error) of the self-location calculated at a position specified by the user.
  • positions where additional landmarks (retroreflective markers) should be placed to improve the accuracy of the self-location calculated at the position specified by the user are displayed as "Suggested". In this case, the positions where the additional landmarks (retroreflective markers) should be placed are determined so as to cancel out the large lateral uncertainty in the accuracy (ranging error) of the self-location calculated at the position specified by the user.
  • the display unit of the UI unit 172 of the robot operation app 12E displays information on the accuracy of the self-location calculated at a position specified by the user on a map, allowing the user to easily grasp the accuracy of the self-location calculated at the specified position and easily set a travel route so that the self-location is calculated with high accuracy.
  • the display unit of the UI unit 172 of the robot operation application 12E displays information on the positions where additional landmarks (retroreflective markers) should be placed on the map to improve the accuracy of the self-position calculated at the position specified by the user, so that the user can easily understand the positions where additional landmarks should be placed to improve the accuracy of the self-position calculated at the specified position, and can easily place additional landmarks so that the self-position is calculated with high accuracy at the specified position.
  • Example 6, like Example 4, is an example in which retroreflective markers are placed in the environment as landmarks, the machine (robot) recognizes the landmarks to calculate its own position with high accuracy, and accompanying information associated with the landmarks is presented to the user.
  • the accompanying information presented to the user here may be warning information in cases where the placement of landmarks may cause malfunction, information on recommended landmark placement in such cases, and the like.
  • the machine identifies three landmarks located in positions that form a triangle that is congruent with the triangle identified by the three recognized landmarks from among multiple landmarks registered on the map, and calculates its own position in the map coordinate system with high accuracy using the principles of triangulation from the distances and angles to the three identified landmarks and the information on the positions of the three identified landmarks in the map coordinate system.
  • however, depending on the landmark arrangement, for example when landmarks are placed too close together or when three landmarks form a nearly equilateral triangle, the correspondence between recognized and registered landmarks can become ambiguous, and the recognized landmarks may not be correctly identified from among the multiple landmarks registered on the map.
  • FIG. 32 shows an example of the configuration of a system 10F for implementing Example 6.
  • parts corresponding to those in FIG. 26 are given the same reference numerals, and detailed descriptions thereof will be omitted as appropriate.
  • This system 10F has a robot system 11F and a robot operation application 12F.
  • the robot system 11F has an information holding unit 161.
  • This information holding unit 161 holds information on the position and orientation in the map coordinate system of each landmark (retroreflective marker) registered on a map.
  • This information holding unit 161 also holds information on the position and orientation in the map coordinate system of the machine (robot) calculated by a self-position estimation unit (not shown). In this case, every time the self-position estimation unit obtains information on the position and orientation of the machine in the map coordinate system, the information on the position and orientation of the machine in the map coordinate system held in the information holding unit 161 is updated.
  • the robot operation application 12F has an application section 171, a UI section 172, and a prior knowledge holding section 173.
  • the prior knowledge holding section 173 holds template information indicating landmark arrangements that may cause malfunctions, that is, information such as distance thresholds and equilateral triangles.
  • the UI unit 172 includes a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 171 generates display information based on information stored in the information storage unit 161 of the robot system 11F, user operation information, information stored in the prior knowledge storage unit 173, and the like, and sends the display information to the UI unit 172.
  • This display information also includes information that displays a warning when there is a possibility of malfunction due to the landmark placement, and information that displays recommended landmark placement when there is a possibility of malfunction due to the landmark placement.
  • the application unit 171 acquires template information (information such as distance thresholds, equilateral triangles, etc.) indicating landmark placements that may cause malfunctions from the prior knowledge storage unit 173, and determines, based on the template information, whether or not there are any landmark placements that may cause malfunctions among the multiple landmarks registered on the map, and if so, sends display information for displaying a warning or recommended positions together with display information for displaying the landmarks.
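  • A simple way to realize such a template check (sketched below; the threshold values, tolerance, and function names are assumptions, since the concrete template information is only described as distance thresholds and equilateral triangles) is to scan all landmark pairs and triples registered on the map:

```python
import itertools
import math

def too_close(landmarks, min_dist=1.0):
    """Pairs of landmarks closer than the distance threshold (risk of misidentification)."""
    bad = []
    for (i, a), (j, b) in itertools.combinations(enumerate(landmarks), 2):
        if math.dist(a, b) < min_dist:
            bad.append((i, j))
    return bad

def near_equilateral(landmarks, tol=0.2):
    """Triples whose triangle is nearly equilateral (congruence matching becomes ambiguous)."""
    bad = []
    for (i, a), (j, b), (k, c) in itertools.combinations(enumerate(landmarks), 3):
        sides = sorted([math.dist(a, b), math.dist(b, c), math.dist(a, c)])
        if sides[0] > 0 and (sides[2] - sides[0]) / sides[2] < tol:
            bad.append((i, j, k))
    return bad

# usage: any hit would trigger a warning and a suggested replacement position
landmarks = [(0.0, 0.0), (0.5, 0.2), (4.0, 3.0), (8.0, 0.0)]
print(too_close(landmarks))        # e.g. [(0, 1)]
print(near_equilateral(landmarks))
```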
  • the UI unit 172 displays landmarks (retroreflective markers) at corresponding positions on the map based on the display information supplied by the application unit 171, and when there is a landmark placement that may cause a malfunction, it displays a warning or a recommended position in association with the landmark.
  • Figure 33 shows an example of landmarks (retroreflective markers) displayed on a map, along with warnings and recommended locations.
  • four registered landmarks, Landmark 1, Landmark 2, Landmark 3, and Landmark 4 are displayed as icons.
  • because Landmark 2 and Landmark 3 are located close to each other, a warning is displayed, and a recommended location for Landmark 2 is displayed as "Suggested".
  • Figure 34 shows another example of landmarks (retroreflective markers) and warnings and recommended locations displayed on a map.
  • three registered landmarks, Landmark 1, Landmark 2, and Landmark 3 are displayed as icons.
  • because Landmarks 1 to 3 are positioned in an equilateral triangle, a warning is displayed, and a recommended location for Landmark 3 is displayed as "Suggested".
  • Example 7, like Example 4, is an example in which retroreflective markers are placed in the environment as landmarks, the machine (robot) recognizes the landmarks to calculate its own position with high accuracy, and accompanying information associated with the landmarks is presented to the user.
  • the additional information presented to the user here is information about areas on the map where there is a possibility of erroneous distance measurement to landmarks along the movement path of the machine (robot).
  • the range of erroneous distance measurement in this case is the range of a circle with a radius r (e.g., 1.5 m) centered on the machine (robot), as shown in Figure 35 (a).
  • a gimbal error may also occur when the landmark falls within a certain angle of view and distance-measurement range, resulting in an erroneous distance measurement.
  • the range of erroneous distance measurement in this case is a sector-shaped range defined by the angle of view (deg) and the distance-measurement range (min, max), as shown in Figure 35 (b).
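  • The two mismeasurement ranges of FIG. 35 can be expressed as simple geometric tests, for example as in the sketch below (the radius, field of view, distance limits, and function names are assumptions; the actual set information would come from the prior knowledge holding unit 173), and applied to each pose along the planned route:

```python
import math

def in_circle_range(robot_pose, landmark_xy, r=1.5):
    """Case (a): erroneous measurement possible within radius r of the machine."""
    return math.dist(robot_pose[:2], landmark_xy) <= r

def in_sector_range(robot_pose, landmark_xy, fov=math.radians(60), d_min=0.3, d_max=2.0):
    """Case (b): erroneous measurement possible within a sector of the sensor's view."""
    x, y, yaw = robot_pose
    dx, dy = landmark_xy[0] - x, landmark_xy[1] - y
    d = math.hypot(dx, dy)
    bearing = (math.atan2(dy, dx) - yaw + math.pi) % (2 * math.pi) - math.pi
    return d_min <= d <= d_max and abs(bearing) <= fov / 2

def risky_poses(path_poses, landmarks, check=in_circle_range):
    """Poses along the planned route at which some landmark falls in the mismeasurement range."""
    return [pose for pose in path_poses
            if any(check(pose, lm) for lm in landmarks)]
```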
  • FIG. 36 shows an example of the configuration of a system 10G for implementing Example 7.
  • parts corresponding to those in FIG. 26 are given the same reference numerals, and detailed descriptions thereof will be omitted as appropriate.
  • This system 10G has a robot system 11G and a robot operation application 12G.
  • the robot system 11G has an information holding unit 161.
  • This information holding unit 161 holds information on the position and orientation in the map coordinate system of each landmark (retroreflective marker) registered on a map.
  • This information holding unit 161 also holds information on the position and orientation in the map coordinate system of the machine (robot) calculated by a self-position estimation unit (not shown). In this case, every time the self-position estimation unit obtains information on the position and orientation of the machine in the map coordinate system, the information on the position and orientation of the machine in the map coordinate system held in the information holding unit 161 is updated.
  • the robot operation application 12G has an application section 171, a UI section 172, and a prior knowledge holding section 173.
  • the prior knowledge holding section 173 holds set information such as the type of sensor (the sensor used to recognize landmarks) and the range of erroneous distance measurement.
  • the UI unit 172 includes a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 171 generates display information based on information stored in the information storage unit 161 of the robot system 11G, user operation information, information stored in the prior knowledge storage unit 173, and the like, and sends the display information to the UI unit 172.
  • This display information includes information that displays the movement path of the machine (robot) set by the user, and information that displays areas in which there is a possibility of erroneous distance measurement of landmarks on the movement path of the machine (robot) set on a map.
  • the application unit 171 sends display information for displaying areas on the movement path of the machine (robot) set on a map where there is a possibility of erroneous distance measurement of landmarks, based on the set information, together with display information for displaying the landmarks and the movement path.
  • the application unit 171 obtains information on the sensor for recognizing landmarks mounted on the machine (robot), for example, as user operation information from the UI unit 172, or from the information holding unit 161 of the robot system 11G, or from other sources.
  • the UI unit 172 displays landmarks (retroreflective markers) at corresponding positions on the map and the travel route set by user operation based on the display information supplied from the application unit 171, and if the sensor for recognizing the landmark mounted on the machine (robot) has a range of mismeasurement, it further displays areas on the travel route where there is a possibility of mismeasurement of the landmark.
  • landmarks retroreflective markers
  • Figure 37 shows an example of landmarks (retroreflective markers) displayed on a map, a movement route set by user operation, and an area where distance to the landmarks may be erroneously measured.
  • landmarks proliferative markers
  • three registered landmarks, Landmark 1, Landmark 2, and Landmark 3 are displayed as icons.
  • the set movement route from the location where the machine (robot) is located as the start position to the destination position (end position) indicated by a "flag" is displayed.
  • an area RE where there is a possibility of erroneous distance measurement is indicated by an oval at a position close to Landmark 2 on this movement path. This means that if the machine (robot) moves along its movement path and enters area RE, there is a possibility of erroneous distance measurement with respect to Landmark 2.
  • the warning message "Warning!" associated with the display of area RE.
  • the display unit of the UI unit 172 of the robot operation app 12G displays areas where there is a possibility of erroneous distance measurement of landmarks along the movement route set by user operation, allowing the user to easily grasp areas where there is a possibility of erroneous distance measurement of landmarks along the movement route, and easily enabling measures such as changing the route or changing the placement position of the landmark to avoid erroneous distance measurement of the landmark.
  • Example 8, like Example 4, is an example in which retroreflective markers are placed in the environment as landmarks, the machine (robot) recognizes the landmarks to calculate its own position with high accuracy, and accompanying information associated with the landmarks is presented to the user.
  • the additional information presented to the user here is a warning when the path of the machine (robot) set on the map contains a location with a curvature greater than or equal to a set value within an area where three or more landmarks (retroreflective markers) can be recognized.
  • if self-position correction is performed at a location with a large curvature, an error is likely to remain in the attitude, and a slight deviation in attitude can appear as a position deviation of several tens of centimeters at a point several tens of meters ahead.
  • FIG. 38 shows an example of the configuration of a system 10H for implementing Example 8.
  • parts corresponding to those in FIG. 26 are given the same reference numerals, and detailed descriptions thereof will be omitted as appropriate.
  • This system 10H has a robot system 11H and a robot operation application 12H.
  • the robot system 11H has an information holding unit 161.
  • This information holding unit 161 holds information on the position and orientation in the map coordinate system of each landmark (retroreflective marker) registered on a map.
  • This information holding unit 161 also holds information on the position and orientation in the map coordinate system of the machine (robot) calculated by a self-position estimation unit (not shown). In this case, every time the self-position estimation unit obtains information on the position and orientation of the machine in the map coordinate system, the information on the position and orientation of the machine in the map coordinate system held in the information holding unit 161 is updated.
  • the robot operation application 12H has an application section 171, a UI section 172, and a prior knowledge holding section 173.
  • the prior knowledge holding section 173 holds the set value of the curvature that issues a warning, information on the recognizable angle of a landmark, etc.
  • the UI unit 172 includes a display unit that displays and presents information to the user, and an operation unit that accepts user operations.
  • the application unit 171 generates display information based on information stored in the information storage unit 161 of the robot system 11H, user operation information, information stored in the prior knowledge storage unit 173, and the like, and sends the display information to the UI unit 172.
  • This display information also includes information that displays a warning when a location with a curvature equal to or greater than a set value is included in the movement path of the machine (robot) set on a map within an area where three or more landmarks can be recognized (area where self-position correction can be performed).
  • the application unit 171 calculates an area in which three or more landmarks can be recognized based on the landmark recognizable angle information stored in the prior knowledge storage unit 173, and determines whether that area includes a location on the movement path of the machine (robot) set on the map that has a curvature equal to or greater than a set value. If so, the application unit 171 sends display information for displaying a warning along with the landmarks, the movement path, and further display information for displaying the area in which three or more landmarks can be recognized.
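  • As an illustrative sketch of this curvature check (the threshold value and function names are assumptions; the area test `in_correction_area` could come from an overlap computation like the one sketched for Example 4), the curvature of the set route can be estimated discretely from consecutive route points:

```python
import math

def discrete_curvature(p_prev, p, p_next):
    """Curvature of a polyline at p, from the circumscribed circle of three consecutive points."""
    a = math.dist(p, p_prev)
    b = math.dist(p, p_next)
    c = math.dist(p_prev, p_next)
    area2 = abs((p[0] - p_prev[0]) * (p_next[1] - p_prev[1])
                - (p[1] - p_prev[1]) * (p_next[0] - p_prev[0]))   # twice the triangle area
    return 0.0 if a * b * c == 0 else 2.0 * area2 / (a * b * c)   # kappa = 1 / circumradius

def curvature_warnings(path, in_correction_area, kappa_max=0.5):
    """Indices of route points with curvature >= threshold inside the self-position-correction area."""
    hits = []
    for i in range(1, len(path) - 1):
        if in_correction_area(path[i]) and \
           discrete_curvature(path[i - 1], path[i], path[i + 1]) >= kappa_max:
            hits.append(i)
    return hits
```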
  • the UI unit 172 displays landmarks and the travel route set by user operation at corresponding positions on the map based on the display information supplied from the application unit 171, as well as areas in which three or more landmarks can be recognized, and displays a warning if the area in which three or more landmarks can be recognized contains a location on the travel route with a curvature equal to or greater than a set value.
  • Figure 39 shows an example of the display of landmarks (retroreflective markers) on a map, a movement route set by user operation, and an area where three or more landmarks can be recognized.
  • three registered landmarks, Landmark 1, Landmark 2, and Landmark 3 are displayed as icons.
  • the area RE where landmarks 1 to 3 can be recognized is displayed as a triangle.
  • a warning "Warning!" is also displayed.
  • the display unit of the UI unit 172 of the robot operation application 12H displays a warning if a location on the set movement path of the machine (robot) has a curvature greater than or equal to a set value within an area where three or more landmarks can be recognized.
  • This allows the user to easily grasp whether a location on the set movement path with a curvature greater than or equal to the set value lies within an area where three or more landmarks can be recognized (an area where self-position correction can be performed), and to easily take action such as changing the movement path or changing the placement position of the landmarks.
  • the series of processes in the robot system and the robot operation application in each of the above-mentioned embodiments can be executed by hardware, but can also be executed by software.
  • the programs constituting the software are installed from a recording medium into a computer incorporated in dedicated hardware, or into, for example, a general-purpose computer capable of executing various functions by installing various programs.
  • FIG. 40 is a block diagram showing an example of the hardware configuration of a computer 400.
  • the computer 400 has a CPU 401, a ROM 402, a RAM 403, a bus 404, an input/output interface 405, an input unit 406, an output unit 407, a storage unit 408, a drive 409, a connection port 410, and a communication unit 411.
  • the hardware configuration shown here is an example, and some of the components may be omitted.
  • the computer 400 may include further components other than those shown here.
  • the CPU 401 functions, for example, as an arithmetic processing device or control device, and controls the overall operation or part of the operation of each component based on various programs recorded in the ROM 402, the RAM 403, the storage unit 408, or the removable recording medium 501.
  • ROM 402 is a means for storing programs loaded into CPU 401 and data used in calculations.
  • RAM 403 temporarily or permanently stores, for example, programs loaded into CPU 401 and various parameters that change as appropriate when the programs are executed.
  • the CPU 401, ROM 402, and RAM 403 are connected to each other via a bus 404. Meanwhile, various components are connected to the bus 404 via the input/output interface 405.
  • the input unit 406 may be, for example, a mouse, a keyboard, a touch panel, a button, a switch, a lever, or the like. Furthermore, the input unit 406 may also be a remote controller (hereinafter, a remote control) capable of transmitting control signals using infrared rays or other radio waves.
  • the output unit 407 is a device capable of visually or audibly notifying the user of acquired information, such as a display device such as a CRT (Cathode Ray Tube), LCD, or organic EL, an audio output device such as a speaker or headphones, a printer, a mobile phone, or a facsimile.
  • the storage unit 408 is a device for storing various types of data.
  • a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device may be used as the storage unit 408.
  • the drive 409 is a device that reads information recorded on a removable recording medium 501, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, or writes information to the removable recording medium 501.
  • the removable recording medium 501 is, for example, a DVD medium, a Blu-ray (registered trademark) medium, an HD DVD medium, or various semiconductor storage media.
  • the removable recording medium 501 may also be, for example, an IC card equipped with a non-contact IC chip, or an electronic device, etc.
  • the connection port 410 is a port for connecting an external device 502, such as a Universal Serial Bus (USB) port, an IEEE 1394 port, a Small Computer System Interface (SCSI), an RS-232C port, or an optical audio terminal.
  • the external device 502 is, for example, a printer, a portable music player, a digital camera, a digital video camera, or an IC recorder.
  • the communication unit 411 is a communication device for connecting to the network 503, and may be, for example, a communication card for wired or wireless LAN, Bluetooth (registered trademark), or WUSB (Wireless USB), a router for optical communications, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various types of communications.
  • the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
  • the present technology can also be configured as follows.
  • (1) An information processing device comprising: an information storage unit that stores information on landmarks that are used as indicators for calculating a self-position; an information generating unit that generates associated information according to a type of landmark based on the landmark information stored in the information storage unit; and a display unit that displays landmarks on a map based on the landmark information stored in the information storage unit, and displays the associated information in association with the landmarks.
  • a recognition unit that recognizes a predetermined marker;
  • (3) The information processing device described in (2), further comprising a display unit configured to display the predetermined marker on a map based on information of the map coordinate system of the predetermined marker recognized by the recognition unit, wherein when a registration instruction is given for a specific marker displayed on the map, the landmark registration unit writes information of the map coordinate system of the specific marker to the information storage unit as information of the landmark.
  • (8) The information processing device according to (7), wherein the predetermined marker is made of a retroreflective material, and the recognition unit extracts a 2D point cloud corresponding to the predetermined marker from the 2D point cloud obtained by the 2D LiDAR in accordance with a level of reflected light.
  • (9) The information processing device according to any one of (1) to (8), wherein an AR marker is used as the landmark.
  • (10) The information processing device according to (9), wherein the associated information is information on an area suitable for calculating the position and orientation of the information processing device.
  • (11) The information processing device, wherein a retroreflective marker is used as the landmark.
  • (12) The information processing device according to (11), wherein the associated information is information on an area in which three or more of the landmarks can be recognized.
  • (13) The information processing device according to (11) or (12), wherein the associated information is information on accuracy of a self-location calculated at a position designated by a user.
  • (14) The information processing device described in any one of (11) to (13), wherein the associated information is information on a location where an additional landmark should be placed to improve accuracy of a self-location calculated at a location specified by a user.
  • (15) The information processing device according to any one of (11) to (14), wherein the associated information is warning information in the case where there is a possibility of malfunction due to the arrangement of the landmarks.
  • (16) The information processing device according to any one of (11) to (15), wherein the associated information is information on a recommended arrangement of the landmarks in a case where there is a possibility of malfunction due to the arrangement of the landmarks.
  • (17) The information processing device, wherein the associated information is information on an area in which there is a possibility of erroneous distance measurement of the landmark on a route set on a map.
  • (18) The information processing device, wherein the associated information is warning information when a location on a route set on a map has a curvature greater than or equal to a set value within an area where three or more of the landmarks can be recognized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The objective of the present invention is to enable satisfactory self-position calculation to be performed on the basis of landmarks. The information processing device is provided with an information storage unit for storing information relating to landmarks serving as indicators for self-position calculation. An information generation unit generates associated information corresponding to the types of landmarks on the basis of the landmark information stored in the information storage unit. A display unit displays the landmarks on a map on the basis of the landmark information stored in the information storage unit, and displays the associated information in association with each landmark. By referring to the display of the associated information, a user can efficiently set or modify the route of a mobile body such as a robot, or the arrangement positions of the landmarks, which makes it simple to satisfactorily perform self-position calculation using the landmark information.
PCT/JP2023/037819 2022-11-08 2023-10-19 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2024101104A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-178734 2022-11-08
JP2022178734 2022-11-08

Publications (1)

Publication Number Publication Date
WO2024101104A1 true WO2024101104A1 (fr) 2024-05-16

Family

ID=91032780

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/037819 WO2024101104A1 (fr) 2022-11-08 2023-10-19 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2024101104A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006234453A (ja) * 2005-02-23 2006-09-07 Mitsubishi Heavy Ind Ltd 自己位置標定用ランドマーク位置の登録方法
JP2019534486A (ja) * 2017-04-21 2019-11-28 エックス デベロップメント エルエルシー ネガティブマッピングを用いる位置測定
US20200249032A1 (en) * 2018-01-15 2020-08-06 Sk Telecom Co., Ltd. Apparatus and method for updating high definition map for autonomous driving
JP2021038939A (ja) * 2019-08-30 2021-03-11 株式会社豊田中央研究所 キャリブレーション装置

Similar Documents

Publication Publication Date Title
US11531354B2 (en) Image processing apparatus and image processing method
US10982968B2 (en) Sensor fusion methods for augmented reality navigation
JP2019045892A (ja) 情報処理装置、情報処理方法、プログラム、及び、移動体
JP7143857B2 (ja) 情報処理装置、情報処理方法、プログラム、及び、移動体
WO2019181284A1 (fr) Dispositif de traitement d'informations, dispositif de mouvement, procédé et programme
US20240054793A1 (en) Information processing device, information processing method, and program
US20220253065A1 (en) Information processing apparatus, information processing method, and information processing program
US20230230368A1 (en) Information processing apparatus, information processing method, and program
CN113841100A (zh) 自主行驶控制设备、自主行驶控制系统和自主行驶控制方法
US20240069564A1 (en) Information processing device, information processing method, program, and mobile apparatus
WO2019111549A1 (fr) Corps mobile, système, programme et procédé de positionnement
WO2023153083A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et dispositif de déplacement
WO2019097884A1 (fr) Dispositif de traitement d'informations, procédé et dispositif de gestion, et programme
WO2024101104A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US20230251846A1 (en) Information processing apparatus, information processing method, information processing system, and program
WO2022024602A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2023062484A (ja) 情報処理装置、情報処理方法及び情報処理プログラム
JP2022098397A (ja) 情報処理装置、および情報処理方法、並びにプログラム
WO2020090250A1 (fr) Appareil de traitement d'image, procédé de traitement d'image et programme
WO2023149089A1 (fr) Dispositif d'apprentissage, procédé d'apprentissage, et programme d'apprentissage
WO2023162497A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et programme de traitement d'image
WO2023063145A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
WO2022024569A1 (fr) Dispositif et procédé de traitement d'informations, et programme
WO2023145460A1 (fr) Système de détection de vibration et procédé de détection de vibration
WO2024024471A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système de traitement d'informations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23888455

Country of ref document: EP

Kind code of ref document: A1