WO2023228283A1 - Information processing system, movable body, information processing method, and program - Google Patents


Info

Publication number
WO2023228283A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
self
coordinate system
reference coordinate
sensor
Application number
PCT/JP2022/021272
Other languages
French (fr)
Japanese (ja)
Inventor
成史 大畑
錦先 項
Original Assignee
株式会社センシンロボティクス (SENSYN ROBOTICS, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 株式会社センシンロボティクス (SENSYN ROBOTICS, Inc.)
Priority to PCT/JP2022/021272
Publication of WO2023228283A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 1/00 - Measuring arrangements giving results other than momentary value of variable, of general application
    • G01D 1/10 - Measuring arrangements giving results other than momentary value of variable, of general application, giving differentiated values
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 - Determining position
    • G01S 19/48 - Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Definitions

  • the present invention relates to an information processing system, a mobile object, an information processing method, and a program.
  • Mobile bodies such as flying objects (for example, drones and unmanned aerial vehicles (UAVs)) and traveling objects (for example, unmanned ground vehicles (UGVs)) are beginning to be used in industry.
  • Patent Document 1 discloses a system in which a flying object sequentially photographs an object at a plurality of preset waypoints.
  • The system of Patent Document 1 uses GNSS (Global Navigation Satellite System) outdoors for self-position estimation and creates a movement route for the mobile body based on latitude and longitude information. When moving indoors, where GNSS is difficult to use, a similar method cannot be used to create a movement route for the mobile body.
  • Indoors, the movement of a mobile body is controlled manually, using a technique such as Visual SLAM (Simultaneous Localization and Mapping).
  • One possible method is to acquire three-dimensional indoor information in advance based on sensor information from a sensor mounted on the mobile body, and then let the user set a movement route based on that information.
  • However, the methods for creating outdoor movement routes and the methods for creating indoor movement routes differ, and the creation of flight routes that span the inside and outside of a structure has not been sufficiently studied.
  • The present invention was made in view of this background, and an object thereof is to provide an information processing system and the like capable of estimating the self-position of a mobile body along movement routes that span the inside and outside of a structure.
  • The main invention of the present invention for solving the above problems is an information processing system for estimating the self-position of a mobile body that includes a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information, the system comprising: a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates; and a self-position estimation unit that estimates third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information.
  • FIG. 1 is a diagram showing the configuration of an information processing system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing the hardware configuration of the management server in FIG. 1.
  • FIG. 3 is a block diagram showing the hardware configuration of the user terminal in FIG. 1.
  • FIG. 4 is a block diagram showing the hardware configuration of the mobile body in FIG. 1.
  • FIG. 5 is a block diagram showing the functions of each component in FIG. 1.
  • FIG. 6 is a diagram illustrating coordinate transformation according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating coordinate transformation according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating coordinate transformation according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating environmental information according to an embodiment of the present invention.
  • FIG. 10 is a diagram illustrating reference environmental information according to an embodiment of the present invention.
  • FIG. 11 is a diagram illustrating a comparison between environmental information and reference environmental information according to an embodiment of the present invention.
  • FIG. 12 is a diagram illustrating self-position estimation according to an embodiment of the present invention.
  • FIG. 13 is a diagram illustrating self-position estimation according to an embodiment of the present invention.
  • FIG. 14 is a diagram illustrating self-position estimation according to an embodiment of the present invention.
  • FIG. 15 is a diagram illustrating an example of estimating third self-position information according to an embodiment of the present invention.
  • FIG. 16 is a flowchart of a travel route generation method according to an embodiment of the present invention.
  • An information processing system and the like according to an embodiment of the present invention have the following configurations.
  • [Item 1] An information processing system for estimating the self-position of a mobile body comprising a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information, the system comprising: a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates; and a self-position estimation unit that estimates third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information.
  • [Item 2] The information processing system according to item 1, wherein the self-position estimation unit includes a state estimation filter that receives the first self-position information and the second self-position information as input and outputs the third self-position information.
  • [Item 4] The information processing system according to any one of items 1 to 3, wherein the receiver is a GPS receiver.
  • [Item 5] The information processing system according to any one of items 1 to 4, wherein the sensor is a LiDAR sensor.
  • [Item 6] The information processing system according to any one of items 1 to 4, wherein the sensor is a visual sensor.
  • [Item 7] The information processing system according to any one of items 1 to 6, further comprising a movement control unit that controls the movement of the mobile body by comparing the third self-position information expressed in the reference coordinate system with movement route information expressed in the reference coordinate system.
  • [Item 8] The information processing system according to item 7, further comprising a movement route information correction unit that corrects the movement route information when the sensor detects an obstacle on the movement route of the mobile body.
  • [Item 9] A mobile body that estimates its self-position, comprising a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information, the mobile body comprising: a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates; and a self-position estimation unit that estimates third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information.
  • An information processing method for estimating the self-position of a mobile body comprising a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information, the method comprising: a step in which a reference coordinate conversion unit converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates; and a step in which a self-position estimation unit estimates third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information.
  • A program that causes a computer to execute an information processing method for estimating the self-position of a mobile body comprising a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information, the program causing the computer to execute: a step of using a reference coordinate conversion unit to convert both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates; and a step of using a self-position estimation unit to estimate third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information.
  • The information processing system includes a management server 1, one or more user terminals 2, one or more mobile bodies 4 (for example, flying objects, traveling objects, etc.), and one or more mobile body storage devices 5.
  • the management server 1, the user terminal 2, the mobile object 4, and the mobile object storage device 5 are connected to each other via a network so that they can communicate with each other.
  • The illustrated configuration is an example and is not limiting. For example, a configuration without the mobile body storage device 5, in which the mobile body is carried by the user, may be used.
  • FIG. 2 is a diagram showing the hardware configuration of the management server 1. Note that the illustrated configuration is an example, and other configurations may be used.
  • a management server 1 is connected to a user terminal 2, a mobile object 4, and a mobile object storage device 5, and constitutes a part of this system.
  • the management server 1 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
  • the management server 1 includes at least a processor 10, a memory 11, a storage 12, a transmitting/receiving section 13, an input/output section 14, etc., which are electrically connected to each other via a bus 15.
  • the processor 10 is an arithmetic device that controls the overall operation of the management server 1, controls the transmission and reception of data between each element, and performs information processing necessary for application execution and authentication processing.
  • The processor 10 is a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), and performs various information processing by executing programs for this system that are stored in the storage 12 and loaded into the memory 11.
  • The memory 11 includes a main memory configured with a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary memory configured with a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive).
  • the memory 11 is used as a work area for the processor 10, and also stores a BIOS (Basic Input/Output System) executed when the management server 1 is started, various setting information, and the like.
  • the storage 12 stores various programs such as application programs.
  • a database storing data used for each process may be constructed in the storage 12.
  • the transmitting/receiving unit 13 connects the management server 1 to the network.
  • The transmitting/receiving unit 13 may also include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • The input/output unit 14 includes information input devices such as a keyboard and mouse, and output devices such as a display.
  • the bus 15 is commonly connected to each of the above elements and transmits, for example, address signals, data signals, and various control signals.
  • the user terminal 2 shown in FIG. 3 also includes a processor 20, a memory 21, a storage 22, a transmitting/receiving section 23, an input/output section 24, etc., which are electrically connected to each other through a bus 25. Since the functions of each element can be configured in the same manner as the management server 1 described above, a detailed explanation of each element will be omitted.
  • the user terminal 2 is, for example, an information processing device such as a personal computer or a tablet terminal, but may also be configured by a smartphone, a mobile phone, a PDA, or the like.
  • The input/output unit 24 is composed of a display, a keyboard, and a mouse when the user terminal 2 is a personal computer, and of a touch panel or the like when it is a smartphone or tablet terminal.
  • The mobile body 4 is a known mobile body, including flying objects such as drones and unmanned aerial vehicles and traveling objects such as unmanned ground vehicles, and is in particular a mobile body capable of autonomous control.
  • FIG. 4 is a block diagram showing the hardware configuration of the flying object 4.
  • Flight controller 41 may include one or more processors, such as a programmable processor (e.g., a central processing unit (CPU)).
  • the flight controller 41 has a memory 411 and can access the memory.
  • Memory 411 stores logic, code, and/or program instructions executable by the flight controller to perform one or more steps.
  • The flight controller 41 may include sensors 412 such as an inertial sensor (acceleration sensor, gyro sensor), a GPS sensor, and a proximity sensor (e.g., LiDAR).
  • The memory 411 may include, for example, a removable medium or an external storage device such as an SD card or random access memory (RAM). Data acquired from the cameras/sensors 42 may be communicated directly to and stored in the memory 411. For example, still image/video data captured by a camera or the like may be recorded in the built-in memory or an external memory, but is not limited to this; it may be recorded in either the user terminal 2 or the mobile body storage device 5.
  • the camera 42 may be installed on the flying object 4 via a gimbal 43.
  • Flight controller 41 includes a control module (not shown) configured to control the state of the aircraft.
  • The control module may be configured to adjust the spatial position, velocity, and/or acceleration of the air vehicle with six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
  • To do so, the control module controls the propulsion mechanism (motors 45, etc.) of the aircraft via an ESC 44 (Electronic Speed Controller).
  • a propeller 46 is rotated by a motor 45 supplied with power from a battery 48, thereby generating lift of the flying object.
  • the control module can control one or more of the states of the mounting section and sensors.
  • The flight controller 41 can also communicate with a transceiver 47 configured to transmit data to and/or receive data from one or more external devices (for example, a transceiver 49, the management server 1, the user terminal 2, a display device, or another remote controller).
  • Transceiver 49 may use any suitable communication means, such as wired or wireless communication.
  • The flight controller 41 not only controls the state of the flying object as described above, but may also realize various functions related to data processing by executing application programs in response to instructions from external devices (particularly the user terminal 2); for example, it may execute functions corresponding to the movement route generation unit 430 and the movement instruction unit 440 described later.
  • The flight controller 41 may thus serve both the mobile body state control function and the data processing function, but a separate processor (control unit) dedicated to the data processing function may be provided instead.
  • The transceiver 47 can use, for example, one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • The transceiver 47 can transmit and/or receive one or more of: data acquired by the cameras/sensors 42, processing results generated by the flight controller 41, predetermined control data, and user commands from a terminal or remote controller.
  • The cameras/sensors 42 include an inertial sensor (acceleration sensor, gyro sensor), a receiver that acquires received information from a satellite positioning system (for example, an RTK-GPS sensor), and a sensor that acquires environmental information (for example, a proximity sensor such as LiDAR (Light Detection and Ranging), or a visual/image sensor including a camera).
  • FIG. 5 is a block diagram illustrating functions implemented in the mobile object 4.
  • As described above, the mobile body 4 estimates its own position, and includes a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information. It has various functional units for converting both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates, and for estimating third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information. Note that some or all of these functional units may be realized by an information processing device (processor, control unit) installed in at least one of the management server 1 and the user terminal 2.
  • the moving body 4 includes a reference coordinate conversion section 410, a self-position estimation section 420, a movement route generation section 430, a movement instruction section 440, a movement path correction section 450, and a storage section 470.
  • the storage unit 470 includes various databases such as a movement information storage unit 471 and a movement route information storage unit 472.
  • The reference coordinate conversion unit 410 converts both a first coordinate system (for example, a latitude/longitude/height (LLA) coordinate system) representing the received information from the satellite positioning system acquired by the receiver mounted on the mobile body 4, and a second coordinate system (for example, a point cloud map coordinate system) representing the environmental information acquired by the sensor mounted on the mobile body 4, into a reference coordinate system, using, for example, pre-stored transformation information between the coordinate systems (for example, transformation information T1 and T2 described later).
  • the point cloud map coordinate system and the reference coordinate system exist independently.
  • P1 to P3 in the figure represent point clouds (for example, point clouds indicating objects such as buildings) included in the point cloud map.
  • the reference coordinate system is expressed as a three-dimensional coordinate system (XYZ coordinate system) with the origin O at an arbitrary position.
  • The latitude/longitude/height coordinate system of RTK-GPS is expressed in the reference coordinate system based on transformation information T2 (not shown) between it and the reference coordinate system.
  • The point cloud map coordinate system is expressed as a three-dimensional coordinate system (X'Y'Z' coordinate system) whose origin O' is set at a predetermined position (for example, the position where the sensor is powered on, where a reset process is performed, or where the mobile body 4 starts moving).
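The conversions above amount to applying rigid transforms between coordinate systems. A minimal sketch in NumPy, where the transform values, the function names, and the restriction to a yaw-only rotation are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

def make_transform(yaw_deg, translation):
    """Build a 4x4 homogeneous transform (rotation about Z plus translation)."""
    th = np.radians(yaw_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(th), -np.sin(th)],
                 [np.sin(th),  np.cos(th)]]
    T[:3, 3] = translation
    return T

def to_reference(T, p):
    """Convert a point expressed in a source frame into the reference frame."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical T1: the point cloud map frame (origin O') is rotated 90 degrees
# and offset by (10, 5, 0) relative to the reference frame (origin O).
T1 = make_transform(90.0, [10.0, 5.0, 0.0])
p_map = np.array([1.0, 0.0, 0.0])   # a point in X'Y'Z' coordinates
p_ref = to_reference(T1, p_map)     # the same point in XYZ coordinates
```

A transform T2 from the RTK-GPS side would be applied the same way, so that both position sources end up expressed in the one XYZ frame.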
  • Since the position of the mobile body 4 at a given time is uniquely determined, the pieces of position information obtained at the same time from the receiver and from the sensor are "information indicating the same location." The two pieces of position information are therefore associated with the same time as a pair and stored in, for example, the movement information storage unit 471.
  • In FIG. 7, an example is shown in which position information of points whose positions are particularly easy to associate with each other (for example, corners of the movement route of the mobile body) is stored in pairs.
  • The transformation information T1 is calculated based on the relationship between the origins in the positional relationship that minimizes the sum of the distances between the stored pairs of position information. This makes it possible to integrate the position information expressed in the first coordinate system (for example, the latitude/longitude/height coordinate system) and the position information expressed in the second coordinate system (for example, the point cloud map coordinate system) into the reference coordinate system. However, the method is not limited to this; any method may be used as long as the position information expressed in the first coordinate system and the position information expressed in the second coordinate system can be integrated into the reference coordinate system.
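Finding a rigid transform that minimizes the sum of squared distances between stored pairs of position information is a standard least-squares alignment problem; one well-known closed-form solution is the SVD-based Kabsch method. The sketch below assumes that method and uses made-up paired points:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Find R, t minimizing sum ||R @ src_i + t - dst_i||^2 (Kabsch method)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # cross-covariance of the pairs
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical paired observations: route-corner positions in the second
# (point cloud map) frame, and the same corners already expressed in the
# reference frame via the first (RTK-GPS) coordinate system.
rng = np.random.default_rng(0)
pts_map = rng.uniform(-5, 5, size=(6, 3))
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([10., 5., 0.])
pts_ref = pts_map @ R_true.T + t_true

R, t = fit_rigid_transform(pts_map, pts_ref)
# R, t recover the transform that maps map coordinates into the reference frame
```

With noise-free pairs the recovered transform is exact; with noisy pairs it is the least-squares optimum, which matches the "minimize the sum of distances between pairs" criterion described above.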
  • The self-position estimation unit 420 estimates third self-position information expressed in the reference coordinate system, based on the first self-position information indicated by the received information converted into the reference coordinate system and the second self-position information calculated by comparing the environmental information with the reference environmental information.
  • the first self-location information may be, for example, location information obtained by RTK-GPS.
  • The second self-position information is, for example, position information determined by comparing environmental information acquired by a LiDAR sensor or a visual sensor (including a camera) (for example, the three-dimensional point cloud map information acquired by LiDAR in FIG. 9) with reference environmental information acquired in advance by the same kind of sensor (for example, the reference three-dimensional point cloud map information acquired in advance by LiDAR in FIG. 10). By this comparison, it is determined from which observation position the environmental information was acquired, and that observation position is taken as the second self-position information.
  • When the environmental information and the reference environmental information are expressed as point cloud maps, the shapes of the two are compared, like fitting puzzle pieces, using known techniques such as NDT registration or NDT scan matching (for example, FIG. 11 shows the result of superimposing the point cloud map information on the reference point cloud map information), and the second self-position information can be estimated by finding the coordinate transformation that maximizes the degree of agreement between the two.
  • The self-position estimation unit 420 may include a state estimation filter, such as a Kalman filter or a particle filter, that estimates the third self-position information by taking as input the first self-position information converted to the reference coordinate system (for example, self-position information based on RTK-GPS) and the second self-position information (for example, self-position information based on a LiDAR sensor).
  • The Kalman filter has the function of integrating multiple observed values and estimating a plausible state quantity. Since both pieces of self-position information expressed in the same reference coordinate system can be used as its inputs, it becomes possible to estimate the third self-position information.
  • When the first self-position information or the second self-position information does not have sufficient sensitivity (accuracy) for estimating the third self-position information, the self-position information for which sufficient sensitivity cannot be obtained may be excluded from the input. This applies, for example, when the sensitivity of the first self-position information falls below a predetermined value depending on the communication status of the RTK-GPS and fails to satisfy the standard (for example, FIG. 14), or when, in a configuration where reference environmental information is used for the second self-position information, the mobile body is not at a location for which sufficient reference environmental information for comparison has been prepared and the score of NDT registration or the like fails to satisfy the standard (for example, FIG. 12 shows that there is little reference environmental information around the mobile body 4 and the score indicating the degree of mismatch is high).
  • When the first self-position information or the second self-position information does not have sufficient sensitivity for estimating the third self-position information, the self-position estimation unit 420 estimates the third self-position information by adopting, as above, the self-position information that shows sufficient sensitivity (for example, the first self-position information in FIG. 12 and the second self-position information in FIG. 14). As shown in FIG. 13, when both pieces of self-position information show sufficient sensitivity, the self-position information to be adopted preferentially may, for example, be set in advance by user operation, or the third self-position information may be estimated by weighting, based on comparison between one or more preset reference sensitivities and each sensitivity, so that self-position information with higher sensitivity is treated as more reliable. For example, if both are comparable, the first self-position information and the second self-position information may be weighted equally and the center position between them estimated as the third self-position information; if one is shown to be more reliable, a position shifted at a corresponding ratio toward the more sensitive self-position information may be estimated as the third self-position information.
  • Furthermore, when neither piece of self-position information shows sufficient sensitivity, an emergency stop function may be activated by transmitting a signal for stopping the movement of the mobile body 4 to the movement instruction unit 440 described later.
  • In this way, by using the first self-position information and the second self-position information converted into the same reference coordinate system, it becomes possible to estimate highly accurate third self-position information from a plurality of pieces of observation information that are obtained from different configurations and originally expressed in different coordinate systems.
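The adopt-or-weight logic described above can be sketched as inverse-variance fusion, which is also what a Kalman filter's measurement update reduces to for two position observations in the same frame. The variances standing in for "sensitivity", the threshold, and the example positions are illustrative assumptions:

```python
import numpy as np

def fuse(p1, var1, p2, var2, max_var=4.0):
    """Fuse two position estimates expressed in the reference frame.

    var1/var2 play the role of 'sensitivity': a large variance means the
    source (RTK-GPS link quality, NDT matching score) failed its standard.
    Returns the fused position, or None to signal an emergency stop.
    """
    ok1, ok2 = var1 <= max_var, var2 <= max_var
    if ok1 and ok2:
        w1 = var2 / (var1 + var2)        # inverse-variance weights
        return w1 * p1 + (1 - w1) * p2   # shifted toward the better source
    if ok1:
        return p1                        # adopt the GNSS-side estimate only
    if ok2:
        return p2                        # adopt the LiDAR-side estimate only
    return None                          # neither is usable: stop moving

p_gps = np.array([10.0, 5.0, 2.0])       # first self-position (RTK-GPS)
p_lidar = np.array([10.2, 5.2, 2.0])     # second self-position (LiDAR)

mid = fuse(p_gps, 1.0, p_lidar, 1.0)     # equal weights -> center position
lean = fuse(p_gps, 0.5, p_lidar, 2.0)    # leans toward the GNSS estimate
```

Equal variances yield the center position between the two estimates; unequal variances shift the result toward the more sensitive source, matching the weighting behavior described in the embodiment.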
  • The movement route generation unit 430 sets one or more pieces of waypoint information, either sequentially from the start point to the end point or as arbitrary points in an arbitrary order, through a user's selection operation on three-dimensional model data (for example, reference three-dimensional point cloud map information as shown in FIG. 10) displayed on the user terminal 2, generates movement route information by a known method based on the waypoint information, and stores and manages it in the movement route information storage unit 472. Alternatively, the three-dimensional model data may be analyzed to calculate a movement route in which waypoint information is set so that information on specific or all components inside and outside the structure (for example, internal walls, columns, ceilings, windows, doors, stairs, and external components) can be obtained, and this may be stored and managed in the movement route information storage unit 472 as movement route information.
  • The movement route may be generated, for example, so that the position of the mobile body storage device 5 is set as the movement start position and the movement end position and each waypoint is passed through in between; alternatively, the position to which the mobile body is carried by the user may be set as the movement start position, or the user may collect the mobile body at the movement end position.
  • A movement route may also be generated that includes the position of the mobile body storage device 5 selected as the movement start position or the movement end position from among the mobile body storage devices 5 managed in the storage unit.
  • The three-dimensional model data may be reference environment information such as the reference three-dimensional point cloud map information described above, but is not limited to this; for example, it may be data created with CAD (Computer-Aided Design) software.
  • It may also be three-dimensional model data reconstructed from BIM (Building Information Modeling) data, CIM (Construction Information Modeling) data, CAD data, or the like; three-dimensional model data obtained by giving a predetermined height to structures based on two-dimensional blueprint data; or three-dimensional city model data in a format such as CityGML (City Geography Markup Language), CityJSON, or GeoTIFF, including three-dimensional city model data stored in a three-dimensional city model database external to this system. Note that the reconstruction of such three-dimensional model data may be executed in the processor of the management server 1 or the user terminal 2, or may be executed outside the management server 1 or the user terminal 2 and the result acquired by them.
  • The reference three-dimensional point cloud map information may be information acquired in advance by a sensor such as LiDAR as described above; alternatively, for example, the processor of the management server 1 or of the user terminal 2 may use three-dimensional point cloud model data obtained by converting the model surfaces inside and outside the structure of the above-described three-dimensional model data into a point cloud.
  • As a method of generating the three-dimensional point cloud model data, for example, a virtual moving body 4 equipped with a virtual sensor (for example, a virtual LiDAR) may be moved inside or outside the structure of the three-dimensional model data to generate three-dimensional point cloud model data regarding components inside or outside the structure.
  • As a result, it is theoretically possible to generate point cloud data close to the point cloud sensing data that would be obtained if the inside or outside of the structure were actually measured using the sensor of the moving body 4.
  • Other methods of generating the three-dimensional point cloud model data include converting the three-dimensional model data into a point cloud evenly at predetermined intervals or, if the three-dimensional model data is polygon data, arranging a point at each vertex to form a point cloud, or forming a point cloud using a known point cloud technology (technology for conversion to point cloud data).
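  Two of the conversion approaches just mentioned can be sketched, for illustration only, as follows; the data structures (triangles as vertex tuples) and the barycentric-grid sampling scheme are assumptions and not the disclosed method:

```python
# (a) placing a point at each polygon vertex, and
# (b) sampling a triangle's surface evenly using a barycentric grid,
# as illustrative ways of converting model data into a point cloud.

def vertices_to_point_cloud(polygons):
    """(a) One point per unique polygon vertex."""
    return sorted({v for poly in polygons for v in poly})

def sample_triangle(a, b, c, n):
    """(b) Evenly spaced barycentric samples on triangle abc (n subdivisions)."""
    pts = []
    for i in range(n + 1):
        for j in range(n + 1 - i):
            k = n - i - j
            pts.append(tuple((i * a[d] + j * b[d] + k * c[d]) / n
                             for d in range(3)))
    return pts

tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
cloud = sample_triangle(*tri, 4)   # (n+1)(n+2)/2 = 15 points for n = 4
```

  Applying such sampling over every model surface yields point cloud model data comparable to actual point cloud sensing data.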
  • the generated three-dimensional point cloud model data is stored in the storage unit 470, the management server 1, or the user terminal 2.
  • The movement instruction unit 440 refers to the movement route information stored in the movement route information storage unit 472 and transmits, to the moving body 4, information instructing the movement of the moving body 4 in accordance with the coordinates indicated by the movement route information and the third self-position information estimated as described above.
  • The first self-location information is particularly sensitive outside a structure (such as a building) and does not require prior acquisition of reference information, whereas the second self-location information is sensitive even inside a structure, although reference information must be obtained in advance.
  • Therefore, even inside a structure, or outside a structure for which no reference environment information exists, the information indicated by the movement route information can be compared with the third self-position information, so the movement instructions for the mobile body 4 can be controlled seamlessly even for a movement route that spans the inside and outside of the structure.
  • the movement route information may be generated by the above-mentioned movement route generation unit 430, or may be generated and stored by an external system.
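  A minimal sketch of the comparison performed by the movement instruction unit 440, assuming the route waypoints and the third self-position are already expressed in the same reference coordinate system (the arrival threshold and index-advancing logic are illustrative assumptions):

```python
import math

# Illustrative waypoint-following check: compare the third self-position
# (reference frame) against the current waypoint of the movement route and
# advance to the next waypoint when within an assumed arrival threshold.

ARRIVAL_THRESHOLD = 0.5  # meters; illustrative value

def next_waypoint_index(position, waypoints, current_index):
    """Return the index of the waypoint the mobile body should head for."""
    if current_index < len(waypoints) and \
            math.dist(position, waypoints[current_index]) < ARRIVAL_THRESHOLD:
        return current_index + 1        # waypoint reached; advance
    return current_index

route = [(0.0, 0.0, 5.0), (10.0, 0.0, 5.0), (10.0, 10.0, 5.0)]
idx = next_waypoint_index((0.1, 0.0, 5.0), route, 0)   # near WP0, so advance
```

  Because both inputs share one reference coordinate system, the same loop serves routes that cross between the inside and outside of a structure.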
  • the moving body 4 may further include a moving route correction section 450.
  • When a sensor for acquiring environmental information confirms that an obstacle exists near the movement route (within a predetermined distance range of the movement route, including on the movement route itself), the movement route correction unit 450 corrects the movement route information referenced by the movement instruction unit 440.
  • Since the movement route can be expressed in the reference coordinate system across the inside and outside of the structure, the correction range of the movement route is not limited to the inside of the structure; for example, when obstacles exist around the entrance or exit of the structure, movement routes that span the inside and outside of the structure can also be targeted for correction.
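  One simplified way the route correction could work is sketched below; the 2D geometry, the clearance value, and the sideways-detour strategy are assumptions for illustration, not the disclosed implementation:

```python
import math

# Illustrative route correction: if an obstacle (a point with a radius) lies
# within an assumed clearance of the straight segment between two waypoints,
# insert a detour waypoint offset perpendicular to that segment.

CLEARANCE = 1.0  # meters; illustrative

def seg_point_dist(a, b, p):
    """Distance from point p to segment ab (2D)."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * abx), py - (ay + t * aby))

def correct_route(route, obstacle, radius):
    """Insert a detour waypoint on each segment that passes too close."""
    corrected = [route[0]]
    for a, b in zip(route, route[1:]):
        if seg_point_dist(a, b, obstacle) < radius + CLEARANCE:
            mx, my = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
            dx, dy = b[0] - a[0], b[1] - a[1]
            norm = math.hypot(dx, dy)
            offset = radius + CLEARANCE
            # Shift the midpoint perpendicular to the segment (illustrative).
            corrected.append((mx - dy / norm * offset, my + dx / norm * offset))
        corrected.append(b)
    return corrected

route = [(0.0, 0.0), (10.0, 0.0)]
new_route = correct_route(route, obstacle=(5.0, 0.0), radius=0.5)
```

  Because the route is held in the shared reference frame, the same correction applies whether the affected segment lies inside the structure, outside it, or crosses an entrance.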
  • The movement information storage unit 471 stores parameters used when the movement route generation unit 430 generates a movement route and when the movement instruction unit 440 instructs the autonomously controlled moving body 4 to move along the movement route, as well as information acquired during movement along the movement route. Specific examples of the parameters include moving speed, flight altitude (if the moving object 4 is a flying object), overlap rate of captured images, and information acquired during movement (for example, image information, video information, environmental information, and the like).
  • the movement route information storage unit 472 stores coordinate information (so-called waypoint information) on the movement route generated by the movement route generation unit 430. Note that, as described above, travel route information generated by the management server 1, the processor of the user terminal 2, or an external system may be stored.
  • FIG. 5 is a block diagram also illustrating functions implemented in the user terminal 2. Note that some or all of the various functional units may be realized by an information processing device (processor, control unit) installed in at least one of the management server 1 and the mobile body 4.
  • the user terminal 2 includes a communication section 210, a screen information generation section 220, and a storage section 270.
  • the communication unit 210 communicates with the management server 1, the mobile body 4, and the mobile body storage device 5.
  • the communication unit 210 also functions as a reception unit that receives various requests, data, etc. from the management server 1, the mobile unit 4, and the mobile unit storage device 5.
  • The screen information generation unit 220 generates screen information to be displayed via the user interface of the user terminal 2. For example, it generates screen information for configuring a user interface screen in which various images and text are arranged based on predetermined layout rules, and for displaying various information acquired by the mobile object 4 on the user interface screen.
  • FIG. 16 illustrates a flowchart of the information processing method according to this embodiment.
  • Although this flowchart exemplarily shows a configuration in which an application is started on the user terminal 2, the configuration is not limited to this.
  • For example, a configuration that includes an input/output device and allows various settings and the like may also be employed.
  • the user starts, for example, an application on the user terminal 2 that operates the mobile object 4 and displays acquired information (SQ101).
  • This application may be stored in the user terminal 2, for example, or may be software (so-called SaaS) provided from the management server 1, the mobile body 4, or another external server (not shown) connected via a network.
  • a login screen may be displayed as necessary, and a configuration may be adopted in which, for example, a login ID and password are requested.
  • Next, the user creates a new movement plan (SQ102). For example, the user sets the "plan name", "area name", "address", and the like, acquires and displays the three-dimensional model data of the movement target on the user terminal 2, and starts creating a new movement plan.
  • the user generates a travel route for the movement of the mobile object 4 (SQ103).
  • For example, one or more pieces of waypoint information (for example, expressed in the latitude, longitude, and height coordinate system on the user terminal 2) are set by the user's selection operation on the three-dimensional model data displayed on the user terminal 2.
  • The three-dimensional model data and the waypoint information are transmitted to the moving object 4, and the moving object 4 generates travel route information by a known method (for example, moving between the set waypoints in a straight line) based on the three-dimensional model data and the waypoint information.
  • the user instructs the moving body 4 to start moving (SQ104).
  • For example, with reference to the movement information storage unit 471 and the movement route information storage unit 472, movement of the mobile object 4 for the purpose of inspection, security, construction progress management, or the like is executed.
  • the first self-position information acquired by the receiver of the mobile object 4 and the second self-position information obtained as a result of comparing the environmental information obtained by the sensor and the reference environmental information are respectively set in the reference coordinate system.
  • the movement of the moving body 4 is controlled based on the third self-position information expressed in the reference coordinate system estimated based on the converted self-position information, the waypoint information of the movement route information, etc.
  • the user instructs the user terminal 2 to output the acquired information (SQ105).
  • The acquired information (still images, moving images, audio, and other information) obtained by the mobile object 4 on the movement route can be displayed, and a mark such as a symbol serving as a link for viewing the acquired information corresponding to position information may be attached at the positions associated with the position information of the acquired information (in particular, waypoints). Then, by selecting the link on the user terminal 2, the corresponding acquired information may be displayed.
  • the present invention can provide an information processing system and the like that can autonomously control the movement of the mobile object 4 while seamlessly and accurately estimating its own position even on a movement route that spans inside and outside of a structure.
  • In the above embodiment, acquisition of information inside and outside the structure by the moving object 4 was taken as a specific example, but the embodiment may also be an inspection of the structure, and the moving object may be equipped with devices, equipment, and the like used for inspecting the inner wall and/or outer wall of the structure for the presence or absence of a predetermined event. More specifically, any devices necessary to ascertain the condition of a structure to be inspected having an inner wall or an outer wall can be employed, such as imaging devices (visible light cameras, infrared cameras, metal detectors, ultrasonic measuring devices, etc.), keying devices, detection devices (metal detectors), sound collection devices, odor measuring devices, gas detection devices, air contamination measuring devices, and detection devices (devices for detecting cosmic rays, radiation, electromagnetic waves, etc.).
  • The embodiment may also be, for example, security or monitoring inside a structure, and the moving object may include devices, equipment, and the like used for security or monitoring. More specifically, any devices necessary to image a structure to be guarded and monitored and to detect abnormalities, intruders, and the like can be employed, such as imaging devices (visible light cameras, infrared cameras, night vision cameras, metal detectors, ultrasonic measuring instruments, etc.) and sensor devices (motion sensors, infrared sensors, etc.).
  • The mobile object of the present invention can be suitably used as a mobile object for photographing equipped with a camera or the like, and can also be used in various industries such as the security field, infrastructure monitoring, surveying, inspection of buildings and structures such as sports venues, factories, and warehouses, and disaster response.

Abstract

[Problem] The present invention provides an information processing system and the like which enable autonomous movement control of a movable body while seamlessly performing highly accurate self-position estimation in a movement path over the inside and the outside of a structure. [Solution] An information processing system according to the present invention is for self-position estimation of a movable body provided with a receiver for acquiring reception information from a satellite positioning system and with a sensor for acquiring environment information. The information processing system is provided with: a reference coordinate conversion unit for converting both a first coordinate system indicating the reception information and a second coordinate system indicating the environment information into a reference coordinate system based on base point coordinates; and a self-position estimation unit for estimating third self-position information represented by the reference coordinate system, on the basis of first self-position information indicated by the reception information and second self-position information that is calculated by comparing the environment information with reference environment information.

Description

情報処理システム及び移動体、情報処理方法、プログラムInformation processing system, mobile object, information processing method, program
 本発明は、情報処理システム及び移動体、情報処理方法、プログラムに関する。 The present invention relates to an information processing system, a mobile object, an information processing method, and a program.
 近年、ドローン(Drone)や無人航空機(UAV:Unmanned Aerial Vehicle)などの飛行体(以下、「飛行体」と総称する)や無人地上車両(UGV:Unmanned Ground Vehicle)などの走行体などの自律制御可能な移動体が産業に利用され始めている。こうした中で、特許文献1には、飛行体が予め設定された複数のウェイポイントにおいて撮影対象を順次撮影するシステムが開示されている。 In recent years, autonomously controllable mobile bodies such as aerial vehicles (hereinafter collectively referred to as "flying objects"), including drones and unmanned aerial vehicles (UAVs), and traveling bodies such as unmanned ground vehicles (UGVs) have begun to be used in industry. Under these circumstances, Patent Document 1 discloses a system in which a flying object sequentially photographs an object at a plurality of preset waypoints.
特開2014-089160号公報JP2014-089160A
 しかしながら、上記特許文献1の開示技術は、屋外においてGNSS(global navigation satellite system)を自己位置推定に使用し、緯度経度情報に基づいて移動体の移動経路を作成するものであり、屋内での移動体の移動経路においては同様の手法を用いることができない。 However, the technology disclosed in Patent Document 1 uses a GNSS (global navigation satellite system) outdoors for self-position estimation and creates a movement route for a mobile body based on latitude and longitude information; the same method cannot be used for the movement route of a mobile body indoors.
 また、屋内(例えば建物等の構造物内)での移動体の移動経路を生成する場合には、例えばVisual SLAM(Simultaneous Localization and Mapping)等の技術を用いて、手動で移動制御される移動体に搭載されたセンサのセンサ情報に基づき、屋内の3次元情報を予め取得し、これに基づいてユーザが移動経路の設定作業を行う方法が考えられる。しかしながら、屋外の移動経路の作成手法と屋内の移動経路の作成手法は別個のものであり、構造物の内外で跨る飛行経路を作成したいという要望については、十分に検討されているとはいえない。 In addition, when generating a movement route for a moving object indoors (for example, inside a structure such as a building), a technique such as Visual SLAM (Simultaneous Localization and Mapping) is used to manually control the movement of a moving object. One possible method is to obtain three-dimensional indoor information in advance based on sensor information from a sensor installed in the vehicle, and then allow the user to set a travel route based on this information. However, the methods for creating outdoor travel routes and the methods for creating indoor travel routes are different, and the desire to create flight routes that span inside and outside of structures has not been sufficiently studied. .
 本発明はこのような背景を鑑みてなされたものであり、構造物内外に跨る移動経路において自己推定可能な情報処理システム等を提供することを目的とする。 The present invention was made in view of this background, and an object thereof is to provide an information processing system etc. that can self-estimate movement routes that span inside and outside of a structure.
 上記課題を解決するための本発明の主たる発明は、衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理システムであって、前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換する基準座標変換部と、前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定する自己位置推定部と、を備える。 The main invention of the present invention for solving the above problems is an information processing system for estimating the self-position of a mobile object, which includes a receiver that acquires information received from a satellite positioning system, and a sensor that acquires environmental information. a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates; A self-positioning device that estimates third self-location information expressed in the reference coordinate system based on first self-location information and second self-location information calculated by comparing the environmental information and reference environmental information. A position estimation unit.
 本発明によれば、特に、造物内外に跨る移動経路において自己推定可能な情報処理システム等を提供することができる。 According to the present invention, it is possible to provide an information processing system and the like that can self-estimate travel routes that straddle the inside and outside of a structure.
本発明の実施の形態にかかる情報処理システムの構成を示す図である。1 is a diagram showing the configuration of an information processing system according to an embodiment of the present invention. 図1の管理サーバのハードウェア構成を示すブロック図である。FIG. 2 is a block diagram showing the hardware configuration of the management server in FIG. 1. FIG. 図1のユーザ端末のハードウェア構成を示すブロック図である。FIG. 2 is a block diagram showing the hardware configuration of the user terminal in FIG. 1. FIG. 図1の移動体のハードウェア構成を示すブロック図である。FIG. 2 is a block diagram showing the hardware configuration of the mobile body in FIG. 1. FIG. 図1の各構成の機能を示すブロック図である。2 is a block diagram showing the functions of each component in FIG. 1. FIG. 本発明の実施の形態にかかる座標変換を説明する図である。FIG. 3 is a diagram illustrating coordinate transformation according to an embodiment of the present invention. 本発明の実施の形態にかかる座標変換を説明する図である。FIG. 3 is a diagram illustrating coordinate transformation according to an embodiment of the present invention. 本発明の実施の形態にかかる座標変換を説明する図である。FIG. 3 is a diagram illustrating coordinate transformation according to an embodiment of the present invention. 本発明の実施の形態にかかる環境情報を説明する図である。FIG. 3 is a diagram illustrating environmental information according to an embodiment of the present invention. 本発明の実施の形態にかかる基準環境情報を説明する図である。FIG. 3 is a diagram illustrating reference environment information according to an embodiment of the present invention. 本発明の実施の形態にかかる環境情報と基準環境情報との比較を説明する図である。FIG. 3 is a diagram illustrating a comparison between environmental information and reference environmental information according to an embodiment of the present invention. 本発明の実施の形態にかかる自己位置推定を説明する図である。FIG. 3 is a diagram illustrating self-position estimation according to an embodiment of the present invention. 本発明の実施の形態にかかる自己位置推定を説明する図である。FIG. 3 is a diagram illustrating self-position estimation according to an embodiment of the present invention. 本発明の実施の形態にかかる自己位置推定を説明する図である。FIG. 3 is a diagram illustrating self-position estimation according to an embodiment of the present invention. 
本発明の実施の形態にかかる第3の自己位置情報の推定の一例を説明する図である。FIG. 7 is a diagram illustrating an example of estimating third self-location information according to an embodiment of the present invention. 本発明の実施の形態にかかる移動経路生成方法のフローチャートである。3 is a flowchart of a travel route generation method according to an embodiment of the present invention.
 本発明の実施形態の内容を列記して説明する。本発明の実施の形態による情報処理システム等は、以下のような構成を備える。
[項目1]
 衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理システムであって、
 前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換する基準座標変換部と、
 前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定する自己位置推定部と、
 を備える、
 ことを特徴とする情報処理システム。
[項目2]
 前記自己位置推定部は、前記第1の自己位置情報及び前記第2の自己位置情報が入力され、前記第3の自己位置情報を出力する状態推定フィルタを含む、
 ことを特徴とする項目1に記載の情報処理システム。
[項目3]
 前記自己位置推定部は、前記受信機の第1の感度と前記センサの第2の感度と各感度に対応する基準感度との比較結果に応じて、前記第1の自己位置情報と前記第2の自己位置情報の少なくともいずれかに基づき、前記基準座標系で表される第3の自己位置情報を推定する、
 ことを特徴とする項目1に記載の情報処理システム。
[項目4]
 前記受信機は、GPS受信機である、
 ことを特徴とする項目1ないし3のいずれかに記載の情報処理システム。
[項目5]
 前記センサは、LiDARセンサである、
 ことを特徴とする項目1ないし4のいずれかに記載の情報処理システム。
[項目6]
 前記センサは、Visualセンサである、
 ことを特徴とする項目1ないし4のいずれかに記載の情報処理システム。
[項目7]
 前記基準座標系で表される前記第3の自己位置情報と、前記基準座標系で表される移動経路情報とを比較して、前記移動体の移動を制御する移動制御部をさらに備える、
 ことを特徴とする項目1ないし6のいずれかに記載の情報処理システム。
[項目8]
 前記センサにより前記移動体の移動経路上に障害物を感知した場合、前記移動経路情報を修正する移動経路情報修正部をさらに備える、
 ことを特徴とする項目7に記載の情報処理システム。
[項目9]
 衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理システムであって、
 前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換する基準座標変換部と、
 前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定する自己位置推定部と、
 を備える、
 ことを特徴とする移動体。
[項目10]
 衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理方法であって、
 基準座標変換部により、前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換するステップと、
 自己位置推定部により、前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定するステップと、
 をコンピュータに実行させることを特徴とする情報処理方法。
[項目11]
 衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理方法をコンピュータに実行させるプログラムであって、
 基準座標変換部により、前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換するステップと、
 自己位置推定部により、前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定するステップと、
 を前記コンピュータに実行させることを特徴とするプログラム。
The contents of the embodiments of the present invention will be listed and explained. An information processing system etc. according to an embodiment of the present invention has the following configuration.
[Item 1]
An information processing system for estimating the self-position of a mobile object, comprising a receiver that acquires received information from a satellite positioning system, and a sensor that acquires environmental information,
a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates;
a self-position estimation unit that estimates third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information;
Equipped with
An information processing system characterized by:
[Item 2]
The self-position estimation unit includes a state estimation filter that receives the first self-position information and the second self-position information and outputs the third self-position information.
The information processing system according to item 1, characterized in that:
[Item 3]
The self-position estimation unit estimates the third self-position information expressed in the reference coordinate system based on at least one of the first self-position information and the second self-position information, according to a result of comparing a first sensitivity of the receiver and a second sensitivity of the sensor with a reference sensitivity corresponding to each sensitivity.
The information processing system according to item 1, characterized in that:
[Item 4]
the receiver is a GPS receiver;
The information processing system according to any one of items 1 to 3, characterized in that:
[Item 5]
the sensor is a LiDAR sensor;
The information processing system according to any one of items 1 to 4, characterized in that:
[Item 6]
The sensor is a visual sensor.
The information processing system according to any one of items 1 to 4, characterized in that:
[Item 7]
further comprising a movement control unit that controls the movement of the moving body by comparing the third self-position information expressed in the reference coordinate system and movement route information expressed in the reference coordinate system;
The information processing system according to any one of items 1 to 6, characterized in that:
[Item 8]
further comprising a movement route information correction unit that corrects the movement route information when the sensor detects an obstacle on the movement path of the moving body;
The information processing system according to item 7, characterized in that:
[Item 9]
An information processing system for estimating the self-position of a mobile object, comprising a receiver that acquires received information from a satellite positioning system, and a sensor that acquires environmental information,
a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates;
a self-position estimation unit that estimates third self-position information expressed in the reference coordinate system based on first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information;
Equipped with
A mobile object characterized by:
[Item 10]
An information processing method for estimating the self-position of a mobile object, comprising a receiver that acquires information received from a satellite positioning system, and a sensor that acquires environmental information, the method comprising:
using a reference coordinate conversion unit to convert both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates;
estimating, by a self-position estimation unit, third self-position information expressed in the reference coordinate system based on the first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information;
An information processing method characterized by causing a computer to execute.
[Item 11]
A program that causes a computer to execute an information processing method for estimating the self-position of a mobile object, which includes a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information, the program comprising:
using a reference coordinate conversion unit to convert both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates;
estimating, by a self-position estimation unit, third self-position information expressed in the reference coordinate system based on the first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information;
A program that causes the computer to execute.
<実施の形態の詳細>
 以下、本発明の実施の形態による情報処理システム等についての実施の形態を説明する。添付図面において、同一または類似の要素には同一または類似の参照符号及び名称が付され、各実施形態の説明において同一または類似の要素に関する重複する説明は省略することがある。また、各実施形態で示される特徴は、互いに矛盾しない限り他の実施形態にも適用可能である。
<Details of embodiment>
Embodiments of information processing systems and the like according to embodiments of the present invention will be described below. In the accompanying drawings, the same or similar elements are given the same or similar reference numerals and names, and redundant description of the same or similar elements may be omitted in the description of each embodiment. Furthermore, features shown in each embodiment can be applied to other embodiments as long as they do not contradict each other.
<構成>
 図1に示されるように、本実施の形態における情報処理システムは、管理サーバ1と、一以上のユーザ端末2と、一以上の移動体4(例えば、飛行体や走行体など)と、一以上の移動体格納装置5とを有している。管理サーバ1と、ユーザ端末2と、移動体4と、移動体格納装置5は、ネットワークを介して互いに通信可能に接続されている。なお、図示された構成は一例であり、これに限らず、例えば、移動体格納装置5を有さずに、ユーザにより持ち運びされる構成などでもよい。
<Configuration>
As shown in FIG. 1, the information processing system according to the present embodiment includes a management server 1, one or more user terminals 2, and one or more mobile objects 4 (for example, a flying object, a traveling object, etc.). It has the above mobile object storage device 5. The management server 1, the user terminal 2, the mobile object 4, and the mobile object storage device 5 are connected to each other via a network so that they can communicate with each other. Note that the illustrated configuration is an example, and is not limited to this. For example, a configuration that does not include the movable body storage device 5 and is carried by the user may be used.
<管理サーバ1>
 図2は、管理サーバ1のハードウェア構成を示す図である。なお、図示された構成は一例であり、これ以外の構成を有していてもよい。
<Management server 1>
FIG. 2 is a diagram showing the hardware configuration of the management server 1. Note that the illustrated configuration is an example, and other configurations may be used.
As illustrated, the management server 1 is connected to the user terminal 2, the mobile object 4, and the mobile object storage device 5, and constitutes a part of this system. The management server 1 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
The management server 1 includes at least a processor 10, a memory 11, a storage 12, a transmitting/receiving unit 13, and an input/output unit 14, which are electrically connected to each other via a bus 15.
The processor 10 is an arithmetic device that controls the overall operation of the management server 1, controls the transmission and reception of data between the elements, and performs the information processing required for application execution and authentication. For example, the processor 10 is a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), and carries out each information process by executing programs for this system that are stored in the storage 12 and loaded into the memory 11.
The memory 11 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory) and an auxiliary memory composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disc Drive). The memory 11 is used as a work area for the processor 10, and also stores the BIOS (Basic Input/Output System) executed when the management server 1 starts up, various setting information, and the like.
The storage 12 stores various programs such as application programs. A database storing the data used in each process may be built in the storage 12.
The transmitting/receiving unit 13 connects the management server 1 to the network. The transmitting/receiving unit 13 may also include short-range communication interfaces such as Bluetooth (registered trademark) and BLE (Bluetooth Low Energy).
The input/output unit 14 comprises information input devices such as a keyboard and mouse and output devices such as a display.
The bus 15 is commonly connected to the above elements and transmits, for example, address signals, data signals, and various control signals.
<User terminal 2>
The user terminal 2 shown in FIG. 3 likewise includes a processor 20, a memory 21, a storage 22, a transmitting/receiving unit 23, and an input/output unit 24, which are electrically connected to each other via a bus 25. Since each element can be configured in the same manner as in the management server 1 described above, a detailed description of each element is omitted.
The user terminal 2 is, for example, an information processing device such as a personal computer or a tablet terminal, but may also be a smartphone, a mobile phone, a PDA, or the like. In particular, the input/output unit 24 consists of a display, a keyboard, and a mouse when the user terminal 2 is a personal computer, and of a touch panel or the like when the user terminal 2 is a smartphone or tablet terminal.
<Mobile object 4>
The mobile object 4 is a known mobile object, including flying objects such as drones and unmanned aerial vehicles and traveling objects such as unmanned ground vehicles, and is in particular a mobile object capable of autonomous control. As a specific example of the mobile object 4, a flying object 4 is described below. FIG. 4 is a block diagram showing the hardware configuration of the flying object 4. The flight controller 41 can have one or more processors, such as a programmable processor (for example, a central processing unit (CPU)).
The flight controller 41 also has, and can access, a memory 411. The memory 411 stores logic, code, and/or program instructions that the flight controller can execute to perform one or more steps. The flight controller 41 may also include sensors 412 such as inertial sensors (acceleration sensors, gyro sensors), a GPS sensor, and proximity sensors (for example, lidar).
The memory 411 may include, for example, a removable medium such as an SD card or random access memory (RAM), or an external storage device. Data acquired from the cameras/sensors 42 may be transferred directly to the memory 411 and stored there. For example, still image and video data captured by a camera or the like may be recorded in the built-in memory or an external memory; alternatively, such data may be recorded from the camera/sensors 42 or the built-in memory, via the network NW, in at least one of the management server 1, the user terminal 2, and the mobile object storage device 5. The camera 42 may be mounted on the flying object 4 via a gimbal 43.
The flight controller 41 includes a control module (not shown) configured to control the state of the flying object. For example, to adjust the spatial arrangement, velocity, and/or acceleration of the flying object, which has six degrees of freedom (translational motions x, y, and z, and rotational motions θx, θy, and θz), the control module controls the propulsion mechanism of the flying object (motor 45 and the like) via an ESC 44 (Electric Speed Controller). A propeller 46 is rotated by a motor 45 powered by a battery 48, generating lift for the flying object. The control module can also control one or more of the states of the mounted equipment and sensors.
The flight controller 41 can also communicate with a transmitting/receiving unit 47 configured to transmit data to and/or receive data from one or more external devices (for example, a transceiver (remote controller) 49, the management server 1, the user terminal 2, a display device, or another remote controller). The transceiver 49 can use any suitable communication means, such as wired or wireless communication.
Furthermore, in addition to mobile-object state control functions such as the control of the flying object's state described above, the flight controller 41 may realize various data processing functions, for example by executing application programs in response to instructions from an external device (in particular the user terminal 2); for instance, it may be able to execute functions corresponding to the movement route generation unit 430 and the movement instruction unit 440 described later. The flight controller 41 may implement the data processing functions itself and thus serve both the state control and the data processing roles; alternatively, a separate processor (control unit) dedicated to data processing may be provided.
The transmitting/receiving unit 47 can use, for example, one or more of a local area network (LAN), a wide area network (WAN), infrared, radio, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
The transmitting/receiving unit 47 can transmit and/or receive one or more of the data acquired by the cameras/sensors 42, processing results generated by the flight controller 41, predetermined control data, user commands from a terminal or remote controller, and the like.
The cameras/sensors 42 according to the present embodiment may include inertial sensors (acceleration sensors, gyro sensors), a receiver that acquires information received from a satellite positioning system (an RTK-GPS sensor), and sensors that acquire environmental information (proximity sensors such as LiDAR (Light Detection And Ranging), or visual sensors (including cameras) and image sensors).
<Functions of mobile object 4>
FIG. 5 is a block diagram illustrating the functions implemented in the mobile object 4. The embodiment of the present invention estimates the self-position of a mobile object equipped with a receiver that acquires received information from a satellite positioning system and a sensor that acquires environmental information. To this end, the mobile object has various functional units: a reference coordinate conversion unit that converts both a first coordinate system representing the received information and a second coordinate system representing the environmental information into a reference coordinate system based on base point coordinates, and units that estimate third self-position information expressed in the reference coordinate system on the basis of first self-position information indicated by the received information and second self-position information calculated by comparing the environmental information with reference environmental information. Some or all of these functional units may be realized by an information processing device (processor, control unit) mounted in at least one of the management server 1 and the user terminal 2.
In the present embodiment, the mobile object 4 includes a reference coordinate conversion unit 410, a self-position estimation unit 420, a movement route generation unit 430, a movement instruction unit 440, a movement route correction unit 450, and a storage unit 470. The storage unit 470 includes databases such as a movement information storage unit 471 and a movement route information storage unit 472.
The reference coordinate conversion unit 410 converts both a first coordinate system (for example, a latitude/longitude/height coordinate system (LLA coordinate system)) representing the information received from the satellite positioning system by the receiver mounted on the mobile object 4, and a second coordinate system (for example, a Point Cloud Map coordinate system) representing the environmental information acquired by the sensor mounted on the mobile object 4, into a reference coordinate system whose reference (for example, origin) is the base point coordinates. More specifically, the first and second coordinate systems are converted into the reference coordinate system using, for example, conversion information between the coordinate systems stored in advance (for example, conversion information T1 and T2 described later).
Here, an example of the conversion information T1 that the reference coordinate conversion unit 410 refers to in order to convert each coordinate system into the reference coordinate system is described. As illustrated in FIG. 6, the point cloud map coordinate system and the reference coordinate system exist independently. P1 to P3 in the figure represent point clouds held by the point cloud map (for example, point clouds indicating objects such as buildings). The reference coordinate system is expressed as a three-dimensional coordinate system (XYZ coordinate system) with an arbitrary position as the origin O. On the right side of FIG. 6, the latitude/longitude/height coordinate system of RTK-GPS is expressed in the reference coordinate system based on conversion information T2 (not shown) between it and the reference coordinate system. On the left side of FIG. 6, the point cloud map coordinate system is expressed as a three-dimensional coordinate system (X'Y'Z' coordinate system) whose origin O' is a predetermined position (for example, the position where the sensor was powered on, where a reset was performed, or where the mobile object 4 starts moving). When the mobile object 4 moves while both the receiver and the sensor are operating correctly, the position of the mobile object 4 can be obtained from both. As illustrated in FIG. 6, the series of acquired self-positions can be drawn as a trajectory.
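The role of the conversion information T1 (and T2) described above can be sketched as a rigid transform, i.e. a rotation plus a translation applied to each measured position. The following minimal sketch, simplified to two dimensions, is an illustrative assumption and not the embodiment's actual implementation; the function name and parameters are hypothetical:

```python
import math

def to_reference(point, theta, origin):
    """Convert a position from a sensor-local frame (e.g. the point cloud
    map X'Y'Z' frame, here reduced to 2D) into the reference XYZ frame.

    theta  -- rotation between the two frames (radians)
    origin -- position of the local frame's origin O' in the reference frame
    """
    x, y = point
    rx = math.cos(theta) * x - math.sin(theta) * y + origin[0]
    ry = math.sin(theta) * x + math.cos(theta) * y + origin[1]
    return (rx, ry)

# A point at (1, 0) in a local frame rotated 90 degrees and offset by (10, 5)
# lands at approximately (10, 6) in the reference frame.
print(to_reference((1.0, 0.0), math.pi / 2, (10.0, 5.0)))
```

Applying the same transform to every position reported in a given frame integrates that frame's trajectory into the reference coordinate system.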
In the real world, the position of the mobile object 4 at a given time is unique. Therefore, the position information of the mobile object 4 acquired simultaneously from the receiver and the sensor at a given time is associated with that same time as a "pair of pieces of information indicating the same place" and stored, for example, in the movement information storage unit 471. FIG. 7 shows an example in which position information is stored in pairs for points whose mutual correspondence is particularly easy to identify (for example, corners on the movement route of the mobile object). In FIG. 8, the conversion information T1 is calculated based on, for example, the relationship between the origins in the positional relationship that minimizes the total distance between the stored pairs of position information. By using this conversion information T1 (and the conversion information T2), the first coordinate system (for example, the latitude/longitude/height coordinate system) representing the information received from the satellite positioning system and the second coordinate system (for example, the Point Cloud Map coordinate system) representing the environmental information acquired by the sensor mounted on the mobile object 4 can both be converted into the reference coordinate system based on the base point coordinates; that is, the position information expressed in the first coordinate system and the position information expressed in the second coordinate system can be integrated into the reference coordinate system. The method is not limited to this; any method may be used as long as the position information expressed in the first coordinate system and that expressed in the second coordinate system can be integrated into the reference coordinate system. Although a configuration has been described in which each piece of position information expressed in the first and second coordinate systems is converted into the reference coordinate system and integrated, the attitude information at each position expressed in the first and second coordinate systems is likewise converted into the reference coordinate system and integrated.
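The alignment step described above, finding the frame relationship that minimizes the total distance between the stored position pairs, corresponds to a classic rigid-registration problem with a closed-form least-squares solution. As a hedged sketch under the assumption of a 2D simplification (the function name and data layout are hypothetical):

```python
import math

def estimate_transform(pairs):
    """Estimate the rotation theta and translation (tx, ty) that map
    local-frame points onto reference-frame points, minimizing the sum of
    squared distances between the stored pairs (2D Kabsch alignment).

    pairs -- list of ((x_local, y_local), (x_ref, y_ref)) tuples
    """
    n = len(pairs)
    # Centroids of both point sets
    cx_l = sum(p[0][0] for p in pairs) / n
    cy_l = sum(p[0][1] for p in pairs) / n
    cx_r = sum(p[1][0] for p in pairs) / n
    cy_r = sum(p[1][1] for p in pairs) / n
    # Least-squares rotation angle from the centered correlations
    s = c = 0.0
    for (xl, yl), (xr, yr) in pairs:
        dxl, dyl = xl - cx_l, yl - cy_l
        dxr, dyr = xr - cx_r, yr - cy_r
        s += dxl * dyr - dyl * dxr
        c += dxl * dxr + dyl * dyr
    theta = math.atan2(s, c)
    # Translation that carries the local centroid onto the reference centroid
    tx = cx_r - (math.cos(theta) * cx_l - math.sin(theta) * cy_l)
    ty = cy_r - (math.sin(theta) * cx_l + math.cos(theta) * cy_l)
    return theta, (tx, ty)
```

Given the paired "same place" observations, the returned theta and (tx, ty) play the role of the conversion information T1 for that frame.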
The self-position estimation unit 420 estimates third self-position information expressed in the reference coordinate system, based on the first self-position information indicated by the received information and the second self-position information calculated by comparing the environmental information with the reference environmental information, both converted into the reference coordinate system. The first self-position information may be, for example, position information acquired by RTK-GPS. The second self-position information is position information determined by comparing environmental information acquired by a LiDAR sensor, a visual sensor (including a camera), or the like (for example, the three-dimensional point cloud map information acquired by LiDAR or the like in FIG. 9) with reference environmental information acquired in advance by a sensor that acquires environmental information (for example, the reference three-dimensional point cloud map information acquired in advance by LiDAR or the like in FIG. 10); the comparison determines from which observation position the environmental information was acquired, and that observation position is taken as the second self-position information. When the environmental information and the reference environmental information are expressed as point cloud maps, the shapes of the two can be compared like fitting puzzle pieces using known techniques such as NDT registration or NDT scan matching (for example, FIG. 11 shows the point cloud map information superimposed on the reference point cloud map information), and the second self-position information can be estimated by finding the coordinate transformation at which the degree of agreement between the two is highest.
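NDT registration itself is more elaborate, but the underlying idea, searching for the transformation under which the observed scan best agrees with the reference map, can be illustrated with a deliberately simplified sketch that grid-searches translations only and scores agreement by nearest-neighbor distance. All names, the 2D reduction, and the scoring are assumptions for illustration, not the actual algorithm:

```python
def match_score(scan, ref):
    """Mean squared nearest-neighbor distance from scan points to
    reference points (lower = better agreement)."""
    total = 0.0
    for sx, sy in scan:
        total += min((sx - rx) ** 2 + (sy - ry) ** 2 for rx, ry in ref)
    return total / len(scan)

def estimate_offset(scan, ref, search=3, step=1.0):
    """Grid-search the translation (dx, dy) that best aligns scan onto ref;
    the best offset is the (negated) observation position of the scan."""
    best = None
    for i in range(-search, search + 1):
        for j in range(-search, search + 1):
            dx, dy = i * step, j * step
            shifted = [(x + dx, y + dy) for x, y in scan]
            score = match_score(shifted, ref)
            if best is None or score < best[0]:
                best = (score, (dx, dy))
    return best[1]

ref = [(0, 0), (1, 0), (0, 1)]
scan = [(x - 2, y + 1) for x, y in ref]    # reference seen from an offset pose
print(estimate_offset(scan, ref))          # → (2.0, -1.0)
```

Real scan matchers replace the exhaustive grid with gradient-based optimization over rotation and translation, but the "maximize the degree of agreement" criterion is the same.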
As a specific example, the self-position estimation unit 420 may include a state estimation filter, such as a Kalman filter or a particle filter, that takes as input the first self-position information converted into the reference coordinate system (for example, self-position information based on RTK-GPS) and the second self-position information (for example, self-position information based on a LiDAR sensor) and estimates the third self-position information. The Kalman filter, in particular, is a filter with the function of integrating multiple observed values to estimate a plausible state quantity; since the first and second self-position information converted into the same reference coordinate system can be used, the third self-position information can be estimated. When the first or second self-position information does not have sufficient sensitivity (accuracy) to estimate the third self-position information, that self-position information may be excluded from the input. For example, in a configuration where the first self-position information is based on RTK-GPS, the sensitivity may fall to or below a predetermined value depending on the RTK-GPS communication conditions and fail to satisfy the criterion (FIG. 14, for example, shows the RTK-GPS sensitivity dropping to 0 because the mobile object 4 has entered a structure); or, in a configuration where the reference environmental information is used for the second self-position information, the score of NDT registration or the like may fail to satisfy the criterion at a position where sufficient reference environmental information for comparison is not available (FIG. 12, for example, shows that there is little reference environmental information around the mobile object 4 and the score indicating the degree of mismatch is high).
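How a Kalman-type filter combines the two position observations can be made concrete with a one-dimensional sketch: each observation carries a variance (the inverse of its "sensitivity"), and one measurement update pulls the estimate toward the more certain observation. The reduction to a single scalar and the variable names are assumptions for illustration:

```python
def fuse(pos1, var1, pos2, var2):
    """Combine two noisy observations of the same scalar position.
    Equivalent to one Kalman measurement update: the result is pulled
    toward the observation with the smaller variance.
    """
    k = var1 / (var1 + var2)          # gain: trust placed in observation 2
    fused = pos1 + k * (pos2 - pos1)
    fused_var = (1 - k) * var1        # combined uncertainty shrinks
    return fused, fused_var

# RTK-GPS reads 10.0 and LiDAR matching reads 12.0, with equal variance 1.0:
print(fuse(10.0, 1.0, 12.0, 1.0))     # → (11.0, 0.5), the midpoint, less uncertain
```

When one input's variance grows large (low sensitivity), the gain drives its contribution toward zero, which matches the behavior of dropping an insufficiently sensitive input described above.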
As another specific example, when the first or second self-position information does not have sufficient sensitivity to estimate the third self-position information, the self-position estimation unit 420 estimates the third self-position information by adopting, as above, the self-position information that shows sufficient sensitivity (for example, the first self-position information in FIG. 12, and the second self-position information in FIG. 14). When both pieces of self-position information show sufficient sensitivity, as shown in FIG. 13, the self-position information to be adopted preferentially may, for example, be set in advance by user operation; alternatively, based on a comparison between each sensitivity and one or more preset reference sensitivities, the third self-position information may be estimated with weighting such that the self-position information showing the higher sensitivity is treated as more reliable (that is, as illustrated on the left of FIG. 15, when both sensitivities indicate comparable reliability, the midpoint of the first and second self-position information, with equal weighting, is estimated as the third self-position information; when one sensitivity is indicated to be more reliable, a position shifted proportionally toward the self-position information showing the higher sensitivity is estimated as the third self-position information).
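The selection-and-weighting logic just described (adopt the usable input, blend when both are usable, give up when neither is) can be sketched as follows. The threshold value, sensitivity scale, and function name are all assumptions, not values from the embodiment:

```python
def estimate_position(p1, s1, p2, s2, threshold=0.5):
    """Choose or blend two self-position estimates by their sensitivities.
    p1, p2 -- positions as (x, y) tuples in the reference coordinate system
    s1, s2 -- sensitivity scores in [0, 1]; below threshold = not usable
    Returns None when neither input is usable (emergency-stop case).
    """
    ok1, ok2 = s1 >= threshold, s2 >= threshold
    if ok1 and ok2:
        w = s1 / (s1 + s2)            # weight toward the more sensitive input
        return tuple(w * a + (1 - w) * b for a, b in zip(p1, p2))
    if ok1:
        return p1
    if ok2:
        return p2
    return None

# Equal sensitivities give the midpoint, as in the left of FIG. 15:
print(estimate_position((0.0, 0.0), 0.8, (2.0, 2.0), 0.8))  # → (1.0, 1.0)
```

Unequal sensitivities shift the estimate proportionally toward the more reliable input, and two unusable inputs yield no estimate at all.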
When it is determined that neither the first nor the second self-position information has sufficient sensitivity to estimate the third self-position information (this determination may also be based, for example, on the reference sensitivities described above), the self-position estimation unit 420 may activate an emergency stop function by transmitting a signal for stopping the movement of the mobile object 4 to the movement instruction unit 440 described later.
In this way, among multiple pieces of observation information acquired from different configurations and expressed in different coordinate systems, the first and second self-position information converted into the same reference coordinate system can be used, making it possible to estimate highly accurate third self-position information.
The movement route generation unit 430 may, for example, set one or more pieces of waypoint information on three-dimensional model data displayed on the user terminal 2 (which may be, for example, the reference three-dimensional point cloud map information shown in FIG. 10) through user selection operations, either sequentially from a start point to an end point or at arbitrary points in an arbitrary order, generate movement route information by a known method based on that waypoint information, and store and manage it in the movement route information storage unit 472. Alternatively, it may analyze three-dimensional environment data, calculate a movement route with waypoint information set so that information on specific or all components inside and outside a structure can be acquired (for example, internal components such as interior walls, pillars, ceilings, windows, doors, stairs, and interior equipment, and external components such as exterior walls, roofs, exterior equipment, windows, doors, stairs, roads, railway tracks, stations, street lights, bus stops, bridges, tunnels, terrain, vegetation, bodies of water, and gas meters and other metering instruments), and store and manage this as movement route information in the movement route information storage unit 472.
The movement route may be generated, for example, as a route that takes the position of the mobile object storage device 5 as the movement start position and movement end position and passes through each waypoint. Conversely, in a configuration without the mobile object storage device 5, the position to which the user has carried the machine may serve as the movement start position, or the user may retrieve the machine at the movement end position. Alternatively, the movement route may be generated so as to include the position of a mobile object storage device 5 selected as the movement start or end position based on the information on the mobile object storage devices 5 managed in the storage units of the management server 1, the user terminal 2, or the mobile object 4 (for example, position information, storage state information, and storage machine information).
As described above, reference environmental information such as reference three-dimensional point cloud map information may be used as the three-dimensional model data, but the three-dimensional model data is not limited to this. For example, it may be a model created on the basis of data produced with CAD (Computer-Aided Design) software, such as BIM (Building Information Modeling) data, CIM (Construction Information Modeling) data, CAD data, or three-dimensional model data reconstructed from BIM data or the like; it may be three-dimensional model data obtained by generating components of a predetermined height from two-dimensional blueprint data; or it may be three-dimensional city model data such as CityGML (City Geography Markup Language), CityJSON, or GeoTIFF data, or three-dimensional city model data stored in a three-dimensional city model database external to this system. The reconstruction of the three-dimensional model data and the like may be executed by the processor of the management server 1 or the user terminal 2, or may be executed outside the management server 1 or the user terminal 2 and then imported.
 また、基準環境情報が基準三次元点群地図情報である場合、基準三次元点群地図情報は、上述のとおり事前にLiDAR等のセンサにより取得した情報であってもよいが、例えば、管理サーバ1やユーザ端末2のプロセッサ等において上述の三次元モデルデータの構造物内外のモデル表面(サーフェス)を点群化した三次元点群モデルデータを用いてもよい。三次元点群モデルデータの生成方法については、例えば、三次元モデルデータの構造物内または構造物外を仮想のセンサ(例えば、仮想のLiDAR)を搭載した仮想の移動体4を移動させることにより構造物内または構造物外の構成物に関する三次元点群モデルデータを生成してもよい。これにより、理論上、構造物内または構造物外を移動体4のセンサで実測した場合の点群センシングデータに近い点群データを生成することができる。また、他の三次元点群モデルデータの生成方法は、三次元モデルデータを所定の間隔で均等に点群化してもよいし、三次元モデルデータがポリゴンデータである場合には各頂点に点を配置して点群化してもよいし、既知の点群化技術(点群データへの変換技術)を用いて点群化してもよい。生成された三次元点群モデルデータは、記憶部470や管理サーバ1やユーザ端末2のいずれかの記憶部に記憶される。 Further, when the reference environment information is reference three-dimensional point cloud map information, that map information may be information acquired in advance by a sensor such as LiDAR as described above. Alternatively, the processor of the management server 1 or the user terminal 2 may use three-dimensional point cloud model data obtained by converting the model surfaces inside and outside the structure in the three-dimensional model data described above into a point cloud. As one generation method, three-dimensional point cloud model data of the components inside or outside a structure may be generated by moving a virtual mobile body 4 equipped with a virtual sensor (for example, a virtual LiDAR) through the inside or outside of the structure in the three-dimensional model data. In theory, this makes it possible to generate point cloud data close to the point cloud sensing data that would be obtained if the inside or outside of the structure were actually measured with the sensor of the mobile body 4. As other generation methods, the three-dimensional model data may be converted into a point cloud uniformly at predetermined intervals; when the three-dimensional model data is polygon data, a point may be placed at each vertex; or a known point cloud conversion technique (conversion into point cloud data) may be used. The generated three-dimensional point cloud model data is stored in the storage unit 470 or in the storage unit of either the management server 1 or the user terminal 2.
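The uniform-interval point cloud generation described above can be sketched, for a single rectangular model surface, roughly as follows. The function and the wall dimensions are illustrative assumptions only:

```python
import numpy as np

def sample_plane(origin, u_vec, v_vec, spacing):
    """Turn one rectangular model surface into a point cloud by
    placing points on a regular grid at roughly `spacing` intervals
    (the uniform-interval approach described above)."""
    origin, u_vec, v_vec = map(np.asarray, (origin, u_vec, v_vec))
    nu = int(np.linalg.norm(u_vec) // spacing) + 1   # samples along u edge
    nv = int(np.linalg.norm(v_vec) // spacing) + 1   # samples along v edge
    us = np.linspace(0.0, 1.0, nu)
    vs = np.linspace(0.0, 1.0, nv)
    grid = [origin + a * u_vec + b * v_vec for a in us for b in vs]
    return np.array(grid)

# Sample a hypothetical 4 m x 2 m wall at ~0.5 m spacing.
cloud = sample_plane((0, 0, 0), (4, 0, 0), (0, 0, 2), 0.5)
```

Applying this to every surface of the model yields point cloud model data comparable in form to measured LiDAR data, although, as the text notes, simulating a virtual LiDAR sensor would reproduce occlusion and range effects that plain surface sampling does not.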
 移動指示部440は、移動経路情報記憶部472に記憶される移動経路情報を参照し、移動経路情報が示す座標及び上述のとおり推定される第3の自己位置情報に応じて移動体4の移動を指示する情報を移動体4へ送信する。すなわち、特に構造物(例えば建物など)の外部にて感度が高く、事前の基準情報取得が必須ではない第1の自己位置情報と、事前の基準情報取得が必要ではあるものの構造物の内部でも感度が高い第2の自己位置情報と、をそれぞれ基準座標系に変換し、変換後の自己位置情報に基づき推定される基準座標系で表される第3の自己位置情報を用いることで、構造物内であっても、基準環境情報がない構造物外であっても、移動経路情報が示す情報と第3の自己位置情報を比較することが可能となるため、構造物内外に跨る移動経路においてもシームレスに移動体4の移動指示制御が可能となる。なお、移動経路情報は、上述の移動経路生成部430に生成されたものであってもよいし、外部システムで生成されて記憶されたものであってもよい。 The movement instruction unit 440 refers to the movement route information stored in the movement route information storage unit 472 and transmits to the mobile body 4 information instructing it to move according to the coordinates indicated by the movement route information and the third self-position information estimated as described above. That is, the first self-position information, which is highly sensitive particularly outside structures (for example, buildings) and does not require prior acquisition of reference information, and the second self-position information, which requires prior acquisition of reference information but is highly sensitive even inside structures, are each converted into the reference coordinate system, and the third self-position information, expressed in the reference coordinate system and estimated from the converted self-position information, is used. This makes it possible to compare the third self-position information with the information indicated by the movement route information both inside a structure and outside a structure where no reference environment information exists, so that movement instruction control of the mobile body 4 is possible seamlessly even along a movement route spanning the inside and outside of a structure. The movement route information may be generated by the above-mentioned movement route generation unit 430, or may be generated and stored by an external system.
 ここで、移動体4は、さらに移動経路修正部450を備えていてもよい。移動経路修正部450は、環境情報を取得するためのセンサにより移動経路近傍(移動経路上を含む移動経路の所定距離範囲内)に障害物が存在することが確認された時に、移動指示部440が参照している移動経路情報を修正する。なお、本システムにおいては、移動経路が構造物内外に跨って基準座標系にて表現可能であるところ、例えば構造物の出入り口周辺に障害物がある場合などにおいては、移動経路の修正範囲は構造物内外に跨る移動経路を対象とすることが可能となる。 Here, the mobile body 4 may further include a movement route correction unit 450. When the sensor for acquiring environment information confirms that an obstacle exists near the movement route (within a predetermined distance range of the movement route, including on the route itself), the movement route correction unit 450 corrects the movement route information referenced by the movement instruction unit 440. In this system, since the movement route can be expressed in the reference coordinate system across the inside and outside of a structure, the correction can cover a movement route spanning the inside and outside of the structure, for example when an obstacle exists around the entrance of the structure.
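The obstacle check performed by the movement route correction unit 450 can be sketched under the assumption that the route is a polyline in the reference coordinate system and the predetermined distance range is a simple threshold. The function names, coordinates, and threshold below are illustrative, not the actual implementation:

```python
import math

def distance_to_segment(p, a, b):
    """Shortest distance from point p to the segment a-b (2D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:                       # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def route_needs_correction(route, obstacle, threshold):
    """True when the obstacle lies within `threshold` of any leg of
    the route (the 'predetermined distance range' described above)."""
    return any(distance_to_segment(obstacle, route[i], route[i + 1]) <= threshold
               for i in range(len(route) - 1))

# A hypothetical L-shaped route in reference coordinates.
route = [(0, 0), (10, 0), (10, 10)]
```

Because the route is one polyline in a single reference coordinate system, the same check applies without distinction to legs inside and outside the structure.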
 移動情報記憶部471は、移動経路生成部430により移動経路を生成する際や、移動指示部440により当該移動経路上において自律制御された移動体4の移動が指示される際などに用いられるパラメータ情報や移動経路上で取得した移動時取得情報等を格納している。具体的なパラメータの例としては、例えば、移動速度、飛行高度(移動体4が飛行体である場合)、撮像画像のオーバーラップ率、移動時取得情報(例えば、画像情報や映像情報、環境情報等)などを含む。 The movement information storage unit 471 stores parameter information used when the movement route generation unit 430 generates a movement route or when the movement instruction unit 440 instructs the autonomously controlled mobile body 4 to move along that route, as well as information acquired during movement along the route. Specific examples of parameters include movement speed, flight altitude (when the mobile body 4 is an aircraft), the overlap rate of captured images, and information acquired during movement (for example, image information, video information, and environment information).
 移動経路情報記憶部472は、移動経路生成部430により生成された移動経路上の座標情報(いわゆるウェイポイント情報)等を格納している。なお、上述のとおり、管理サーバ1やユーザ端末2のプロセッサ等や外部システムにより生成された移動経路情報を格納していてもよい。 The movement route information storage unit 472 stores coordinate information (so-called waypoint information) on the movement route generated by the movement route generation unit 430. As described above, it may also store movement route information generated by the processor of the management server 1 or the user terminal 2, or by an external system.
<ユーザ端末2の機能>
 図5は、ユーザ端末2に実装される機能も例示したブロック図である。なお、各種機能部の一部または全部は、管理サーバ1または移動体4の少なくともいずれかに搭載される情報処理装置(プロセッサ、制御部)にて実現されてもよい。
<Functions of user terminal 2>
FIG. 5 is a block diagram illustrating, by way of example, the functions implemented in the user terminal 2. Note that some or all of the various functional units may be realized by an information processing device (processor, control unit) installed in at least one of the management server 1 and the mobile body 4.
 本実施の形態においては、ユーザ端末2は、通信部210、画面情報生成部220、記憶部270を備えている。 In this embodiment, the user terminal 2 includes a communication section 210, a screen information generation section 220, and a storage section 270.
 通信部210は、管理サーバ1や、移動体4、移動体格納装置5と通信を行う。通信部210は、管理サーバ1や、移動体4、移動体格納装置5からの各種要求やデータ等を受け付ける受付部としても機能する。 The communication unit 210 communicates with the management server 1, the mobile body 4, and the mobile body storage device 5. The communication unit 210 also functions as a reception unit that receives various requests, data, etc. from the management server 1, the mobile unit 4, and the mobile unit storage device 5.
 画面情報生成部220は、ユーザ端末2のユーザインターフェースを介して表示される画面情報を生成する。例えば、所定のレイアウト規則に基づいて、各種画像及びテキストを配置することで生成されるユーザインターフェース画面を構成し、当該ユーザインターフェース画面上で移動体4が取得した各種情報を表示するための画面情報を生成する。 The screen information generation unit 220 generates the screen information displayed via the user interface of the user terminal 2. For example, it composes a user interface screen by arranging various images and text based on predetermined layout rules, and generates screen information for displaying, on that screen, the various pieces of information acquired by the mobile body 4.
 図16を参照して、本実施形態にかかる情報処理方法について、本実施の形態における情報処理システムの動作も含めて説明する。図16には、本実施形態にかかる情報処理方法のフローチャートが例示されている。このフローチャートでは、例示的にユーザ端末2上でアプリケーションを起動する構成を示しているが、これに限らず、例えば管理サーバ1や移動体4、移動体格納装置5がアプリケーションを起動可能なプロセッサと入出力装置を有し、各種設定等が可能な構成であってもよい。 With reference to FIG. 16, the information processing method according to this embodiment will be described, including the operation of the information processing system in this embodiment. FIG. 16 illustrates a flowchart of the information processing method according to this embodiment. This flowchart shows, by way of example, a configuration in which an application is started on the user terminal 2; however, the configuration is not limited to this, and, for example, the management server 1, the mobile body 4, or the mobile body storage device 5 may have a processor capable of starting the application and an input/output device, allowing various settings and the like to be made.
 まず、ユーザは、例えばユーザ端末2において、移動体4の操作や取得情報の表示を行うアプリケーションを起動する(SQ101)。このアプリケーションは、例えばユーザ端末2に記憶されていてもよいし、ネットワークを介して接続される管理サーバ1や移動体4または他の外部サーバ(不図示)から提供されるソフトウェア(いわゆるSaaS)であってもよい。必要に応じてログイン画面が表示され、例えばログインIDやパスワードを要求する構成にしてもよい。 First, the user starts, for example on the user terminal 2, an application for operating the mobile body 4 and displaying the acquired information (SQ101). This application may be stored in the user terminal 2, or may be software (so-called SaaS) provided from the management server 1, the mobile body 4, or another external server (not shown) connected via a network. A login screen may be displayed as necessary, for example requesting a login ID and password.
 次に、ユーザは、新規の移動計画を作成する(SQ102)。例えば、「プラン名」や「エリア名」、「住所」などを設定して、ユーザ端末2上に、移動対象となる三次元モデルデータを取得して表示し、新規の移動計画の作成を開始する。 Next, the user creates a new movement plan (SQ102). For example, the user sets a plan name, an area name, an address, and so on, acquires and displays the three-dimensional model data of the movement target on the user terminal 2, and starts creating the new movement plan.
 次に、ユーザは、移動体4の移動のための移動経路を生成する(SQ103)。例えば、ユーザ端末2上に表示される三次元モデルデータに対してユーザの選択操作により一以上のウェイポイント情報(例えば、ユーザ端末2上では緯度経度高さ座標系で表現される)を設定する。そして、移動体4に三次元モデルデータ及びウェイポイント情報を送信し、移動体4にて三次元モデルデータ及びウェイポイント情報に基づいて既知の方法(例えば設定されたウェイポイント間をそれぞれ直線にて結ぶ等)により移動経路情報が生成される。 Next, the user generates a movement route for the mobile body 4 (SQ103). For example, one or more pieces of waypoint information (expressed on the user terminal 2 in, for example, a latitude-longitude-height coordinate system) are set on the three-dimensional model data displayed on the user terminal 2 through the user's selection operations. The three-dimensional model data and the waypoint information are then transmitted to the mobile body 4, which generates movement route information from them by a known method (for example, connecting the set waypoints with straight lines).
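The known straight-line method mentioned above, connecting consecutive waypoints with line segments, can be sketched roughly as follows. The step length and waypoint coordinates are illustrative assumptions:

```python
import math

def interpolate_route(waypoints, step):
    """Connect consecutive waypoints with straight segments and emit
    intermediate coordinates approximately every `step` along each
    segment. Waypoints are (x, y, z) tuples in a common frame."""
    route = [waypoints[0]]
    for (x0, y0, z0), (x1, y1, z1) in zip(waypoints, waypoints[1:]):
        length = math.dist((x0, y0, z0), (x1, y1, z1))
        n = max(1, int(length // step))       # subdivisions for this leg
        for k in range(1, n + 1):
            t = k / n
            route.append((x0 + t * (x1 - x0),
                          y0 + t * (y1 - y0),
                          z0 + t * (z1 - z0)))
    return route

# Hypothetical waypoints at a constant 30 m altitude, 10 m spacing.
waypoints = [(0.0, 0.0, 30.0), (100.0, 0.0, 30.0), (100.0, 50.0, 30.0)]
route = interpolate_route(waypoints, 10.0)
```

In practice the coordinates would first be converted from the latitude-longitude-height system into the reference coordinate system before interpolation.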
 次に、ユーザは、移動体4に移動の実行開始を指示する(SQ104)。例えば、移動情報記憶部471及び移動経路情報記憶部472を参照して、点検、警備、建築進捗管理等を目的とする移動体4の移動を実行する。この時、移動体4の受信機により取得される第1の自己位置情報と、センサによる環境情報と基準環境情報とを比較した結果により得られる第2の自己位置情報と、をそれぞれ基準座標系に変換し、変換後の自己位置情報に基づき推定される基準座標系で表される第3の自己位置情報と、移動経路情報のウェイポイント情報等に基づき移動体4の移動が制御される。 Next, the user instructs the mobile body 4 to start moving (SQ104). For example, referring to the movement information storage unit 471 and the movement route information storage unit 472, the mobile body 4 moves for purposes such as inspection, security, or construction progress management. At this time, the first self-position information acquired by the receiver of the mobile body 4 and the second self-position information obtained by comparing the environment information from the sensor with the reference environment information are each converted into the reference coordinate system, and the movement of the mobile body 4 is controlled based on the third self-position information, expressed in the reference coordinate system and estimated from the converted self-position information, together with the waypoint information of the movement route information.
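One possible sketch of the processing just described, converting each self-position estimate into the reference coordinate system and combining them, is a variance-weighted fusion, which corresponds to a single measurement-update step of a Kalman-type state estimation filter. The offsets, variances, and coordinates below are illustrative assumptions, not the actual implementation of this disclosure:

```python
import numpy as np

def to_reference(pos, rotation, translation):
    """Rigid transform of a position from a receiver/sensor frame
    into the common reference coordinate system."""
    return np.asarray(rotation) @ np.asarray(pos) + np.asarray(translation)

def fuse(p1, var1, p2, var2):
    """Variance-weighted fusion of two position estimates: the
    estimate with the smaller variance receives the larger weight."""
    w1 = var2 / (var1 + var2)
    return w1 * np.asarray(p1) + (1.0 - w1) * np.asarray(p2)

# First estimate (satellite-positioning-derived), assumed already
# expressed in the reference frame.
p_gnss = np.array([10.0, 20.0, 5.0])
# Second estimate (environment-information comparison), expressed in
# a map frame and shifted into the reference frame by a known offset.
p_scan = to_reference([9.0, 21.0, 5.0], np.eye(3), [0.5, -0.5, 0.0])
# Outdoors the first estimate would get the lower variance; the
# values here assume the opposite (e.g. indoors near a doorway).
p_fused = fuse(p_gnss, 4.0, p_scan, 1.0)
```

A full state estimation filter would additionally propagate the state between measurements and track the covariance over time; this static fusion only illustrates how both inputs contribute to the third self-position information in one coordinate system.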
 次に、ユーザは、ユーザ端末2へ取得情報の出力を指示する(SQ105)。例えば、ユーザ端末2上に表示される三次元モデルデータに移動体4が実際に移動した経路情報を重畳して表示してもよい。そのほか、移動経路上にて移動体4により取得された取得情報(静止画像、動画像、音声その他の情報)を表示したり、当該取得情報の位置情報に対応付けられた位置(特にウェイポイントの位置情報)に対応する取得情報を閲覧するためのリンクとなる記号などの印が付されてもよい。そして、当該リンクをユーザ端末2上で選択することで、対応する取得情報が表示されるようにしてもよい。 Next, the user instructs the user terminal 2 to output the acquired information (SQ105). For example, the route along which the mobile body 4 actually traveled may be displayed superimposed on the three-dimensional model data shown on the user terminal 2. In addition, the information acquired by the mobile body 4 along the movement route (still images, moving images, audio, and other information) may be displayed, and marks such as symbols serving as links for viewing the acquired information may be placed at the positions (in particular, the waypoint positions) associated with the position information of that acquired information. Selecting such a link on the user terminal 2 may then display the corresponding acquired information.
 このように、本発明は、構造物内外に跨る移動経路においてもシームレスに精度の高い自己位置推定を行いながら移動体4の自律移動制御が可能な情報処理システム等を提供することができる。 As described above, the present invention can provide an information processing system and the like that can autonomously control the movement of the mobile object 4 while seamlessly and accurately estimating its own position even on a movement route that spans inside and outside of a structure.
 また、上述の実施例では移動体4による構造物内外での情報取得を具体例としたが、例えば構造物の点検であってもよく、構造物の内壁および/または外壁の所定の事象の有無を点検するために利用される装置、機器等を備えていてもよい。より具体的には、撮像装置(可視光カメラ、赤外線カメラ、金属探知機、超音波測定器等)や、打鍵装置等、探知装置(金属探知機)、集音装置、臭気測定器、ガス検知器、空気汚染測定器、検出装置(宇宙線、放射線、電磁波等を検出するための装置)等の内壁や外壁を有する点検対象構造物の状態を知るために必要な装置は全て採用され得る。 Further, although the above embodiment takes as a specific example the acquisition of information inside and outside a structure by the mobile body 4, the application may instead be, for example, the inspection of a structure, and the mobile body may be equipped with devices and instruments used to inspect the inner wall and/or outer wall of the structure for the presence or absence of predetermined events. More specifically, any device necessary for determining the condition of an inspection target structure having inner or outer walls may be employed, such as imaging devices (visible light cameras, infrared cameras, metal detectors, ultrasonic measuring instruments, etc.), hammering devices, detection devices (metal detectors), sound collectors, odor measuring instruments, gas detectors, air contamination measuring instruments, and detectors (devices for detecting cosmic rays, radiation, electromagnetic waves, etc.).
 また、実施例は例えば構造物内の警備や監視であってもよく、警備や監視のために利用される装置、機器等を備えていてもよい。より具体的には、撮像装置(可視光カメラ、赤外線カメラ、暗視カメラ、金属探知機、超音波測定器等)や、センサ装置(モーションセンサ、赤外線センサ等)等、警備・監視対象構造物の異常や侵入者等を撮像・検知するために必要な装置は全て採用され得る。 The embodiment may also be, for example, security or monitoring inside a structure, and the mobile body may be equipped with devices and instruments used for security or monitoring. More specifically, any device necessary for imaging and detecting abnormalities, intruders, and the like in the structure to be guarded or monitored may be employed, such as imaging devices (visible light cameras, infrared cameras, night vision cameras, metal detectors, ultrasonic measuring instruments, etc.) and sensor devices (motion sensors, infrared sensors, etc.).
 本発明の移動体は、カメラ等を搭載した撮影用の移動体としても好適に使用することができる他、セキュリティ分野、インフラ監視、測量、スポーツ会場・工場・倉庫等の建物や構造物内の点検、災害対応等の様々な産業にも利用することができる。 The mobile body of the present invention can be suitably used as a mobile body for photography equipped with a camera or the like, and can also be used in various industries such as the security field, infrastructure monitoring, surveying, inspection inside buildings and structures such as sports venues, factories, and warehouses, and disaster response.
 上述した実施の形態は、本発明の理解を容易にするための例示に過ぎず、本発明を限定して解釈するためのものではない。本発明は、その趣旨を逸脱することなく、変更、改良することができると共に、本発明にはその均等物が含まれることは言うまでもない。 The embodiments described above are merely illustrative to facilitate understanding of the present invention, and are not intended to be interpreted as limiting the present invention. It goes without saying that the present invention can be modified and improved without departing from its spirit, and that the present invention includes equivalents thereof.
 1    管理サーバ
 2    ユーザ端末
 4    移動体
 5    移動体格納装置

 
1    Management server
2    User terminal
4    Mobile object
5    Mobile object storage device

Claims (11)

  1.  衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理システムであって、
     前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換する基準座標変換部と、
     前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定する自己位置推定部と、
     を備える、
     ことを特徴とする情報処理システム。
    An information processing system for estimating the self-position of a mobile body comprising a receiver that acquires reception information from a satellite positioning system and a sensor that acquires environment information, the system comprising:
    a reference coordinate conversion unit that converts both a first coordinate system representing the reception information and a second coordinate system representing the environment information into a reference coordinate system based on base point coordinates; and
    a self-position estimation unit that estimates third self-position information, expressed in the reference coordinate system, based on first self-position information indicated by the reception information and second self-position information calculated by comparing the environment information with reference environment information.
  2.  前記自己位置推定部は、前記第1の自己位置情報及び前記第2の自己位置情報が入力され、前記第3の自己位置情報を出力する状態推定フィルタを含む、
     ことを特徴とする請求項1に記載の情報処理システム。
    The information processing system according to claim 1, wherein the self-position estimation unit includes a state estimation filter that receives the first self-position information and the second self-position information as input and outputs the third self-position information.
  3.  前記自己位置推定部は、前記受信機の第1の感度と前記センサの第2の感度と各感度に対応する基準感度との比較結果に応じて、前記第1の自己位置情報と前記第2の自己位置情報の少なくともいずれかに基づき、前記基準座標系で表される第3の自己位置情報を推定する、
     ことを特徴とする請求項1に記載の情報処理システム。
    The information processing system according to claim 1, wherein the self-position estimation unit estimates the third self-position information, expressed in the reference coordinate system, based on at least one of the first self-position information and the second self-position information, according to the result of comparing a first sensitivity of the receiver and a second sensitivity of the sensor with reference sensitivities corresponding to the respective sensitivities.
  4.  前記受信機は、GPS受信機である、
     ことを特徴とする請求項1ないし3のいずれかに記載の情報処理システム。
    The information processing system according to any one of claims 1 to 3, wherein the receiver is a GPS receiver.
  5.  前記センサは、LiDARセンサである、
     ことを特徴とする請求項1ないし3のいずれかに記載の情報処理システム。
    The information processing system according to any one of claims 1 to 3, wherein the sensor is a LiDAR sensor.
  6.  前記センサは、Visualセンサである、
     ことを特徴とする請求項1ないし3のいずれかに記載の情報処理システム。
    The information processing system according to any one of claims 1 to 3, wherein the sensor is a visual sensor.
  7.  前記基準座標系で表される前記第3の自己位置情報と、前記基準座標系で表される移動経路情報とを比較して、前記移動体の移動を制御する移動制御部をさらに備える、
     ことを特徴とする請求項1ないし3のいずれかに記載の情報処理システム。
    The information processing system according to any one of claims 1 to 3, further comprising a movement control unit that controls the movement of the mobile body by comparing the third self-position information expressed in the reference coordinate system with movement route information expressed in the reference coordinate system.
  8.  前記センサにより前記移動体の移動経路上に障害物を感知した場合、前記移動経路情報を修正する移動経路情報修正部をさらに備える、
     ことを特徴とする請求項7に記載の情報処理システム。
    The information processing system according to claim 7, further comprising a movement route information correction unit that corrects the movement route information when the sensor detects an obstacle on the movement route of the mobile body.
  9.  衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理システムであって、
     前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換する基準座標変換部と、
     前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定する自己位置推定部と、
     を備える、
     ことを特徴とする移動体。
    A mobile body comprising a receiver that acquires reception information from a satellite positioning system and a sensor that acquires environment information, the mobile body estimating its own position and comprising:
    a reference coordinate conversion unit that converts both a first coordinate system representing the reception information and a second coordinate system representing the environment information into a reference coordinate system based on base point coordinates; and
    a self-position estimation unit that estimates third self-position information, expressed in the reference coordinate system, based on first self-position information indicated by the reception information and second self-position information calculated by comparing the environment information with reference environment information.
  10.  衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理方法であって、
     基準座標変換部により、前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換するステップと、
     自己位置推定部により、前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定するステップと、
     をコンピュータに実行させることを特徴とする情報処理方法。
    An information processing method for estimating the self-position of a mobile body comprising a receiver that acquires reception information from a satellite positioning system and a sensor that acquires environment information, the method causing a computer to execute:
    a step of converting, by a reference coordinate conversion unit, both a first coordinate system representing the reception information and a second coordinate system representing the environment information into a reference coordinate system based on base point coordinates; and
    a step of estimating, by a self-position estimation unit, third self-position information, expressed in the reference coordinate system, based on first self-position information indicated by the reception information and second self-position information calculated by comparing the environment information with reference environment information.
  11.  衛星測位システムからの受信情報を取得する受信機と、環境情報を取得するセンサと、を備える移動体の自己位置を推定する情報処理方法をコンピュータに実行させるプログラムであって、
     基準座標変換部により、前記受信情報を表す第1座標系と、前記環境情報を表す第2座標系とを共にベースポイント座標を基準とした基準座標系に変換するステップと、
     自己位置推定部により、前記受信情報が示す第1の自己位置情報と、前記環境情報と基準環境情報との比較によって算出される第2の自己位置情報とに基づき、前記基準座標系で表される第3の自己位置情報を推定するステップと、
     を前記コンピュータに実行させることを特徴とするプログラム。

     
    A program that causes a computer to execute an information processing method for estimating the self-position of a mobile body comprising a receiver that acquires reception information from a satellite positioning system and a sensor that acquires environment information, the program causing the computer to execute:
    a step of converting, by a reference coordinate conversion unit, both a first coordinate system representing the reception information and a second coordinate system representing the environment information into a reference coordinate system based on base point coordinates; and
    a step of estimating, by a self-position estimation unit, third self-position information, expressed in the reference coordinate system, based on first self-position information indicated by the reception information and second self-position information calculated by comparing the environment information with reference environment information.

PCT/JP2022/021272 2022-05-24 2022-05-24 Information processing system, movable body, information processing method, and program WO2023228283A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/021272 WO2023228283A1 (en) 2022-05-24 2022-05-24 Information processing system, movable body, information processing method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/021272 WO2023228283A1 (en) 2022-05-24 2022-05-24 Information processing system, movable body, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2023228283A1 true WO2023228283A1 (en) 2023-11-30

Family

ID=88918638

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/021272 WO2023228283A1 (en) 2022-05-24 2022-05-24 Information processing system, movable body, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2023228283A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016080460A (en) * 2014-10-15 2016-05-16 シャープ株式会社 Moving body
JP2017501484A (en) * 2014-09-05 2017-01-12 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method for controlling a movable object in an environment, system for controlling a movable object in an environment, method for controlling an unmanned aircraft in an environment, and system for controlling an unmanned aircraft in an environment
JP2020008461A (en) * 2018-07-10 2020-01-16 株式会社豊田自動織機 Autonomous moving body location estimation device
JP2020034491A (en) * 2018-08-31 2020-03-05 株式会社日立産機システム Mobile entity position detecting device and mobile entity equipped with position detecting device
JP2022015978A (en) * 2020-07-10 2022-01-21 エヌ・ティ・ティ・エイ・ティ・システムズ株式会社 Unmanned aircraft control method, unmanned aircraft, and unmanned aircraft control program
JP7004374B1 (en) * 2021-06-18 2022-02-14 株式会社センシンロボティクス Movement route generation method and program of moving object, management server, management system
JP2022075256A (en) * 2020-11-06 2022-05-18 株式会社豊田自動織機 Parameter acquisition method and device for coordinate conversion and self-position estimation device


Similar Documents

Publication Publication Date Title
US11442473B2 (en) Systems and methods for surveillance with a visual marker
US11914369B2 (en) Multi-sensor environmental mapping
US20220074744A1 (en) Unmanned Aerial Vehicle Control Point Selection System
US11726501B2 (en) System and method for perceptive navigation of automated vehicles
Al-Darraji et al. A technical framework for selection of autonomous uav navigation technologies and sensors
JP7004374B1 (en) Movement route generation method and program of moving object, management server, management system
WO2023228283A1 (en) Information processing system, movable body, information processing method, and program
WO2023199477A1 (en) Information processing system, mobile body, information processing method, and program
JP7072311B1 (en) Movement route generation method and program of moving object, management server, management system
JP7441579B1 (en) Information processing system and information processing method
Calero et al. Autonomous Wheeled Robot Platform Testbed for Navigation and Mapping Using Low-Cost Sensors
Sarman Exploring the Use of Drones in the Bim Environment
KR20220031574A (en) 3D positioning and mapping system and method
Maghrebi et al. Autopilot Drone in Construction: A Proof of Concept for Handling Lightweight Instruments and Materials

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22943688

Country of ref document: EP

Kind code of ref document: A1