WO2023067892A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2023067892A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving object
movement
information processing
self
information
Prior art date
Application number
PCT/JP2022/032009
Other languages
English (en)
Japanese (ja)
Inventor
昇治 松田
裕崇 田中
知仁 織田
邦昭 野田
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2023067892A1


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions

Definitions

  • the present technology relates to an information processing device, an information processing method, and a program applicable to control of autonomous movement.
  • Patent Document 1 describes an information processing device that estimates the type of mobile object on which the user is riding, based on sensing data provided by a plurality of sensors carried or worn by the user.
  • Information to be used in the processing for obtaining the position of the user within the moving object is then selected according to the estimated type of the moving object. This makes it possible to improve the detection accuracy of the position within the moving object (paragraphs [0038] to [0053] of Patent Document 1, FIGS. 3 and 4, etc.).
  • In view of this, an object of the present technology is to provide an information processing device, an information processing method, and a program capable of improving detection accuracy.
  • In order to achieve the above object, an information processing device according to one embodiment of the present technology includes a calculation unit.
  • The calculation unit calculates the self-position of the own device based on first movement information about a moving object and second movement information about the own device, in accordance with a first movement state of the moving object and a second movement state of the own device that moves with the moving object.
  • In this information processing device, the self-position of the own device is calculated based on the first movement information about the moving object and the second movement information about the own device, in accordance with the first movement state of the moving object and the second movement state of the own device that moves with the moving object. This makes it possible to improve detection accuracy.
  • The first movement information may include the self-position of the moving object and the movement vector of the moving object.
  • The second movement information may include the self-position of the own device and the movement vector of the own device.
  • The first movement state may include at least one of movement, rotation, and stoppage of the moving object.
  • The second movement state may include movement and stoppage of the own device.
  • The calculation unit may calculate the self-position of the own device by subtracting the movement vector of the moving object from the movement vector of the own device.
  • The first movement information may be acquired by an external sensor and an internal sensor mounted on the moving object.
  • The second movement information may be acquired by an external sensor and an internal sensor mounted on the own device.
  • The own device may be a mobile object capable of flying.
  • The calculation unit may calculate the self-position of the own device by increasing or reducing the weighting of the internal sensor mounted on the own device when the moving object is moving and the own device is stationary in the air.
  • the external sensor may include at least one of a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) camera, or a stereo camera.
  • the internal sensor may include at least one of an IMU (Inertial Measurement Unit) or a GPS (Global Positioning System).
  • The information processing apparatus may further include an imaging correction unit that controls the external sensor based on a vibration system of the moving object and a vibration system of the own device when the own device is in contact with the moving object.
  • The imaging correction unit may perform control to match the vibration system of the moving object with the vibration system of the own device.
  • An information processing method according to one embodiment of the present technology is an information processing method executed by a computer system, and includes calculating the self-position of the own device based on first movement information about a moving object and second movement information about the own device, in accordance with a first movement state of the moving object and a second movement state of the own device that moves with the moving object.
  • A program according to one embodiment of the present technology causes a computer system to execute a step of calculating the self-position of the own device based on first movement information about a moving object and second movement information about the own device, in accordance with a first movement state of the moving object and a second movement state of the own device that moves with the moving object.
  • FIG. 3 is a flowchart of robot self-position estimation.
  • FIG. 4 is a schematic diagram showing a robot that captures an image of the inside of the moving space of a moving object.
  • FIG. 5 is a block diagram showing a hardware configuration example of the information processing device.
  • FIG. 1 is a diagram schematically showing the movement space.
  • FIG. 1A is a schematic diagram showing a moving space.
  • FIG. 1B is a schematic diagram showing the robot in the moving space.
  • the robot 10 existing inside the moving space 1 has an external sensor and an internal sensor, and calculates the self position of the robot 10 .
  • the self-position is the position of the robot 10 with respect to the map that the robot 10 is aware of or is creating.
  • The moving space 1 is the space inside a moving object 5 that is moving, such as a train or a ship. That is, the self-position, the movement vector, and the like of the moving space 1 change according to the movement and rotation of the moving object 5.
  • the number and range of the moving spaces 1 in the moving object 5 are not limited.
  • the inside of one train car may be used as the movement space, or each section (tank section) of a ship may be used as the movement space.
  • the area in which the robot 10 moves may be used as the movement space.
  • The movement space may also be a space within a predetermined distance from the ground; for example, an area in which the robot can travel on its own may be used as the movement space.
  • the robot 10 is an autonomously movable or operable body such as a drone.
  • the robot 10 has an external sensor and an internal sensor.
  • the moving object 5 has an external sensor and an internal sensor.
  • the external sensor is a sensor that detects information outside the moving object 5 and the robot 10 .
  • external sensors include LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), ToF (Time of Flight) cameras, stereo cameras, and the like.
  • the internal sensors include sensors that detect information inside the moving object 5 and the robot 10 .
  • the internal sensor includes an IMU (Inertial Measurement Unit), a GPS (Global Positioning System), and the like.
  • the sensors used for the external sensor and the internal sensor are not limited.
  • A depth sensor, a temperature sensor, an air pressure sensor, a laser ranging sensor, a contact sensor, an ultrasonic sensor, an encoder, a gyro, or the like may also be used.
  • Suppose that the robot 10 moves in the moving space 1.
  • In this case, the self-position estimated from the sensing of the IMU will not match the actual self-position of the robot 10, because values caused by the movement of the moving object 5 are also included.
  • Moreover, since the position relative to the moving object 5 must also be taken into account, erroneous recognition of the self-position cannot be eliminated simply by sharing the self-positions of the moving object 5 and the robot 10.
  • In the present embodiment, therefore, first movement information including the self-position and the movement vector of the moving object 5 is supplied to the robot 10.
  • The robot 10 calculates its own position based on the first movement information and second movement information including the self-position and the movement vector of the robot 10. This makes it possible to improve the reliability of the self-position with respect to the environment map.
  • Here, the movement vector refers to the direction, velocity, and acceleration of translational movement and rotational movement.
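  • As an illustration only, the movement information described above could be represented by a small data structure such as the following Python sketch; the class and field names are assumptions made for this example and do not appear in the publication.

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class MovementVector:
    """Direction, velocity, and acceleration of translational and rotational movement."""
    velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))          # m/s
    acceleration: np.ndarray = field(default_factory=lambda: np.zeros(3))      # m/s^2
    angular_velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))  # rad/s


@dataclass
class MovementInformation:
    """Self-position plus movement vector (first or second movement information)."""
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))          # m
    vector: MovementVector = field(default_factory=MovementVector)
```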
  • FIG. 2 is a block diagram showing a configuration example of the moving object 5 and the robot 10.
  • the moving object 5 has a relative positioning sensor 6a, an absolute positioning sensor 7a, and a self-position estimator 8a.
  • the robot 10 has a relative positioning sensor 6b, an absolute positioning sensor 7b, and a self-position estimator 8b.
  • a relative position is a position relative to the moving object 5 . That is, even if the moving object 5 moves, the relative position does not change.
  • the self-position obtained by an external sensor such as LiDAR is referred to as a relative position.
  • An absolute position is a position relative to the ground. That is, the absolute position changes as the moving object 5 (moving space 1) moves.
  • the self-position acquired by an internal sensor such as an IMU or GPS is referred to as an absolute position.
  • the relative positioning sensor 6a (6b) acquires information on the relative position with the outside.
  • the relative positioning sensor 6a (6b) is a LiDAR, a ToF camera, a stereo camera, or the like, and acquires external sensor information such as the distance (positional relationship) to a specific object and relative speed.
  • the external sensor information of the moving object 5 and the robot 10 is acquired by SLAM (Simultaneous Localization and Mapping) using an imaging device such as a camera.
  • the external sensor information acquired by the relative positioning sensor 6a (6b) is supplied to the self-position estimation unit 8a (8b).
  • Hereinafter, SLAM using an imaging device such as a camera is referred to as VSLAM (Visual SLAM).
  • the absolute positioning sensor 7a (7b) acquires information inside the moving object 5 and the robot 10.
  • the absolute positioning sensor 7a (7b) acquires internal sensor information such as the velocity, acceleration, and angular velocity of the moving object 5 and robot 10 .
  • the acquired internal sensor information of the moving object 5 and the robot 10 is supplied to the self-position estimator 8a (8b).
  • the self-position estimation unit 8a (8b) estimates the self-position of the moving object 5 and the robot 10 based on the external sensor information and the internal sensor information.
  • The self-position estimation unit 8b weights the external sensor and the internal sensor according to the movement state (first movement state) of the moving object 5 and the movement state (second movement state) of the robot 10.
  • the first moving state includes at least one state of moving, rotating, and stopping the moving object 5 .
  • the second moving state is the moving state and the stopped state of the robot 10 .
  • the moving states of the moving object 5 and the robot 10 are classified into the following conditions.
  • the moving object 5 moves and the robot 10 moves (condition 1).
  • the moving object 5 is moving, and the robot 10 is stationary in the air (condition 2A).
  • the moving object 5 moves, and the robot 10 stands still on the ground (in contact with the moving object 5) (condition 2B).
  • the moving object 5 stops and the robot 10 moves (condition 3).
  • the self-position estimation unit 8b determines the current movement states of the moving object 5 and the robot 10 based on the external sensor information and the internal sensor information. For example, the amount of movement of the robot 10 is determined from internal sensor information obtained from the IMU.
  • For example, in the case of condition 2A, the self-position estimation unit 8b reduces the weighting of the IMU in the sensor fusion process or increases the weighting of the VSLAM.
  • In the case of condition 2B, the self-position estimation unit 8b subtracts the movement vector of the moving object 5 from the movement vector of the robot 10 to estimate the self-position.
  • In this way, the self-position estimation unit 8b estimates the self-position by switching, according to each condition, between correcting the VSLAM positioning result and using the IMU result.
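  • The following Python sketch illustrates one way such condition-dependent weighting and vector subtraction could be combined; the numeric weights, the function name, and the argument layout are illustrative assumptions rather than the publication's actual processing.

```python
import numpy as np


def fuse_self_position(condition: str,
                       imu_estimate: np.ndarray,
                       vslam_estimate: np.ndarray,
                       object_displacement: np.ndarray) -> np.ndarray:
    """Blend IMU- and VSLAM-based position estimates according to the movement condition."""
    if condition == "2A":
        # moving object moving, robot stationary in the air:
        # reduce the IMU weighting / increase the VSLAM weighting
        w_imu, w_vslam = 0.2, 0.8
    elif condition == "2B":
        # moving object moving, robot resting on the moving object:
        # remove the component caused by the moving object from the IMU estimate
        imu_estimate = imu_estimate - object_displacement
        w_imu, w_vslam = 0.5, 0.5
    else:
        # conditions 1 and 3: keep the weights equal
        w_imu, w_vslam = 0.5, 0.5
    return w_imu * imu_estimate + w_vslam * vslam_estimate
```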
  • Different sensors may be used for the relative positioning sensor 6a (6b) and the absolute positioning sensor 7a (7b) mounted on the moving object 5 and the robot 10, respectively.
  • The self-position estimation unit 8b corresponds to the calculation unit that calculates the self-position of the own device based on the first movement information about the moving object and the second movement information about the own device, in accordance with the first movement state of the moving object and the second movement state of the own device that moves with the moving object.
  • FIG. 3 is a flowchart of self-position estimation of the robot 10.
  • In the sensor fusion, the weights of the IMU and the VSLAM are first set to be equal (step 101).
  • The self-position estimation unit 8b then estimates the self-position of the robot 10 from the internal sensor information obtained from the IMU (step 102). For example, the self-position estimation unit 8b uses dead reckoning or the like, integrating minute changes of internal sensors such as encoders (motor angle sensors, etc.) and gyros from the initial state, to estimate the self-position, i.e., the position and posture (orientation), of the robot 10.
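  • A minimal sketch of the dead-reckoning step described above is shown below; the two-dimensional state, the variable names, and the sampling period are assumptions for illustration.

```python
import numpy as np


def dead_reckoning(position: np.ndarray, heading: float,
                   wheel_speed: float, yaw_rate: float, dt: float):
    """Integrate small encoder/gyro increments from the previous pose.

    position: (x, y) in metres, heading: yaw in radians,
    wheel_speed: m/s from the encoders, yaw_rate: rad/s from the gyro.
    """
    heading = heading + yaw_rate * dt
    position = position + wheel_speed * dt * np.array([np.cos(heading), np.sin(heading)])
    return position, heading
```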
  • the self-position estimation unit 8b determines whether or not the movement amount of the robot 10 is 0 from the IMU data (step 103).
  • If the movement amount is 0 (YES in step 103), condition 2A or condition 4 is assumed.
  • Next, the self-position estimation unit 8b estimates the self-position of the robot 10 from the external sensor information obtained from the VSLAM (step 104). For example, the self-position estimation unit 8b measures the positions of known landmarks on the map using the VSLAM and estimates the current position of the robot 10 by star reckoning or the like.
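  • As a rough illustration of the landmark-based (star-reckoning) measurement mentioned above, the sketch below averages the positions implied by range-and-bearing observations of known landmarks; the observation format and the function name are assumptions.

```python
import numpy as np


def position_from_landmarks(landmarks: dict, observations: dict) -> np.ndarray:
    """Estimate the robot position from observations of known map landmarks.

    landmarks:    id -> (x, y) landmark position on the map
    observations: id -> (range [m], bearing [rad]) measured from the robot,
                  with the bearing expressed in the map frame for simplicity.
    """
    estimates = []
    for landmark_id, (rng, bearing) in observations.items():
        lx, ly = landmarks[landmark_id]
        # the robot lies 'rng' metres from the landmark, opposite to the bearing
        estimates.append([lx - rng * np.cos(bearing), ly - rng * np.sin(bearing)])
    return np.mean(np.array(estimates), axis=0)
```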
  • the self-position estimation unit 8b determines whether or not the amount of movement acquired from the VSLAM is 0 (step 105).
  • If the movement amount acquired from the VSLAM is 0 (YES in step 105), condition 2A is assumed.
  • In this case, the self-position estimation unit 8b reduces the weighting of the IMU in the sensor fusion process or increases the weighting of the VSLAM (step 106).
  • the self-position estimation unit 8b performs sensor fusion processing to estimate the self-position of the robot 10 (step 107).
  • If the movement amount acquired from the VSLAM is not 0 (NO in step 105), condition 4 is assumed. In this case, the process returns to step 102.
  • If the movement amount of the robot 10 is not 0 (NO in step 103), condition 1, condition 2B, or condition 3 is assumed.
  • the self-position estimation unit 8b determines whether or not the wheels (encoders) of the robot 10 are rotating (step 108).
  • If the wheels are rotating (YES in step 108), condition 1 or condition 3 is assumed.
  • the self-position estimation unit 8b performs the process of step 107.
  • If there is no wheel rotation (NO in step 108), condition 2B is assumed. In this case, the self-position estimation unit 8b receives the IMU data of the moving object 5 from the self-position estimation unit 8a (step 109).
  • The self-position estimation unit 8b then subtracts the movement vector of the moving object 5 from the movement vector of the robot 10 (step 110). Thereafter, the self-position estimation unit 8b performs sensor fusion processing to estimate the self-position of the robot 10 (step 107).
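  • Putting the flow of FIG. 3 together, a compact sketch of the branch logic of steps 101 to 110 might look as follows; the threshold, the helper names, and the way the sensor data are passed in are assumptions made for illustration, not the publication's implementation.

```python
import numpy as np

EPS = 1e-3  # assumed threshold for "movement amount is 0"


def estimate_self_position(imu_delta: np.ndarray,        # robot motion from the IMU (step 102)
                           vslam_delta: np.ndarray,      # robot motion from the VSLAM (step 104)
                           wheels_rotating: bool,        # encoder check (step 108)
                           object_imu_delta: np.ndarray, # moving object's IMU data (step 109)
                           previous_position: np.ndarray) -> np.ndarray:
    w_imu, w_vslam = 0.5, 0.5                  # step 101: equal weights
    robot_delta = imu_delta.copy()

    if np.linalg.norm(imu_delta) < EPS:        # step 103: IMU movement amount is 0?
        if np.linalg.norm(vslam_delta) < EPS:  # step 105 -> condition 2A
            w_imu, w_vslam = 0.2, 0.8          # step 106: reduce IMU / increase VSLAM weight
        else:                                  # condition 4: measure again (return to step 102)
            return previous_position
    elif not wheels_rotating:                  # step 108 -> condition 2B
        robot_delta = imu_delta - object_imu_delta  # step 110: subtract the object's vector

    # step 107: sensor fusion of the two positional increments
    fused_delta = w_imu * robot_delta + w_vslam * vslam_delta
    return previous_position + fused_delta
```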
  • As described above, the robot 10 according to the present embodiment calculates the self-position of the robot 10 based on the first movement information about the moving object 5 and the second movement information about the robot 10, in accordance with the first movement state of the moving object 5 and the second movement state of the robot 10 that moves with the moving object 5. This makes it possible to improve detection accuracy.
  • This technology automatically switches the positioning sensor priority between the absolute coordinate system and the local coordinate system according to the moving state of the moving object and the robot.
  • In the above description, the self-position of the robot 10 is estimated according to the movement state of the moving object 5.
  • In addition to this, a camera mounted on the robot 10 may be controlled.
  • FIG. 4 is a schematic diagram showing a robot that captures images of the inside of the moving space 1 of the moving object 5.
  • Here, a moving object 5 that vibrates as it moves, such as a ship or a train, is taken as an example.
  • The robot 10 has wheels and a camera, and is capable of traveling and capturing an image of the subject 20. In this case, because of the vibrations of the moving object 5, the robot 10 is affected by vibrations other than those caused by the movement of the robot 10 itself.
  • When the robot 10 captures an image of a subject (not shown) outside the moving object 5, gimbal and camera-shake correction is performed based on the internal sensor information acquired from the IMU mounted on the robot 10.
  • However, when the robot is inside the moving object 5 (within the moving space 1), if the subject 20 is imaged while the vibration of the moving object 5 is being canceled out, the subject 20 appears shaken in the captured image.
  • the robot 10 includes an imaging correction unit that matches the vibration system of the subject 20 (the vibration system of the moving object) and the vibration system of the robot 10 .
  • the imaging correction unit determines whether the subject 20 and the robot 10 are present in the moving space 1 based on the external world sensor information and internal world sensor information acquired from the relative positioning sensor 6b and the absolute positioning sensor 7b. Further, when the robot 10 is present in the moving space 1, the imaging correction unit performs control to match the vibration system of the object 20 and the vibration system of the robot 10.
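  • A minimal sketch of how the imaging correction unit might switch the stabilization target is shown below; the interface (the vibration signals and the in-moving-space flag) is an assumption for illustration, not the publication's actual control.

```python
import numpy as np


def stabilization_target(robot_vibration: np.ndarray,
                         object_vibration: np.ndarray,
                         subject_in_moving_space: bool) -> np.ndarray:
    """Return the vibration component the gimbal / camera-shake correction should cancel.

    robot_vibration:  vibration measured by the robot's IMU.
    object_vibration: vibration caused by the moving object (e.g. shared from its sensors).
    """
    if subject_in_moving_space:
        # the subject vibrates together with the moving object, so cancel only the
        # robot's own vibration; the camera then stays matched to the moving
        # object's vibration system and the subject is not imaged as shaking
        return robot_vibration - object_vibration
    # the subject is outside the moving object: cancel everything the IMU measures
    return robot_vibration
```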
  • In the above embodiment, the self-position was estimated in the moving space of a moving object such as a ship or a train.
  • the robot 10 may be used for various purposes, and may have a configuration necessary therefor.
  • For example, the robot may be a body intended for onboard sales in vehicles such as bullet trains and airplanes.
  • a detection unit may be provided for detecting, recognizing, and tracking obstacles around the robot, and detecting the distance to the obstacle. This makes it possible to save manpower and reduce the risk of infection.
  • the robot 10 may be a body intended to patrol inside a building having an escalator or the like. That is, by accurately estimating the self-position, it becomes possible to move to a place where the robot itself cannot move by using a machine having driving capability other than the robot, such as an escalator. For example, even in situations where the environment map changes significantly, such as when a robot moves from a station platform to a train, it is possible to accurately estimate its own position.
  • In the above embodiment, the movement information included the self-position and the movement vector.
  • The movement information is not limited to this, and may include various information about the moving object and the robot. For example, a current value of a rotor used for a propeller or the like, a voltage value of the rotor, and a rotational speed value of an ESC (Electronic Speed Controller) may be included.
  • the movement information may also include information that prevents movement. For example, obstacle information existing in the moving direction of the robot and disturbance information such as wind may be included.
  • the first movement information was acquired by the external sensor and the internal sensor mounted on the moving object 5.
  • the first movement information may be acquired by any method without being limited to this.
  • In the above embodiment, the self-position estimation units 8a and 8b are mounted on the moving object 5 and the robot 10, respectively.
  • the present invention is not limited to this, and the self-position estimation unit may be installed in an external information processing device.
  • the information processing device has an acquisition unit that acquires first movement information about the moving object and second movement information about the robot.
  • The self-position estimation unit estimates the self-positions of the moving object 5 and the robot 10 based on the first movement information and the second movement information acquired according to the first movement state and the second movement state.
  • The information processing device may also have a determination unit that determines the first movement state and the second movement state based on the sensor information acquired by the relative positioning sensors 6a (6b) and the absolute positioning sensors 7a (7b).
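  • If the self-position estimation were placed in an external information processing device as described, its structure might be sketched as follows; the class and method names (acquisition unit, determination unit, etc.) are hypothetical and only illustrate the division of roles.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class MovementInformation:
    position: np.ndarray  # self-position
    vector: np.ndarray    # movement vector, treated here as displacement per control period


class ExternalInformationProcessor:
    """Receives movement information from the moving object and the robot."""

    def acquire(self, object_info: MovementInformation,
                robot_info: MovementInformation) -> None:
        # acquisition unit: store the first and second movement information
        self.object_info, self.robot_info = object_info, robot_info

    def determine_states(self, eps: float = 1e-3) -> tuple:
        # determination unit: classify the first and second movement states
        first = "moving" if np.linalg.norm(self.object_info.vector) > eps else "stopped"
        second = "moving" if np.linalg.norm(self.robot_info.vector) > eps else "stopped"
        return first, second

    def estimate_robot_position(self) -> np.ndarray:
        # calculation unit: when the moving object moves and the robot rests on it,
        # subtract the object's movement from the robot's accumulated position
        first, second = self.determine_states()
        if first == "moving" and second == "stopped":
            return self.robot_info.position - self.object_info.vector
        return self.robot_info.position
```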
  • FIG. 5 is a block diagram showing a hardware configuration example of the information processing device.
  • the information processing device includes a CPU 50, a ROM 51, a RAM 52, an input/output interface 54, and a bus 53 that connects these to each other.
  • a display unit 55, an input unit 56, a storage unit 57, a communication unit 58, a drive unit 59, and the like are connected to the input/output interface 54.
  • the display unit 55 is a display device using liquid crystal, EL, or the like, for example.
  • the input unit 56 is, for example, a keyboard, pointing device, touch panel, or other operating device. When input unit 56 includes a touch panel, the touch panel can be integrated with display unit 55 .
  • the storage unit 57 is a non-volatile storage device, such as an HDD, flash memory, or other solid-state memory.
  • the drive unit 59 is a device capable of driving a removable recording medium 60 such as an optical recording medium or a magnetic recording tape.
  • the communication unit 58 is a modem, router, or other communication equipment for communicating with other devices that can be connected to a LAN, WAN, or the like.
  • the communication unit 58 may use either wired or wireless communication.
  • the communication unit 58 is often used separately from the information processing device. In this embodiment, the communication unit 58 enables communication with other devices via the network.
  • Information processing by the information processing apparatus having the hardware configuration as described above is realized by cooperation between software stored in the storage unit 57 or the ROM 51 or the like and the hardware resources of the information processing apparatus.
  • the control method according to the present technology is realized by loading a program constituting software stored in the ROM 51 or the like into the RAM 52 and executing the program.
  • the program is installed in the information processing device via the recording medium 60, for example.
  • the program may be installed in the information processing device via a global network or the like.
  • any computer-readable non-transitory storage medium may be used.
  • The information processing method and the program according to the present technology may also be executed by linking a computer installed in a communication terminal with another computer that can communicate via a network or the like, and the signal processing unit according to the present technology may thereby be constructed.
  • the information processing apparatus, information processing method, and program according to the present technology can be executed not only in a computer system configured by a single computer, but also in a computer system in which a plurality of computers operate in conjunction.
  • a system means a set of multiple components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device housing a plurality of modules within a single housing, are both systems.
  • Execution of the information processing device, the information processing method, and the program according to the present technology by a computer system includes, for example, both the case where the self-position estimation is performed by a single computer and the case where each process is performed by a different computer. Execution of each process by a predetermined computer also includes causing another computer to execute part or all of the process and obtaining the result.
  • The information processing device, the information processing method, and the program according to the present technology can also be applied to a cloud computing configuration in which a single function is shared by a plurality of devices via a network and processed jointly.
  • the present technology can also adopt the following configuration.
  • (1) An information processing apparatus comprising: a calculation unit that calculates the self-position of the own device based on first movement information about a moving object and second movement information about the own device, in accordance with a first movement state of the moving object and a second movement state of the own device that moves with the moving object.
  • (2) The information processing apparatus, wherein the first movement information includes the self-position of the moving object and the movement vector of the moving object, and the second movement information includes the self-position of the own device and the movement vector of the own device.
  • (3) The information processing apparatus, wherein the first movement state includes at least one of movement, rotation, and stoppage of the moving object, and the second movement state includes movement and stoppage of the own device.
  • (4) The information processing apparatus, wherein the calculation unit calculates the self-position of the own device by subtracting the movement vector of the moving object from the movement vector of the own device.
  • (5) The information processing apparatus, wherein the first movement information is acquired by an external sensor and an internal sensor mounted on the moving object, and the second movement information is acquired by an external sensor and an internal sensor mounted on the own device.
  • (6) The information processing apparatus, wherein the own device is a mobile object capable of flying, and the calculation unit calculates the self-position of the own device by increasing or reducing the weighting of the internal sensor mounted on the own device when the moving object is moving and the own device is stationary in the air.
  • (7) The information processing apparatus according to (5), wherein the external sensor includes at least one of a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) camera, or a stereo camera.
  • (8) The information processing apparatus, wherein the internal sensor includes at least one of an IMU (Inertial Measurement Unit) or a GPS (Global Positioning System).
  • (9) The information processing apparatus, further comprising: an imaging correction unit that controls the external sensor based on a vibration system of the moving object and a vibration system of the own device when the own device is in contact with the moving object.
  • (10) The information processing apparatus according to (9), wherein the imaging correction unit performs control to match the vibration system of the moving object with the vibration system of the own device when capturing an image of a subject grounded on the moving object.
  • (11) An information processing method executed by a computer system, comprising: calculating the self-position of the own device based on first movement information about a moving object and second movement information about the own device, in accordance with a first movement state of the moving object and a second movement state of the own device that moves with the moving object.
  • (12) A program that causes a computer system to execute a step of calculating the self-position of the own device based on first movement information about a moving object and second movement information about the own device, in accordance with a first movement state of the moving object and a second movement state of the own device that moves with the moving object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

One embodiment of the present technology relates to an information processing device provided with a calculation unit. The calculation unit calculates the position of the own device, which moves with a moving object, on the basis of first movement information relating to the moving object and second movement information relating to the own device, in accordance with a first movement state of the moving object and a second movement state of the own device. This makes it possible to improve detection accuracy. In addition, it is possible to improve the accuracy and reliability of the position of the own device. Since no discrepancy develops with respect to the position of the own device in a movement space, it is possible, even for a drone flying in the air, to avoid collision with an obstacle in the movement space.
PCT/JP2022/032009 2021-10-19 2022-08-25 Information processing device, information processing method, and program WO2023067892A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021170693 2021-10-19
JP2021-170693 2021-10-19

Publications (1)

Publication Number Publication Date
WO2023067892A1 true WO2023067892A1 (fr) 2023-04-27

Family

ID=86058979

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032009 WO2023067892A1 (fr) 2021-10-19 2022-08-25 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2023067892A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01211408A (ja) * 1988-02-18 1989-08-24 Yanmar Agricult Equip Co Ltd Crop row detection device for agricultural work machine
WO2021177139A1 (fr) * 2020-03-06 2021-09-10 ソニーグループ株式会社 Information processing method, information processing device, and program
JP2021144644A (ja) * 2020-03-13 2021-09-24 三菱電機株式会社 Moving body system and moving body control device


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22883204

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE