WO2019146551A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2019146551A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
required time
processing
calculation unit
size
Prior art date
Application number
PCT/JP2019/001696
Other languages
English (en)
Japanese (ja)
Inventor
中川 宏
山田 和宏
陽平 大野
雄一朗 瀬川
Original Assignee
株式会社Nttドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Nttドコモ filed Critical 株式会社Nttドコモ
Priority to JP2019567059A priority Critical patent/JP6957651B2/ja
Publication of WO2019146551A1 publication Critical patent/WO2019146551A1/fr

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions

Definitions

  • the present invention relates to a technique for estimating the time required for processing using a flying object.
  • Patent Document 1 describes generating a map in which a plurality of cells are arranged over a work area, calculating the area S of the work area based on the map, and calculating the required work time from the calculated area S.
  • The NDVI (Normalized Difference Vegetation Index) is calculated with different accuracy depending on the flight conditions of the flying object. For example, the higher the flight speed of the flying object, the worse the accuracy of the calculated NDVI; likewise, the higher the flight altitude, the worse the accuracy of the calculated NDVI.
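  • The patent text does not spell out the formula, but NDVI is conventionally computed per pixel from the near-infrared and red reflectance bands as (NIR - Red) / (NIR + Red). A minimal illustrative sketch of that standard calculation (the function name and the use of NumPy are assumptions, not taken from the patent):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red).

    Conventional definition; illustrative only, not from the patent.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Leave NDVI at 0 where both bands are zero to avoid dividing by zero.
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out
```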
  • An information processing apparatus comprising: a specification unit that specifies a target accuracy, which is a target value of the accuracy of processing performed by a flying object on a processing target area, and the size of the processing target area; and a calculation unit that calculates the time required for the flying object to perform the processing with the specified target accuracy on the processing target area of the specified size.
  • The processing is performed based on imaging of the ground by the flying object, and the calculation unit may calculate the required time based on the size of the range covered by one imaging operation.
  • the calculation unit may change the size of the effective range in the captured image according to the condition.
  • the calculation unit may change the size of the effective range in accordance with the amount of light at the time of imaging or imaging timing.
  • the calculation unit may correct the required time according to the light amount at the time of imaging or the imaging time.
  • the calculation unit may compare the lower limit of the required time according to the execution of the calibration for the process with the target required time, and generate information according to the comparison result.
  • According to the present invention, the time required for the processing can be known.
  • FIG. 1 is a diagram showing an example of the configuration of a flight control system 1.
  • FIG. 2 is a view showing an example of the appearance of a flying object 10.
  • FIG. 3 is a diagram showing the hardware configuration of the flying object 10.
  • FIG. 4 is a diagram showing the hardware configuration of a server device 20.
  • FIG. 1 is a diagram showing an example of the configuration of a flight control system 1.
  • the flight control system 1 is a system that controls the flight of the flying object 10.
  • the flight control system 1 includes a plurality of aircraft 10 and a server device 20.
  • the airframe 10 and the server device 20 can communicate with each other via a network.
  • The flying object 10 performs a process of imaging plants on a processing target area such as a field.
  • The server device 20 is an example of the information processing apparatus according to the present invention, and performs processing to calculate the Normalized Difference Vegetation Index (NDVI) from the spectral reflectance characteristics of plants in the processing target area using the imaging results of the flying object 10.
  • The server device 20 also performs processing to calculate the time required for that processing.
  • FIG. 2 is a view showing an example of the appearance of the flying object 10.
  • the flying object 10 is, for example, a so-called drone, and includes a propeller 101, a drive device 102, and a battery 103.
  • the propeller 101 rotates about an axis. As the propeller 101 rotates, the flying object 10 flies.
  • the driving device 102 powers and rotates the propeller 101.
  • the drive device 102 includes, for example, a motor and a transmission mechanism that transmits the power of the motor to the propeller 101.
  • the battery 103 supplies power to each part of the aircraft 10 including the drive device 102.
  • FIG. 3 is a diagram showing the hardware configuration of the aircraft 10.
  • the flying object 10 is physically configured as a computer device including a processor 11, a memory 12, a storage 13, a communication device 14, a positioning device 15, an imaging device 16, a beacon device 17, a bus 18, and the like.
  • the term “device” can be read as a circuit, a device, a unit, or the like.
  • the processor 11 operates an operating system, for example, to control the entire computer.
  • the processor 11 may be configured by a central processing unit (CPU) including an interface with a peripheral device, a control device, an arithmetic device, a register, and the like.
  • the processor 11 reads a program (program code), a software module or data from the storage 13 and / or the communication device 14 to the memory 12 and executes various processing according to these.
  • As the program, a program that causes a computer to execute at least a part of the operations of the flying object 10 is used.
  • the various processes performed in the aircraft 10 may be performed by one processor 11 or may be performed simultaneously or sequentially by two or more processors 11.
  • the processor 11 may be implemented by one or more chips.
  • the program may be transmitted from the network via a telecommunication line.
  • The memory 12 is a computer-readable recording medium, and may be configured of, for example, at least one of a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory).
  • the memory 12 may be called a register, a cache, a main memory (main storage device) or the like.
  • the memory 12 can store a program (program code), a software module, and the like that can be executed to implement the flight control method according to the embodiment of the present invention.
  • The storage 13 is a computer-readable recording medium, and may be, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, or the like.
  • the storage 13 may be called an auxiliary storage device.
  • the communication device 14 is hardware (transmission / reception device) for performing communication between computers via a wired and / or wireless network, and is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the positioning device 15 measures the three-dimensional position of the aircraft 10.
  • the positioning device 15 is, for example, a GPS (Global Positioning System) receiver, and measures the current position of the aircraft 10 based on GPS signals received from a plurality of satellites.
  • the imaging device 16 captures an image around the flying object 10.
  • the imaging device 16 is, for example, a camera, and captures an image by forming an image on an imaging element using an optical system.
  • the imaging device 16 captures an image of a predetermined range, for example, below the flying object 10.
  • the beacon device 17 transmits a beacon signal of a predetermined frequency, and also receives a beacon signal transmitted from another flying object 10.
  • the reach of this beacon signal is a predetermined distance such as 100 m.
  • The beacon signal includes flying object identification information that identifies the flying object 10 transmitting it. This identification information is used to prevent the flying objects 10 from colliding with one another.
  • the devices such as the processor 11 and the memory 12 described above are connected by a bus 18 for communicating information.
  • the bus 18 may be configured as a single bus or may be configured as different buses among the devices.
  • FIG. 4 is a diagram showing a hardware configuration of the server device 20.
  • the server device 20 is physically configured as a computer device including a processor 21, a memory 22, a storage 23, a communication device 24, a bus 25 and the like.
  • the processor 21, the memory 22, the storage 23, the communication device 24, and the bus 25 are similar to the processor 11, the memory 12, the storage 13, the communication device 14, and the bus 18 described above, and thus the description thereof is omitted.
  • FIG. 5 is a diagram showing an example of a functional configuration of the server device 20.
  • Each function of the server device 20 is realized by loading predetermined software (a program) onto hardware such as the processor 21 and the memory 22, having the processor 21 perform operations, and controlling communication by the communication device 24 as well as the reading and/or writing of data in the memory 22 and the storage 23.
  • the tracking unit 200 records the flying object identification information of the flying object 10 under control of the server device 20 and the flight status thereof.
  • the flight status includes the position where the flying object 10 is flying and the date and time at that position.
  • the tracking unit 200 records position information and date and time information notified from the aircraft 10.
  • the tracking unit 200 determines whether the position information and the date and time information are within a previously planned flight plan, and records the determination result.
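  • As an illustration of such a check (the plan representation, tolerances, and names below are hypothetical; the patent only says the determination result is recorded), each notified position and date/time could be compared against the corresponding planned waypoint:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Fix:
    lat: float   # degrees
    lon: float   # degrees
    alt_m: float
    at: datetime

def within_plan(notified: Fix, planned: Fix,
                tolerance_m: float = 10.0, tolerance_s: float = 30.0) -> bool:
    """True if a notified fix is close enough, in space and time, to the
    planned waypoint. Uses a crude flat-earth distance, which is adequate
    over field-sized distances."""
    dlat_m = (notified.lat - planned.lat) * 111_000.0
    dlon_m = (notified.lon - planned.lon) * 111_000.0
    dalt_m = notified.alt_m - planned.alt_m
    distance_ok = (dlat_m**2 + dlon_m**2 + dalt_m**2) ** 0.5 <= tolerance_m
    time_ok = abs((notified.at - planned.at).total_seconds()) <= tolerance_s
    return distance_ok and time_ok
```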
  • the input unit 201 inputs, to the server device 20, the target accuracy for the processing performed on the field which is the processing target area by the flying object 10 and the size of the processing target area.
  • the target accuracy here is, for example, a value predetermined as the accuracy of the NDVI.
  • the input unit 201 inputs image data indicating an image captured by the flying object 10 to the server device 20.
  • the identifying unit 202 identifies, based on the information input by the input unit 201, the target accuracy in the process performed on the processing target area by the aircraft 10 and the size of the processing target area.
  • The calculation unit 203 calculates the time required for the flying object 10 to perform the imaging process so that the target accuracy specified by the identifying unit 202 can be realized for the processing target area of the size specified by the identifying unit 202. This process is performed based on imaging of the ground (the field) by the flying object 10, and the calculation unit 203 calculates the required time based on the size of the effective range in the captured image. In addition, the calculation unit 203 changes the size of the effective range according to conditions; more specifically, it determines the size of the effective range from the amount of light at the time of imaging. The larger the amount of light at the time of imaging, the larger the effective range, and the smaller the amount of light, the smaller the effective range.
  • FIG. 6 is a diagram for explaining this effective range.
  • the lens used when the flying object 10 captures an image for NDVI calculation is a fisheye lens.
  • The image captured by this fisheye lens is a two-dimensional circular image, and the NDVI is calculated from the spectral reflectance characteristics of plants in this image. Since the NDVI calculation result varies with the elevation angle at the time of imaging, the result changes significantly at the edges of the captured image in particular; for this reason, only the effective range Pu, a central portion of the imaging range P, is used.
  • the effective range varies depending on the flight altitude of the flying object 10. That is, when the flight altitude is high, the imaging range P in one imaging becomes wide, and as a result, the effective range Pu also widens. On the other hand, when the flight altitude is low, the imaging range P in one imaging also becomes narrow, and as a result, the effective range Pu also becomes narrow.
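  • The patent leaves the geometry unspecified, but under a simple model where the camera looks straight down with a fixed angular field of view, the footprint P grows linearly with altitude, and Pu is a central fraction of P that grows with the amount of light. A sketch under those stated assumptions (the field-of-view angle and scale factors are illustrative):

```python
import math

def imaging_range_radius_m(altitude_m: float, half_fov_deg: float = 60.0) -> float:
    """Radius of the ground footprint P of one image, for a camera looking
    straight down with a fixed angular half field of view (assumption)."""
    return altitude_m * math.tan(math.radians(half_fov_deg))

def effective_range_radius_m(altitude_m: float, light_amount: float) -> float:
    """Radius of the effective range Pu: a central fraction of P that grows
    with the normalized light amount in [0, 1] (scale factors assumed)."""
    light_amount = max(0.0, min(1.0, light_amount))
    central_fraction = 0.4 + 0.3 * light_amount
    return imaging_range_radius_m(altitude_m) * central_fraction
```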
  • the calculation unit 203 calculates the required time based on the size of the range covered by one imaging.
  • When calculating the NDVI of the field which is the processing target area, it is desirable to calculate the NDVI from the spectral reflectance characteristics of plants over a range amounting to a predetermined ratio (for example, 10%) of the size of the whole area.
  • Depending on the size of the effective range, the number of imaging operations also differs. Specifically, when the effective range in the captured image is large (larger than a certain value), the number of imaging operations decreases, and when the effective range is small (smaller than a certain value), the number of imaging operations increases.
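  • Combining the 10% coverage ratio from the text with a circular approximation of Pu (the approximation is an assumption), the number of imaging operations can be estimated as the area that must be sampled divided by the area of one effective range:

```python
import math

def number_of_images(field_area_m2: float, effective_radius_m: float,
                     coverage_ratio: float = 0.10) -> int:
    """Images needed so that the effective ranges together cover the
    predetermined ratio (for example 10%) of the whole field."""
    pu_area_m2 = math.pi * effective_radius_m ** 2
    return math.ceil(coverage_ratio * field_area_m2 / pu_area_m2)
```

  For example, under these assumptions a 100,000 m² field with a 20 m effective radius needs ceil(10,000 / 1,257) = 8 images.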
  • The generation unit 204 generates information on the size of the processing target area that can be processed within the upper limit of the target required time; for example, it generates the ratio of the size of the processing target area that can be processed within that upper limit to the size of the entire processing target area.
  • From this information, the degree of completion of processing over the entire processing target area can be estimated.
  • the evaluation value calculation unit 206 calculates NDVI from the spectral reflection characteristics of the plants in the image based on the image data input by the input unit 201.
  • the output unit 205 outputs the required time calculated by the calculation unit 203, the information generated by the generation unit 204, or the NDVI calculated by the evaluation value calculation unit 206.
  • In the flying object 10, processing is executed by loading predetermined software (a program) onto hardware such as the processor 11 and the memory 12, having the processor 11 perform operations, and controlling communication by the communication device 14 and the reading and/or writing of data in the memory 12 and the storage 13. The same applies to the server device 20.
  • FIG. 8 is a flowchart showing an example of the required time calculation operation of the server device 20.
  • the input unit 201 inputs, to the server device 20, the target accuracy for the process performed on the field which is the processing target area by the flying object 10 and the size of the processing target area (step S11). This input may be performed by an input device connected to the server device 20 or may be performed by a remote control device of the aircraft 10.
  • the identifying unit 202 identifies, based on the information input by the input unit 201, the target accuracy for the process performed by the flying object 10 on the processing target area and the size of the processing target area (step S12).
  • The calculation unit 203 calculates, as described below, the time required for the flying object 10 to perform the imaging process so as to achieve the specified target accuracy for the processing target area of the specified size (step S13).
  • The accuracy A is expressed by a function f of the flight altitude h of the flying object 10, as in the following equation:
  • A = f(h)
  • This function f is designed such that the accuracy A becomes lower as the flight altitude h of the flying object 10 becomes higher.
  • The required time T is expressed by a function g of the flight altitude h and the flight speed v of the flying object 10, as in the following equation:
  • T = g(h, v)
  • the flight speed v is a velocity at which the flying object 10 moves between the respective imaging positions.
  • the function g may include the area and shape of the processing target area (field), the effective range Pu, the photographing stop time (time to temporarily stop at the time of photographing), the number of times of photographing, etc. as variables.
  • The flight altitude h is determined from the target accuracy A by the function f.
  • From the flight altitude h, the imaging range to be covered by the flying object 10 in one imaging operation is determined.
  • A flight plan including various flight conditions is determined based on this single imaging range and the size of the processing target area.
  • The flight plan determines the time T required for the flying object 10 to perform the processing. Therefore, as shown in FIG. 7, the required time T becomes long when imaging is performed at a low flight altitude with a dense flight path over the processing target area, and conversely, the required time T becomes short when the flight altitude is high and the flight path is sparse.
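  • The patent leaves f and g abstract, stating only that A falls as h rises and that g may take the field's area and shape, the effective range Pu, the imaging stop time, and the number of shots as variables. A sketch under illustrative assumptions (the particular decreasing form of f and the additive form of g are chosen purely for illustration):

```python
def accuracy(altitude_m: float) -> float:
    """A = f(h): an assumed form that decreases as altitude increases."""
    return 1.0 / (1.0 + 0.01 * altitude_m)

def altitude_for_accuracy(target_accuracy: float) -> float:
    """Invert f to obtain the flight altitude h that yields accuracy A."""
    return (1.0 / target_accuracy - 1.0) / 0.01

def required_time_s(path_length_m: float, flight_speed_mps: float,
                    n_images: int, stop_time_s: float = 2.0) -> float:
    """T = g(h, v): travel time along the flight path between imaging
    positions plus the stop time at each imaging position. The path
    length depends on h through the effective range Pu, which sets how
    dense the flight path over the field must be."""
    return path_length_m / flight_speed_mps + n_images * stop_time_s
```

  • Under this sketch, a lower altitude (demanded by a higher target accuracy) shrinks Pu, which increases both the number of images and the path length, and therefore increases T, matching the behavior described above.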
  • If the required time T calculated by the calculation unit 203 exceeds the upper limit of the target required time (step S14; NO), the generation unit 204 determines the size Sl of the processing target area that can be processed within that upper limit.
  • The generation unit 204 generates (Sl / S) × 100 (%) as information on the size of the processing target area that can be processed within the upper limit, where S is the size of the entire processing target area.
  • The output unit 205 outputs the required time T calculated by the calculation unit 203 or the information ((Sl / S) × 100 (%)) generated by the generation unit 204 (step S16).
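  • A sketch of this step S14 to step S16 decision (treating the required time as roughly proportional to the processed area is an assumption used here to obtain Sl):

```python
def report(required_time_s: float, upper_limit_s: float,
           whole_area_m2: float) -> str:
    """Output the required time T, or, when T exceeds the upper limit of
    the target required time, the ratio (Sl / S) x 100 (%) of the area Sl
    processable within the limit to the whole area S."""
    if required_time_s <= upper_limit_s:
        return f"required time: {required_time_s:.0f} s"
    # Assumption: time scales roughly linearly with the processed area.
    sl_m2 = whole_area_m2 * upper_limit_s / required_time_s
    return f"processable within limit: {sl_m2 / whole_area_m2 * 100:.0f} % of the area"
```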
  • Modifications: The present invention is not limited to the embodiment described above, and may be modified as in the following examples.
  • the calculation unit 203 may compare the lower limit of the required time according to the execution of the calibration for the process with the target required time, and generate information according to the comparison result. This makes it possible to determine whether the processing can be completed within the target time when calibration is performed. Specifically, before the imaging process, for example, an object of a predetermined color such as a white board is imaged to perform calibration of the imaging apparatus. By this calibration, the lower limit of the time required for the aircraft 10 to perform processing at a certain flight speed and flight altitude with a predetermined target accuracy can be determined. The calculation unit 203 compares the lower limit of the required time according to the execution of the calibration for the process with the target required time, and generates information (for example, the ratio of the former to the latter) according to the comparison result.
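  • A sketch of this comparison (treating calibration as a fixed additive overhead on the lower limit is an assumption; the ratio of the former to the latter comes from the text):

```python
def calibration_ratio(min_processing_time_s: float,
                      calibration_time_s: float,
                      target_time_s: float) -> float:
    """Compare the lower limit of the required time when calibration is
    performed with the target required time. A returned ratio <= 1.0
    means the processing can still finish within the target time."""
    lower_limit_s = min_processing_time_s + calibration_time_s
    return lower_limit_s / target_time_s
```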
  • the calculation unit 203 may change the effective range Pu in accordance with the imaging timing.
  • Modification 4: The process performed by the flying object is not limited to the imaging of a field or the like. Further, each equation described above is merely an example, and a constant or a coefficient may optionally be added to any of them.
  • Each functional block may be realized by one device that is physically and/or logically coupled, or by two or more physically and/or logically separated devices that are connected directly and/or indirectly (for example, by wire and/or wirelessly).
  • at least a part of the functions of the server device 20 may be implemented on the aircraft 10.
  • at least part of the functions of the aircraft 10 may be implemented on the server device 20.
  • Each aspect/embodiment described in this specification may be applied to LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-Wide Band), Bluetooth (registered trademark), systems utilizing other appropriate systems, and/or next-generation systems extended based on these.
  • The terms "system" and "network" as used herein are used interchangeably.
  • radio resources may be indexed.
  • The terms "determining" and "deciding" as used herein may encompass a wide variety of operations. For example, "determining" and "deciding" may include judging, calculating, computing, processing, deriving, investigating, looking up (for example, searching in a table, a database, or another data structure), and ascertaining. Furthermore, "determining" and "deciding" may include receiving (for example, receiving information), transmitting (for example, transmitting information), inputting, outputting, and accessing (for example, accessing data in a memory).
  • The present invention may be provided as a flight control method or an information processing method including the processing steps performed in the flight control system 1 or the server device 20. The present invention may also be provided as a program executed on the flying object 10 or the server device 20. Such a program may be provided in a form recorded on a recording medium such as an optical disk, or in a form downloaded to a computer via a network such as the Internet, then installed and made available for use.
  • Software, instructions, etc. may be sent and received via a transmission medium.
  • When software is transmitted from a website, a server, or another remote source using wired technology such as coaxial cable, fiber optic cable, twisted pair, and digital subscriber line (DSL) and/or wireless technology such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
  • Data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or photons, or any combination of these.
  • the channels and / or symbols may be signals.
  • the signal may be a message.
  • the component carrier (CC) may be called a carrier frequency, a cell or the like.
  • Any reference to an element using a designation such as "first" or "second" as used herein does not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be employed, or that the first element must in some way precede the second element.
  • each device described above may be replaced with a “unit”, a “circuit”, a “device” or the like.
  • 1: Flight control system, 10: Flying object, 20: Server device, 21: Processor, 22: Memory, 23: Storage, 24: Communication device, 25: Bus, 200: Tracking unit, 201: Input unit, 202: Identification unit, 203: Calculation unit, 204: Generation unit, 205: Output unit, 206: Evaluation value calculation unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention addresses the problem of knowing the time required for processing when the processing is performed with a required accuracy by a flying object. According to the present invention, an example of the processing device is a server device (20) that performs processing to calculate the Normalized Difference Vegetation Index (NDVI) from the spectral reflectance properties of vegetation using imaging results from a flying object (10). In addition, the server device (20) performs processing to calculate the time required for the processing when the processing is performed with the required accuracy by the flying object (10).
PCT/JP2019/001696 2018-01-26 2019-01-21 Information processing device WO2019146551A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019567059A JP6957651B2 (ja) 2018-01-26 2019-01-21 Information processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018011500 2018-01-26
JP2018-011500 2018-01-26

Publications (1)

Publication Number Publication Date
WO2019146551A1 true WO2019146551A1 (fr) 2019-08-01

Family

ID=67394925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/001696 WO2019146551A1 (fr) Information processing device

Country Status (2)

Country Link
JP (1) JP6957651B2 (fr)
WO (1) WO2019146551A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160132057A1 (en) * 2013-07-09 2016-05-12 Duretek Inc. Method for constructing air-observed terrain data by using rotary wing structure
JP2016197980A (ja) * 2015-04-06 2016-11-24 株式会社Nttファシリティーズ Diagnostic system, diagnostic method, and program
JP2017077879A (ja) * 2015-07-17 2017-04-27 Panasonic Intellectual Property Corporation of America Unmanned flying object, flight control method, and flight control program
WO2017073310A1 (fr) * 2015-10-27 2017-05-04 三菱電機株式会社 Image capturing system for structure shape measurement, image capturing method of a structure used for structure shape measurement, onboard control device, remote control device, program, and recording medium
JP2017176115A (ja) * 2016-03-31 2017-10-05 本田技研工業株式会社 Control device for autonomously travelling work vehicle
JP2017216524A (ja) * 2016-05-30 2017-12-07 パナソニックIpマネジメント株式会社 Video imaging device
KR20180000767A (ko) * 2016-06-23 2018-01-04 서울대학교산학협력단 Method for preventing collisions between unmanned aerial vehicles by sharing flight routes and schedules through ground control station software


Also Published As

Publication number Publication date
JPWO2019146551A1 (ja) 2021-01-07
JP6957651B2 (ja) 2021-11-02

Similar Documents

Publication Publication Date Title
US20200184668A1 (en) Systems and methods for three-dimensional pose determination
US20130212094A1 (en) Visual signatures for indoor positioning
US20160214533A1 (en) Autonomous vehicle cameras used for near real-time imaging
CN111045024A Vehicle tracking method and system based on light detection and ranging
JP7299213B2 Information processing device
JP7341991B2 Monitoring device
US11810323B2 Position estimation system
CN111045023A Vehicle tracking method and system based on light detection and ranging
CN111045025A Vehicle tracking method and system based on light detection and ranging
CN112400346A Server device and method for acquiring position information of other devices
CN112215887B Pose determination method and device, storage medium, and mobile robot
JP6857250B2 Flight control device and flight control system
WO2019087891A1 (fr) Information processing device and flight control system
JP7246388B2 Flying object control device
WO2019146551A1 (fr) Information processing device
JP7050809B2 Information processing device
JP7060624B2 Information processing device
CN112212851B Pose determination method and device, storage medium, and mobile robot
CN109242782A Noise point processing method and device
JP2019101451A Information processing device
WO2019146577A1 (fr) Information processing device
JP7058290B2 Information processing device and information processing method
WO2019082924A1 (fr) Information processing device
WO2024057746A1 (fr) Correction device
JP6903535B2 Information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19743814

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019567059

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19743814

Country of ref document: EP

Kind code of ref document: A1