CN111442722A - Positioning method, positioning device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN111442722A
Authority
CN
China
Prior art keywords
point cloud
camera
map
coordinate system
information
Prior art date
Legal status
Granted
Application number
CN202010225750.4A
Other languages
Chinese (zh)
Other versions
CN111442722B (en)
Inventor
Inventor not disclosed
Current Assignee
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Chengdu Technologies Co ltd
Priority date
Filing date
Publication date
Application filed by Cloudminds Chengdu Technologies Co Ltd
Priority to CN202010225750.4A
Publication of CN111442722A
Application granted
Publication of CN111442722B
Current legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure relates to a positioning method, a positioning apparatus, a storage medium and an electronic device. The method comprises the following steps: acquiring first pose information of a camera in a map coordinate system at the current moment, the map coordinate system being the coordinate system of a pre-established point cloud map which comprises a map of the environment where the camera is currently located; acquiring first point cloud data, in the map coordinate system, of a current image acquired by the camera at the current moment; extracting a local point cloud map from the point cloud map according to the first pose information; determining pose change information of the camera according to the first point cloud data and the local point cloud map; and correcting the first pose information according to the pose change information to obtain target pose information of the camera at the current moment, and positioning the camera according to the target pose information. With this technical solution, the camera can be positioned more accurately, the positioning accuracy is effectively improved, and the safe and accurate operation of robots and other devices integrating the camera is guaranteed.

Description

Positioning method, positioning device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of positioning technologies, and in particular, to a positioning method, an apparatus, a storage medium, and an electronic device.
Background
Devices such as robots, unmanned aerial vehicles and unmanned vehicles are used ever more widely in fields such as distribution and logistics. In the related art, such devices are generally positioned by a visual positioning technique, in which a camera integrated in the device captures images of the environment where the device is located and position information is derived from those images. If the positioning accuracy is low, the device cannot accurately determine its current position, and its safe and accurate operation cannot be guaranteed.
Disclosure of Invention
The disclosure aims to provide a positioning method, a positioning device, a storage medium and an electronic device, which can improve the positioning accuracy.
To achieve the above object, in a first aspect, the present disclosure provides a positioning method, including: acquiring first pose information of a camera in a map coordinate system at the current moment; the map coordinate system is a coordinate system of a pre-established point cloud map, and the point cloud map comprises a map of the current environment where the camera is located; acquiring first point cloud data of a current image acquired by the camera at the current moment in the map coordinate system; extracting a local point cloud map from the point cloud map according to the first pose information; determining pose change information of the camera according to the first point cloud data and the local point cloud map; and correcting the first pose information according to the pose change information to obtain target pose information of the camera at the current moment, and positioning the camera according to the target pose information.
In a second aspect, the present disclosure provides a positioning device, the device comprising: the pose information acquisition module is configured to acquire first pose information of the camera under a map coordinate system at the current moment; the map coordinate system is a coordinate system of a pre-established point cloud map, and the point cloud map comprises a map of the current environment where the camera is located; a point cloud data acquisition module configured to acquire first point cloud data of a current image acquired by the camera at the current moment in the map coordinate system; an extraction module configured to extract a local point cloud map from the point cloud map according to the first pose information; a determination module configured to determine pose change information of the camera according to the first point cloud data and the local point cloud map; a positioning module configured to modify the first pose information according to the pose change information to obtain target pose information of the camera at the current time, and position the camera according to the target pose information.
In a third aspect, the present disclosure provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method provided by the first aspect of the present disclosure.
In a fourth aspect, the present disclosure provides an electronic device comprising: a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the method provided by the first aspect of the present disclosure.
According to the technical scheme, the local point cloud map is extracted from the pre-established point cloud map according to the first pose information of the camera in the map coordinate system at the current moment. Since the local point cloud map may cover only a certain area around the current position of the camera, interference from other areas of the point cloud map with positioning is avoided. Pose change information of the camera is then determined according to the local point cloud map and the first point cloud data, in the map coordinate system, of the current image acquired by the camera. The target pose information of the camera at the current moment is obtained by correcting the first pose information with the pose change information, so the target pose information is more accurate; the camera can therefore be positioned more accurately according to the target pose information, the positioning accuracy is effectively improved, the current pose and position of a robot or other device integrating the camera can be accurately determined, and the safe and accurate operation of the device is ensured.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
Fig. 1 is a flow chart illustrating a positioning method according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating a method of acquiring first pose information of a camera in a map coordinate system at a current time according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating a method of acquiring first point cloud data of a current image acquired by a camera at a current time in a map coordinate system according to an exemplary embodiment.
FIG. 4 is a flow diagram illustrating a method of pre-establishing a point cloud map, according to an example embodiment.
FIG. 5 is a block diagram illustrating a positioning device according to an exemplary embodiment.
FIG. 6 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 7 is a block diagram illustrating an electronic device in accordance with another example embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
The positioning method provided by the present disclosure can be applied to terminal devices such as robots, unmanned aerial vehicles and unmanned vehicles, and can also be applied to servers such as a positioning management server or a cloud server. When the positioning method is applied to a server, the terminal device can send the acquired data such as images to the server, the server performs the subsequent processing to position the terminal device, and the positioning information is returned to the terminal device.
It should be noted that, since the camera is integrated in the terminal device such as the robot, in the following description of the present disclosure, when referring to the pose information of the camera, it is also understood as the pose information of the terminal device into which the camera is integrated, and when referring to the positioning of the camera, it is also understood as the positioning of the terminal device into which the camera is integrated.
Fig. 1 is a flow chart illustrating a positioning method according to an exemplary embodiment. As shown in fig. 1, the positioning method may include S101 to S105.
In S101, first pose information of the camera in the map coordinate system at the current moment is acquired.
The map coordinate system is the coordinate system of a pre-established point cloud map, and the point cloud map may comprise a map of the environment where the camera is currently located. In general, a device such as a robot needs a pre-established point cloud map in order to recognize its surroundings and plan a path, and therefore needs to obtain its current position in the point cloud map so that it can continue moving along the planned path. In one embodiment, the point cloud map may be pre-established from images acquired by the camera at historical moments.
The first pose information of the camera in the map coordinate system at the current moment may include the current position information and attitude information of the camera in the pre-established point cloud map. The position information may include the three-dimensional coordinates of the camera in the map coordinate system, and the attitude information may include the deflection angles of the camera about the three coordinate axes of the map coordinate system.
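As an illustration only (not taken from the patent), the position and the three deflection angles can be packed into a single 4x4 homogeneous transform; this is also the representation assumed by the formula sketches later in this description. The helper name and the angle convention are assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_to_matrix(position_xyz, angles_xyz_rad):
    # Hypothetical helper: combine the 3D position and the per-axis deflection
    # angles (assumed here to be Euler angles in radians) into a 4x4 transform.
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("xyz", angles_xyz_rad).as_matrix()
    T[:3, 3] = position_xyz
    return T

# Example: camera 2 m along the x axis of the map frame, yawed 30 degrees about z.
T_G_I = pose_to_matrix([2.0, 0.0, 0.0], [0.0, 0.0, np.deg2rad(30.0)])
```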
In S102, first point cloud data of a current image acquired by the camera at the current time in a map coordinate system is acquired.
The camera can acquire images of its field of view in real time, the field of view being the range of the environment that the camera can capture. The current image acquired by the camera at the current moment is therefore an image of the environment where the camera is currently located.
A collection of feature points in an image may be called a point cloud. The point cloud data in the present disclosure may be three-dimensional point cloud data, and the first point cloud data of the current image in the map coordinate system may include the three-dimensional coordinate information, in the map coordinate system, of all feature points in the current image.
It should be noted that the execution order of S101 and S102 is not specifically limited in the present disclosure; for example, S102 may be executed first and then S101, or both may be executed simultaneously. Fig. 1 merely shows an example in which S101 is performed before S102 and does not limit the embodiments of the present disclosure.
In S103, a local point cloud map is extracted from the point cloud map according to the first pose information.
The first pose information acquired in S101 may include the current position information of the camera in the point cloud map. Extracting a local point cloud map from the point cloud map according to the first pose information means extracting a certain area around the current position of the camera. Because the full point cloud map covers a large range, extracting the local point cloud map avoids interference from other areas with positioning.
The range of the extracted local point cloud map may be preset and is not particularly limited by the present disclosure, as long as the local point cloud map includes the current position of the camera. For example, the area centered on the current position of the camera in the point cloud map and bounded by a preset radius may be used as the local point cloud map.
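A minimal sketch of such a radius-based extraction, assuming the point cloud map is held as an (N, 3) NumPy array; the function name and the radius value are illustrative, not taken from the patent.

```python
import numpy as np

def extract_local_map(map_points, camera_position, radius=10.0):
    # Keep only the map points within `radius` of the camera's current position.
    distances = np.linalg.norm(map_points - np.asarray(camera_position), axis=1)
    return map_points[distances <= radius]
```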
In S104, the pose change information of the camera is determined according to the first point cloud data and the local point cloud map.
In this step, the first point cloud data and the local point cloud map can be geometrically registered. Because the local point cloud map has a small range and only covers a certain area around the current position of the camera, more accurate feature point registration can be achieved. For example, the pose change information of the camera may be determined by ICP (Iterative Closest Point) geometric registration. The pose change information may be an incremental change in pose relative to the first pose information, used to correct the first pose information.
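The patent names ICP only as one possible way of performing the registration; the following is a minimal point-to-point ICP sketch in NumPy/SciPy, not the claimed implementation. It assumes both point sets are (N, 3) arrays already expressed in the map coordinate system and returns the 4x4 pose-change transform.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=20):
    # Align `source` (first point cloud data) to `target` (local point cloud map).
    src = source.copy()
    tree = cKDTree(target)
    T_total = np.eye(4)
    for _ in range(iterations):
        # 1. Associate each source point with its nearest map point.
        _, idx = tree.query(src)
        matched = target[idx]
        # 2. Best rigid transform for these correspondences (Kabsch / SVD).
        src_mean, tgt_mean = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_mean).T @ (matched - tgt_mean)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = tgt_mean - R @ src_mean
        # 3. Accumulate the increment and update the source cloud.
        T_step = np.eye(4)
        T_step[:3, :3], T_step[:3, 3] = R, t
        T_total = T_step @ T_total
        src = src @ R.T + t
    return T_total  # pose change information as a 4x4 transform
```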
In S105, the first pose information is corrected according to the pose change information to obtain target pose information of the camera at the current time, and the camera is positioned according to the target pose information.
In this step, the target pose information of the camera at the current moment is obtained by correcting the first pose information, so the target pose information is more accurate; that is, more accurate position information and attitude information of the camera in the map coordinate system at the current moment are obtained. Positioning the camera according to the target pose information therefore yields higher positioning accuracy and helps ensure the safe and accurate operation of the robot or other device integrating the camera.
According to the technical scheme, the local point cloud map is extracted from the pre-established point cloud map according to the first pose information of the camera in the map coordinate system at the current moment. Since the local point cloud map may cover only a certain area around the current position of the camera, interference from other areas of the point cloud map with positioning is avoided. Pose change information of the camera is then determined according to the local point cloud map and the first point cloud data, in the map coordinate system, of the current image acquired by the camera. The target pose information of the camera at the current moment is obtained by correcting the first pose information with the pose change information, so the target pose information is more accurate; the camera can therefore be positioned more accurately according to the target pose information, the positioning accuracy is effectively improved, the current pose and position of a robot or other device integrating the camera can be accurately determined, and the safe and accurate operation of the device is ensured.
Fig. 2 is a flowchart illustrating a method of acquiring first pose information of a camera in a map coordinate system at a current time according to an exemplary embodiment. As shown in fig. 2, S101 may include S201 to S206.
In S201, second pose information of the camera in the world coordinate system at the current time is acquired.
A visual-inertial odometry (VIO) module is usually integrated in a terminal device such as a robot, and an inertial measurement unit (IMU) is integrated in the VIO module. The VIO module may calculate second pose information of the camera in the world coordinate system at the current moment according to the current image acquired by the camera, where the second pose information may include the position information and attitude information of the camera in the world coordinate system.
In S202, visual repositioning is performed according to the current image and the point cloud map.
Because the VIO module calculates the pose information of the camera only from the current image, the errors produced by positioning at each moment accumulate continuously. To keep the pose information of the camera accurate, a visual repositioning module is usually also integrated in the robot or other device. The visual repositioning module can perform visual repositioning according to the current image acquired by the camera and the pre-established point cloud map, for example by detecting in the point cloud map a loop candidate frame consistent with the current image and then matching the feature points of the loop candidate frame with the feature points of the current image to obtain the current pose information of the camera, which avoids error accumulation to a certain extent.
However, visual repositioning has certain limitations. For example, in practical application scenarios the image acquired by the camera is strongly affected by environmental factors such as lighting; if the acquired image is noisy or contains few feature points, it is difficult to detect a loop candidate frame, and even if a loop candidate frame is detected, few feature points can be matched. It may therefore happen that no visual repositioning result is obtained.
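For illustration only, the matching and pose-recovery part of such a visual repositioning could look like the sketch below, using ORB features and PnP from OpenCV. The keyframe attributes `descriptors` and `points_3d`, the match threshold and the overall structure are assumptions for this sketch, not the patent's method.

```python
import cv2
import numpy as np

def relocalize(current_image, candidate_keyframe, camera_matrix):
    # Match ORB features of the current image against a loop candidate frame whose
    # features already carry 3D map coordinates, then recover the camera pose by PnP.
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.detectAndCompute(current_image, None)
    if descriptors is None:
        return None                      # image too poor in features (see limitations above)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(candidate_keyframe.descriptors, descriptors)
    if len(matches) < 10:
        return None                      # too few matches: no repositioning result
    object_points = np.float32([candidate_keyframe.points_3d[m.queryIdx] for m in matches])
    image_points = np.float32([keypoints[m.trainIdx].pt for m in matches])
    ok, rvec, tvec, _ = cv2.solvePnPRansac(object_points, image_points, camera_matrix, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return np.linalg.inv(T)              # camera pose expressed in the map frame
```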
It should be noted that, the execution sequence of S201 and S202 is not specifically limited in the present disclosure, and fig. 2 is only an exemplary illustration, for example, S202 may also be executed before S201, or both may be executed at the same time.
In S203, it is determined whether a visual repositioning result has been obtained. If not, S204 is executed; if so, S205 and S206 are executed.
In S204, the second pose information is subjected to coordinate conversion according to the currently stored coordinate system conversion information of the world coordinate system and the map coordinate system, so as to obtain the first pose information.
In the case that the visual repositioning result is not obtained, since the second pose information calculated by the VIO module is the pose information of the camera in the world coordinate system, coordinate conversion needs to be performed on the second pose information in order to obtain the first pose information of the camera in the map coordinate system. In this step, the second pose information may be converted according to the currently stored coordinate system conversion information of the world coordinate system and the map coordinate system. The coordinate system conversion information can represent the pose relationship between the world coordinate system and the map coordinate system. For example, the first pose information may be obtained by the following formula (1):
T_G_I = T_G_M * T_M_I (1)
where T_G_I represents the first pose information of the camera in the map coordinate system at the current moment, T_M_I represents the second pose information of the camera in the world coordinate system at the current moment, and T_G_M represents the currently stored coordinate system conversion information of the world coordinate system and the map coordinate system. T_G_I, T_M_I and T_G_M may all be represented in the form of matrices.
In S205, first pose information is determined according to the visual repositioning result.
When the visual repositioning result is obtained, the visual repositioning module has performed visual repositioning according to the current image and the point cloud map and can directly obtain the position information and attitude information of the camera in the point cloud map, so the visual repositioning result may include the first pose information of the camera in the map coordinate system at the current moment.
In S206, the coordinate system conversion information is updated based on the first pose information and the second pose information.
To ensure the accuracy of the coordinate system conversion information, the conversion information is not fixed: whenever a visual repositioning result can be obtained, the coordinate system conversion information may be updated after the first pose information is determined from that result. The VIO module outputs the second pose information of the camera in the world coordinate system. For example, according to the first pose information and the second pose information, the updated coordinate system conversion information may be obtained by the following formula (2):
T_G_M' = T_G_I * T_M_I^(-1) (2)
where T_G_M' represents the updated coordinate system conversion information.
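A small sketch of formulas (1) and (2) with 4x4 homogeneous transforms; the matrix names follow the formulas above and the helpers are illustrative, not the patented implementation.

```python
import numpy as np

def pose_in_map(T_G_M, T_M_I):
    # Formula (1): first pose information in the map frame from the stored
    # world-to-map conversion T_G_M and the VIO pose in the world frame T_M_I.
    return T_G_M @ T_M_I

def update_conversion(T_G_I, T_M_I):
    # Formula (2): refresh the world-to-map conversion after a successful
    # visual repositioning produced the pose T_G_I in the map frame.
    return T_G_I @ np.linalg.inv(T_M_I)
```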
In the above technical solution, if no visual repositioning result is obtained, the first pose information may be obtained by performing coordinate conversion on the second pose information of the camera in the world coordinate system. If the visual repositioning result is obtained, the first pose information can be determined directly from the visual repositioning result, and the coordinate system conversion information of the world coordinate system and the map coordinate system can be updated, which ensures the accuracy of the coordinate system conversion information.
Fig. 3 is a flowchart illustrating a method of acquiring first point cloud data of a current image acquired by a camera at a current time in a map coordinate system according to an exemplary embodiment. As shown in fig. 3, S102 may include S301 and S302.
In S301, second point cloud data of the current image in the world coordinate system is acquired.
The VIO module can calculate the second pose information of the camera in the world coordinate system according to the current image acquired by the camera, and can also output the second point cloud data of the current image in the world coordinate system. The second point cloud data may include the three-dimensional coordinate information of all feature points in the current image in the world coordinate system.
In S302, the second point cloud data is projected to a map coordinate system to obtain first point cloud data.
In this step, the second point cloud data may, for example, be subjected to coordinate conversion so as to project it into the map coordinate system. In this way, the first point cloud data of the current image acquired by the camera in the map coordinate system is obtained.
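Assuming the same world-to-map conversion T_G_M as above is available as a 4x4 homogeneous matrix, the projection of S302 amounts to applying that transform to every point of the second point cloud data; a minimal sketch:

```python
import numpy as np

def project_to_map(points_world, T_G_M):
    # Transform an (N, 3) point cloud from the world frame into the map frame.
    homogeneous = np.hstack([points_world, np.ones((points_world.shape[0], 1))])
    return (homogeneous @ T_G_M.T)[:, :3]
```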
In this disclosure, in the above S105, an exemplary embodiment of correcting the first pose information according to the pose change information to obtain the target pose information of the camera at the current time may be:
determining the target pose information according to the product of the pose change information and the first pose information.
For example, the product of the pose change information and the first pose information may be determined as the target pose information. The target pose information may be determined by the following formula (3):
T_Gi_I = T_Gi_G * T_G_I (3)
where T_Gi_I represents the target pose information and T_Gi_G represents the pose change information.
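Under the same 4x4-transform assumption as in the earlier sketches, the correction of S105 is a single matrix multiplication:

```python
# Formula (3): apply the registration increment from S104 to the first pose.
T_Gi_I = T_Gi_G @ T_G_I   # target pose = pose change information x first pose information
```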
In the present disclosure, the pre-established point cloud map may be established in any manner of the related art, or may be established in the manner of establishing the point cloud map described below in the present disclosure. FIG. 4 is a flow diagram illustrating a method of pre-establishing a point cloud map, according to an example embodiment. As shown in FIG. 4, the method may include S401 to S404.
In S401, third pose information of the camera in the world coordinate system at a historical moment is acquired.
The VIO module can calculate the third pose information of the camera in the world coordinate system according to the images collected by the camera at historical moments. The third pose information may include the position information and attitude information of the camera in the world coordinate system at those historical moments.
In S402, third point cloud data of the image acquired by the camera at the historical time in the world coordinate system is obtained according to the third pose information.
During the establishment of the point cloud map, the point cloud data of the images acquired by the camera are point cloud data in the camera coordinate system. In this step, coordinate conversion can be performed on the point cloud data in the camera coordinate system according to the third pose information of the camera in the world coordinate system, so as to obtain the third point cloud data, in the world coordinate system, of the images acquired by the camera at historical moments. The third point cloud data may include the three-dimensional coordinate information, in the world coordinate system, of the feature points in those images.
In S403, the third point cloud data is filtered to obtain filtered fourth point cloud data.
Filtering the third point cloud data yields more accurate fourth point cloud data. The present disclosure does not specifically limit the filtering method. For example, voxel filtering may be used: a three-dimensional voxel grid is created and the third point cloud data is down-sampled within that grid to obtain the voxel-filtered fourth point cloud data.
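A minimal sketch of one possible voxel filter, replacing all points that fall into the same voxel by their centroid; the voxel size is an illustrative value, not taken from the patent.

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.05):
    # Assign each point to a voxel, then average the points within each voxel.
    voxel_idx = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(voxel_idx, axis=0, return_inverse=True)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]
```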
In S404, a point cloud map is generated according to the filtered fourth point cloud data.
In this step, the pre-established point cloud map may be generated from the filtered fourth point cloud data using a Simultaneous Localization and Mapping (SLAM) technique of the related art.
Based on the same inventive concept, the disclosure also provides a positioning device. Fig. 5 is a block diagram illustrating a positioning apparatus according to an exemplary embodiment, and as shown in fig. 5, the positioning apparatus 500 may include:
A pose information acquisition module 501 configured to acquire first pose information of the camera in a map coordinate system at a current moment; the map coordinate system is a coordinate system of a pre-established point cloud map, and the point cloud map comprises a map of the current environment where the camera is located;
A point cloud data acquiring module 502 configured to acquire first point cloud data of a current image acquired by the camera at the current time in the map coordinate system;
An extracting module 503 configured to extract a local point cloud map from the point cloud map according to the first pose information;
A determining module 504 configured to determine pose change information of the camera according to the first point cloud data and the local point cloud map;
A positioning module 505 configured to modify the first pose information according to the pose change information to obtain target pose information of the camera at the current time, and position the camera according to the target pose information.
According to the technical scheme, the local point cloud map is extracted from the pre-established point cloud map according to the first pose information of the camera in the map coordinate system at the current moment. Since the local point cloud map may cover only a certain area around the current position of the camera, interference from other areas of the point cloud map with positioning is avoided. Pose change information of the camera is then determined according to the local point cloud map and the first point cloud data, in the map coordinate system, of the current image acquired by the camera. The target pose information of the camera at the current moment is obtained by correcting the first pose information with the pose change information, so the target pose information is more accurate; the camera can therefore be positioned more accurately according to the target pose information, the positioning accuracy is effectively improved, the current pose and position of a robot or other device integrating the camera can be accurately determined, and the safe and accurate operation of the device is ensured.
Optionally, the pose information acquiring module 501 may include: a first obtaining sub-module configured to obtain second pose information of the camera in a world coordinate system at the current time; a visual repositioning sub-module configured for visual repositioning according to the current image and the point cloud map; and the coordinate conversion sub-module is configured to, in a case that the vision repositioning sub-module does not obtain a vision repositioning result, perform coordinate conversion on the second pose information according to currently stored coordinate system conversion information of the world coordinate system and the map coordinate system to obtain the first pose information.
Optionally, the pose information acquiring module 501 may further include: a second obtaining sub-module configured to, in a case where the visual repositioning sub-module obtains the visual repositioning result, determine the first pose information according to the visual repositioning result.
Optionally, the apparatus 500 may further include: an update module configured to update the coordinate system conversion information according to the first pose information and the second pose information.
Optionally, the point cloud data obtaining module 502 is configured to obtain second point cloud data of the current image in a world coordinate system; and projecting the second point cloud data to the map coordinate system to obtain the first point cloud data.
Optionally, the positioning module 505 is configured to determine the target pose information according to a product of the pose change information and the first pose information.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 6 is a block diagram illustrating an electronic device 600 according to an example embodiment. As shown in fig. 6, the electronic device 600 may include: a processor 601 and a memory 602. The electronic device 600 may also include one or more of a multimedia component 603, an input/output (I/O) interface 604, and a communications component 605.
The processor 601 is configured to control the overall operation of the electronic device 600 so as to complete all or part of the steps of the positioning method. The memory 602 is used to store various types of data to support operation at the electronic device 600, such as instructions for any application or method operating on the electronic device 600 and application-related data such as contact data, transmitted and received messages, pictures, audio, video, and so forth. The memory 602 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk or an optical disk. The multimedia components 603 may include a screen and audio components. The screen may be, for example, a touch screen, and the audio components are used for outputting and/or inputting audio signals. For example, the audio components may include a microphone for receiving external audio signals; the received audio signal may further be stored in the memory 602 or transmitted through the communication component 605. The audio components also include at least one speaker for outputting audio signals. The I/O interface 604 provides an interface between the processor 601 and other interface modules, such as a keyboard, a mouse or buttons, where the buttons may be virtual buttons or physical buttons. The communication component 605 is used for wired or wireless communication between the electronic device 600 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein. Accordingly, the communication component 605 may include a Wi-Fi module, a Bluetooth module, an NFC module, and so on.
In an exemplary embodiment, the electronic device 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for performing the positioning method described above.
In another exemplary embodiment, a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the positioning method described above is also provided. For example, the computer readable storage medium may be the memory 602 described above comprising program instructions that are executable by the processor 601 of the electronic device 600 to perform the positioning method described above.
Fig. 7 is a block diagram illustrating an electronic device 700 in accordance with another example embodiment. For example, the electronic device 700 may be provided as a server, such as a location management server, a cloud server, and the like, as described above. Referring to fig. 7, an electronic device 700 includes a processor 722, which may be one or more in number, and a memory 732 for storing computer programs that are executable by the processor 722. The computer programs stored in memory 732 may include one or more modules that each correspond to a set of instructions. Further, the processor 722 may be configured to execute the computer program to perform the above-described positioning method.
In addition, the electronic device 700 may also include a power component 726 that may be configured to perform power management of the electronic device 700, and a communication component 750 that may be configured to enable communication, e.g., wired or wireless communication, of the electronic device 700. The electronic device 700 may also include an input/output (I/O) interface 758. The electronic device 700 may operate based on an operating system stored in the memory 732, e.g., Windows Server™, Mac OS X™, Unix™, Linux™, and so on.
In another exemplary embodiment, a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the positioning method described above is also provided. For example, the computer readable storage medium may be the memory 732 described above including program instructions that are executable by the processor 722 of the electronic device 700 to perform the positioning method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned positioning method when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner; to avoid unnecessary repetition, the possible combinations are not described again in the present disclosure.
In addition, the various embodiments of the present disclosure may be combined in any manner, and such combinations should likewise be regarded as part of the present disclosure as long as they do not depart from the spirit of the present disclosure.

Claims (10)

1. A method of positioning, the method comprising:
Acquiring first pose information of a camera under a map coordinate system at the current moment; the map coordinate system is a coordinate system of a pre-established point cloud map, and the point cloud map comprises a map of the current environment where the camera is located;
Acquiring first point cloud data of a current image acquired by the camera at the current moment in the map coordinate system;
Extracting a local point cloud map from the point cloud map according to the first pose information;
Determining pose change information of the camera according to the first point cloud data and the local point cloud map;
And correcting the first pose information according to the pose change information to obtain target pose information of the camera at the current moment, and positioning the camera according to the target pose information.
2. The method of claim 1, wherein the obtaining first pose information of the camera in the map coordinate system at the current time comprises:
Acquiring second pose information of the camera under a world coordinate system at the current moment;
Performing visual repositioning according to the current image and the point cloud map;
And under the condition that a visual repositioning result is not obtained, performing coordinate conversion on the second pose information according to currently stored coordinate system conversion information of the world coordinate system and the map coordinate system to obtain the first pose information.
3. The method of claim 2, wherein the obtaining first pose information of the camera in a map coordinate system at a current time further comprises:
And under the condition of obtaining the visual repositioning result, determining the first pose information according to the visual repositioning result.
4. The method of claim 3, further comprising:
And updating the coordinate system conversion information according to the first pose information and the second pose information.
5. The method of claim 1, wherein the obtaining first point cloud data of a current image acquired by the camera at the current time in the map coordinate system comprises:
Acquiring second point cloud data of the current image in a world coordinate system;
And projecting the second point cloud data to the map coordinate system to obtain the first point cloud data.
6. The method of claim 1, wherein the correcting the first pose information according to the pose change information to obtain target pose information of the camera at the current time comprises:
And determining the target pose information according to the product of the pose change information and the first pose information.
7. The method of any one of claims 1-6, wherein the point cloud map is pre-established by:
Acquiring third pose information of the camera under a world coordinate system at a historical moment;
Obtaining third point cloud data of the image acquired by the camera at the historical moment in the world coordinate system according to the third pose information;
Filtering the third point cloud data to obtain filtered fourth point cloud data;
And generating the point cloud map according to the fourth point cloud data after filtering processing.
8. A positioning device, the device comprising:
The pose information acquisition module is configured to acquire first pose information of the camera under a map coordinate system at the current moment; the map coordinate system is a coordinate system of a pre-established point cloud map, and the point cloud map comprises a map of the current environment where the camera is located;
A point cloud data acquisition module configured to acquire first point cloud data of a current image acquired by the camera at the current moment in the map coordinate system;
An extraction module configured to extract a local point cloud map from the point cloud map according to the first pose information;
A determination module configured to determine pose change information of the camera according to the first point cloud data and the local point cloud map;
A positioning module configured to modify the first pose information according to the pose change information to obtain target pose information of the camera at the current time, and position the camera according to the target pose information.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
10. An electronic device, comprising:
A memory having a computer program stored thereon;
A processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 7.
CN202010225750.4A 2020-03-26 2020-03-26 Positioning method, positioning device, storage medium and electronic equipment Active CN111442722B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010225750.4A CN111442722B (en) 2020-03-26 2020-03-26 Positioning method, positioning device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010225750.4A CN111442722B (en) 2020-03-26 2020-03-26 Positioning method, positioning device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111442722A true CN111442722A (en) 2020-07-24
CN111442722B CN111442722B (en) 2022-05-17

Family

ID=71648157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010225750.4A Active CN111442722B (en) 2020-03-26 2020-03-26 Positioning method, positioning device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111442722B (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111951262A (en) * 2020-08-25 2020-11-17 杭州易现先进科技有限公司 Method, device and system for correcting VIO error and electronic device
CN111983635A (en) * 2020-08-17 2020-11-24 浙江商汤科技开发有限公司 Pose determination method and device, electronic equipment and storage medium
CN112116638A (en) * 2020-09-04 2020-12-22 季华实验室 Three-dimensional point cloud matching method and device, electronic equipment and storage medium
CN112269386A (en) * 2020-10-28 2021-01-26 深圳拓邦股份有限公司 Method and device for repositioning symmetric environment and robot
CN112270709A (en) * 2020-11-12 2021-01-26 Oppo广东移动通信有限公司 Map construction method and device, computer readable storage medium and electronic device
CN112307363A (en) * 2020-11-05 2021-02-02 深圳市慧鲤科技有限公司 Virtual-real fusion display method and device, electronic equipment and storage medium
CN112432636A (en) * 2020-11-30 2021-03-02 浙江商汤科技开发有限公司 Positioning method and device, electronic equipment and storage medium
CN112446827A (en) * 2020-11-23 2021-03-05 北京百度网讯科技有限公司 Point cloud information processing method and device
CN112462784A (en) * 2020-12-03 2021-03-09 上海擎朗智能科技有限公司 Robot pose determination method, device, equipment and medium
CN112948411A (en) * 2021-04-15 2021-06-11 深圳市慧鲤科技有限公司 Pose data processing method, interface, device, system, equipment and medium
CN113124902A (en) * 2021-04-19 2021-07-16 追创科技(苏州)有限公司 Positioning correction method and device for mobile robot, storage medium, and electronic device
CN113298879A (en) * 2021-05-26 2021-08-24 北京京东乾石科技有限公司 Visual positioning method and device, storage medium and electronic equipment
CN113313765A (en) * 2021-05-28 2021-08-27 上海高仙自动化科技发展有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113503883A (en) * 2021-06-22 2021-10-15 北京三快在线科技有限公司 Method for collecting data for constructing map, storage medium and electronic equipment
CN113776517A (en) * 2021-09-03 2021-12-10 Oppo广东移动通信有限公司 Map generation method, device, system, storage medium and electronic equipment
CN113884006A (en) * 2021-09-27 2022-01-04 视辰信息科技(上海)有限公司 Space positioning method, system, equipment and computer readable storage medium
CN113989636A (en) * 2021-09-23 2022-01-28 深圳市联洲国际技术有限公司 Household appliance positioning and identifying method and device, storage medium and terminal device
CN114063091A (en) * 2020-07-30 2022-02-18 北京四维图新科技股份有限公司 High-precision positioning method and product
CN114310951A (en) * 2021-12-31 2022-04-12 北京航空航天大学杭州创新研究院 Pose optimization method and device, grabbing equipment and computer readable storage medium
CN114332228A (en) * 2021-12-30 2022-04-12 高德软件有限公司 Data processing method, electronic device and computer storage medium
WO2022078512A1 (en) * 2020-10-16 2022-04-21 北京猎户星空科技有限公司 Map establishment method and apparatus, and self-moving device and storage medium
CN114442605A (en) * 2021-12-16 2022-05-06 中国科学院深圳先进技术研究院 Positioning detection method, positioning detection device, autonomous mobile equipment and storage medium
CN114485607A (en) * 2021-12-02 2022-05-13 陕西欧卡电子智能科技有限公司 Method for determining motion track, operation equipment, device and storage medium
WO2022110777A1 (en) * 2020-11-30 2022-06-02 浙江商汤科技开发有限公司 Positioning method and apparatus, electronic device, storage medium, computer program product, and computer program
CN114812381A (en) * 2021-01-28 2022-07-29 华为技术有限公司 Electronic equipment positioning method and electronic equipment
CN115222808A (en) * 2021-06-30 2022-10-21 达闼机器人股份有限公司 Positioning method and device based on unmanned aerial vehicle, storage medium and electronic equipment
WO2024001339A1 (en) * 2022-07-01 2024-01-04 华为云计算技术有限公司 Pose determination method and apparatus, and computing device
CN113776517B (en) * 2021-09-03 2024-05-31 Oppo广东移动通信有限公司 Map generation method, device, system, storage medium and electronic equipment

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1262915A2 (en) * 2001-05-29 2002-12-04 Topcon Corporation Construction management system
CN107209853A (en) * 2015-01-27 2017-09-26 诺基亚技术有限公司 Positioning and map constructing method
US20190178655A1 (en) * 2016-08-23 2019-06-13 Denso Corporation Vehicle control system, own vehicle position calculation apparatus, vehicle control apparatus, own vehicle position calculation program, and non-transitory computer readable storage medium
CN108732584A (en) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 Method and apparatus for updating map
CN107796397A (en) * 2017-09-14 2018-03-13 杭州迦智科技有限公司 A kind of Robot Binocular Vision localization method, device and storage medium
CN107990899A (en) * 2017-11-22 2018-05-04 驭势科技(北京)有限公司 A kind of localization method and system based on SLAM
CN110197615A (en) * 2018-02-26 2019-09-03 北京京东尚科信息技术有限公司 For generating the method and device of map
DE102018113344A1 (en) * 2018-06-05 2019-12-05 Valeo Schalter Und Sensoren Gmbh A method for locating a motor vehicle in an environment after a learning trip; Control device and driver assistance system
CN109297510A (en) * 2018-09-27 2019-02-01 百度在线网络技术(北京)有限公司 Relative pose scaling method, device, equipment and medium
CN109540148A (en) * 2018-12-04 2019-03-29 广州小鹏汽车科技有限公司 Localization method and system based on SLAM map
CN109613543A (en) * 2018-12-06 2019-04-12 深圳前海达闼云端智能科技有限公司 Method and device for correcting laser point cloud data, storage medium and electronic equipment
CN109887032A (en) * 2019-02-22 2019-06-14 广州小鹏汽车科技有限公司 A kind of vehicle positioning method and system based on monocular vision SLAM
CN110032965A (en) * 2019-04-10 2019-07-19 南京理工大学 Vision positioning method based on remote sensing images
CN110675457A (en) * 2019-09-27 2020-01-10 Oppo广东移动通信有限公司 Positioning method and device, equipment and storage medium
CN110728717A (en) * 2019-09-27 2020-01-24 Oppo广东移动通信有限公司 Positioning method and device, equipment and storage medium
CN110879400A (en) * 2019-11-27 2020-03-13 炬星科技(深圳)有限公司 Method, equipment and storage medium for fusion positioning of laser radar and IMU
CN110849374A (en) * 2019-12-03 2020-02-28 中南大学 Underground environment positioning method, device, equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUA LIU et al.: "Pose alignment of aircraft structures with distance sensors and CCD cameras", Robotics and Computer-Integrated Manufacturing *
YU YUFENG et al.: "Monocular visual localization of intelligent vehicles based on road structural features" (in Chinese), Acta Automatica Sinica *
LIU HAOMIN et al.: "A survey of simultaneous localization and mapping methods based on monocular vision" (in Chinese), Journal of Computer-Aided Design & Computer Graphics *

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114063091A (en) * 2020-07-30 2022-02-18 北京四维图新科技股份有限公司 High-precision positioning method and product
CN111983635B (en) * 2020-08-17 2022-03-29 浙江商汤科技开发有限公司 Pose determination method and device, electronic equipment and storage medium
CN114814872A (en) * 2020-08-17 2022-07-29 浙江商汤科技开发有限公司 Pose determination method and device, electronic equipment and storage medium
JP2022548441A (en) * 2020-08-17 2022-11-21 チョーチアン センスタイム テクノロジー デベロップメント カンパニー,リミテッド POSITION AND ATTITUDE DETERMINATION METHOD, APPARATUS, ELECTRONIC DEVICE, STORAGE MEDIUM AND COMPUTER PROGRAM
JP7236565B2 (en) 2020-08-17 2023-03-09 チョーチアン センスタイム テクノロジー デベロップメント カンパニー,リミテッド POSITION AND ATTITUDE DETERMINATION METHOD, APPARATUS, ELECTRONIC DEVICE, STORAGE MEDIUM AND COMPUTER PROGRAM
CN111983635A (en) * 2020-08-17 2020-11-24 浙江商汤科技开发有限公司 Pose determination method and device, electronic equipment and storage medium
CN111951262A (en) * 2020-08-25 2020-11-17 杭州易现先进科技有限公司 Method, device and system for correcting VIO error and electronic device
CN111951262B (en) * 2020-08-25 2024-03-12 杭州易现先进科技有限公司 VIO error correction method, device, system and electronic device
CN112116638A (en) * 2020-09-04 2020-12-22 季华实验室 Three-dimensional point cloud matching method and device, electronic equipment and storage medium
WO2022078512A1 (en) * 2020-10-16 2022-04-21 北京猎户星空科技有限公司 Map establishment method and apparatus, and self-moving device and storage medium
CN112269386A (en) * 2020-10-28 2021-01-26 深圳拓邦股份有限公司 Method and device for repositioning symmetric environment and robot
CN112269386B (en) * 2020-10-28 2024-04-02 深圳拓邦股份有限公司 Symmetrical environment repositioning method, symmetrical environment repositioning device and robot
CN112307363A (en) * 2020-11-05 2021-02-02 深圳市慧鲤科技有限公司 Virtual-real fusion display method and device, electronic equipment and storage medium
CN112270709B (en) * 2020-11-12 2024-05-14 Oppo广东移动通信有限公司 Map construction method and device, computer readable storage medium and electronic equipment
CN112270709A (en) * 2020-11-12 2021-01-26 Oppo广东移动通信有限公司 Map construction method and device, computer readable storage medium and electronic device
CN112446827B (en) * 2020-11-23 2023-06-23 北京百度网讯科技有限公司 Point cloud information processing method and device
CN112446827A (en) * 2020-11-23 2021-03-05 北京百度网讯科技有限公司 Point cloud information processing method and device
WO2022110777A1 (en) * 2020-11-30 2022-06-02 浙江商汤科技开发有限公司 Positioning method and apparatus, electronic device, storage medium, computer program product, and computer program
CN112432636B (en) * 2020-11-30 2023-04-07 浙江商汤科技开发有限公司 Positioning method and device, electronic equipment and storage medium
CN112432636A (en) * 2020-11-30 2021-03-02 浙江商汤科技开发有限公司 Positioning method and device, electronic equipment and storage medium
CN112462784A (en) * 2020-12-03 2021-03-09 上海擎朗智能科技有限公司 Robot pose determination method, device, equipment and medium
CN114812381A (en) * 2021-01-28 2022-07-29 华为技术有限公司 Electronic equipment positioning method and electronic equipment
CN112948411B (en) * 2021-04-15 2022-10-18 深圳市慧鲤科技有限公司 Pose data processing method, interface, device, system, equipment and medium
CN112948411A (en) * 2021-04-15 2021-06-11 深圳市慧鲤科技有限公司 Pose data processing method, interface, device, system, equipment and medium
CN113124902A (en) * 2021-04-19 2021-07-16 追创科技(苏州)有限公司 Positioning correction method and device for mobile robot, storage medium, and electronic device
WO2022222345A1 (en) * 2021-04-19 2022-10-27 追觅创新科技(苏州)有限公司 Positioning correction method and apparatus for mobile robot, storage medium, and electronic apparatus
CN113124902B (en) * 2021-04-19 2024-05-14 追创科技(苏州)有限公司 Positioning correction method and device for mobile robot, storage medium and electronic device
CN113298879A (en) * 2021-05-26 2021-08-24 北京京东乾石科技有限公司 Visual positioning method and device, storage medium and electronic equipment
CN113298879B (en) * 2021-05-26 2024-04-16 北京京东乾石科技有限公司 Visual positioning method and device, storage medium and electronic equipment
CN113313765B (en) * 2021-05-28 2023-12-01 上海高仙自动化科技发展有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113313765A (en) * 2021-05-28 2021-08-27 上海高仙自动化科技发展有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113503883A (en) * 2021-06-22 2021-10-15 北京三快在线科技有限公司 Method for collecting data for constructing map, storage medium and electronic equipment
CN115222808A (en) * 2021-06-30 2022-10-21 达闼机器人股份有限公司 Positioning method and device based on unmanned aerial vehicle, storage medium and electronic equipment
CN113776517A (en) * 2021-09-03 2021-12-10 Oppo广东移动通信有限公司 Map generation method, device, system, storage medium and electronic equipment
CN113776517B (en) * 2021-09-03 2024-05-31 Oppo广东移动通信有限公司 Map generation method, device, system, storage medium and electronic equipment
CN113989636A (en) * 2021-09-23 2022-01-28 深圳市联洲国际技术有限公司 Household appliance positioning and identifying method and device, storage medium and terminal device
CN113989636B (en) * 2021-09-23 2024-05-28 深圳市联洲国际技术有限公司 Household appliance positioning and identifying method and device, storage medium and terminal equipment
CN113884006A (en) * 2021-09-27 2022-01-04 视辰信息科技(上海)有限公司 Space positioning method, system, equipment and computer readable storage medium
CN114485607B (en) * 2021-12-02 2023-11-10 陕西欧卡电子智能科技有限公司 Method, operation equipment, device and storage medium for determining motion trail
CN114485607A (en) * 2021-12-02 2022-05-13 陕西欧卡电子智能科技有限公司 Method for determining motion track, operation equipment, device and storage medium
CN114442605A (en) * 2021-12-16 2022-05-06 中国科学院深圳先进技术研究院 Positioning detection method, positioning detection device, autonomous mobile equipment and storage medium
CN114442605B (en) * 2021-12-16 2023-08-18 中国科学院深圳先进技术研究院 Positioning detection method, device, autonomous mobile equipment and storage medium
CN114332228A (en) * 2021-12-30 2022-04-12 高德软件有限公司 Data processing method, electronic device and computer storage medium
CN114310951B (en) * 2021-12-31 2024-04-26 北京航空航天大学杭州创新研究院 Pose optimization method, pose optimization device, grabbing equipment and computer readable storage medium
CN114310951A (en) * 2021-12-31 2022-04-12 北京航空航天大学杭州创新研究院 Pose optimization method and device, grabbing equipment and computer readable storage medium
WO2024001339A1 (en) * 2022-07-01 2024-01-04 华为云计算技术有限公司 Pose determination method and apparatus, and computing device

Also Published As

Publication number Publication date
CN111442722B (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN111442722B (en) Positioning method, positioning device, storage medium and electronic equipment
CN110246182B (en) Vision-based global map positioning method and device, storage medium and equipment
CN110118554B (en) SLAM method, apparatus, storage medium and device based on visual inertia
CN112258567B (en) Visual positioning method and device for object grabbing point, storage medium and electronic equipment
JP6442193B2 (en) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method and program
US20160327946A1 (en) Information processing device, information processing method, terminal device, and setting method
CN107748569B (en) Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
CN110470333B (en) Calibration method and device of sensor parameters, storage medium and electronic device
CN112964276B (en) Online calibration method based on laser and vision fusion
CN111652113B (en) Obstacle detection method, device, equipment and storage medium
CN111805535A (en) Positioning navigation method, device and computer storage medium
CN112556685A (en) Navigation route display method and device, storage medium and electronic equipment
WO2023273415A1 (en) Positioning method and apparatus based on unmanned aerial vehicle, storage medium, electronic device, and product
CN111380515A (en) Positioning method and device, storage medium and electronic device
CN117392241B (en) Sensor calibration method and device in automatic driving and electronic equipment
CN111583338B (en) Positioning method and device for unmanned equipment, medium and unmanned equipment
CN113252066B (en) Calibration method and device for parameters of odometer equipment, storage medium and electronic device
CN112689234A (en) Indoor vehicle positioning method and device, computer equipment and storage medium
CN109489658B (en) Moving target positioning method and device and terminal equipment
CN113503883B (en) Method for collecting data for constructing map, storage medium and electronic equipment
CN113513983B (en) Precision detection method and device, electronic equipment and medium
JP2016177749A (en) Mobile body control device, program, and integrated circuit
CN114571460A (en) Robot control method, device and storage medium
CN113932793A (en) Three-dimensional coordinate positioning method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination

TA01 Transfer of patent application right
Effective date of registration: 20210303
Address after: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai
Applicant after: Dalu Robot Co.,Ltd.
Address before: No.3, 7th floor, unit 1, building 5, No.399, Fucheng Avenue West, Chengdu, Sichuan 610094
Applicant before: CLOUDMINDS (CHENGDU) TECHNOLOGIES Co.,Ltd.

CB02 Change of applicant information
Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai
Applicant after: Dayu robot Co.,Ltd.
Address before: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai
Applicant before: Dalu Robot Co.,Ltd.

GR01 Patent grant