CN111844041B - Positioning assistance device, robot and visual positioning system - Google Patents

Positioning assistance device, robot and visual positioning system

Info

Publication number
CN111844041B
CN111844041B (application CN202010719595.1A)
Authority
CN
China
Prior art keywords
information
display
positioning
target
feature information
Prior art date
Legal status
Active
Application number
CN202010719595.1A
Other languages
Chinese (zh)
Other versions
CN111844041A (en)
Inventor
姚达琛
刘皓松
李�诚
Current Assignee
Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Original Assignee
Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Priority to CN202010719595.1A
Publication of CN111844041A
Application granted
Publication of CN111844041B
Active legal status (Current)
Anticipated expiration legal status

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a positioning assistance device, a robot, and a visual positioning system. The positioning assistance device is arranged on a target device to be positioned and comprises a display unit, a control unit and a communication unit. The display unit is used for displaying feature information, where the feature information can be used in visual positioning to determine pose information of the target device; the control unit is connected with the display unit and is used for controlling the display unit to display the feature information; the communication unit is connected with the control unit and is used for transmitting information between the target device and other devices. With the positioning assistance device, visual positioning can be conveniently achieved for various types of target devices, and a target device can be given a communication capability.

Description

Positioning assistance device, robot and visual positioning system
Technical Field
The application relates to the technical field of visual positioning, and in particular to a positioning assistance device, a robot and a visual positioning system.
Background
Visual positioning refers to detecting and locating a robot based on images acquired by a camera, so as to obtain pose information of the robot.
At present, visual recognition equipment can only recognize robots that it has been trained on and that are present in its database; robots outside the database cannot be recognized, which greatly limits the application and popularization of visual positioning technology.
Disclosure of Invention
The present application provides at least a positioning assistance device, a robot and a visual positioning system.
A first aspect of the application provides a positioning assistance device, which is arranged on a target device to be positioned and comprises a display unit, a control unit and a communication unit. The display unit is used for displaying feature information, where the feature information can be used in visual positioning to determine pose information of the target device; the control unit is connected with the display unit and is used for controlling the display unit to display the feature information; the communication unit is connected with the control unit and is used for transmitting information between the target device and other devices.
Because the display unit displays the feature information, the visual positioning equipment can directly recognize the positioning assistance device and thereby obtain the pose information of the target device. This facilitates visual positioning of various types of target devices without having to train the visual positioning equipment on each target device, which reduces training cost. Moreover, the communication unit enables information interaction between the target device and other devices, so that even a target device without its own communication function gains the ability to communicate with the outside.
The communication unit comprises a first communication module and a second communication module; the first communication module is used for connecting with other devices, and the second communication module is used for connecting with the target device.
In this way, the first communication module and the second communication module respectively enable the positioning assistance device to communicate with other devices and with the target device.
The second communication module is provided with an interface for wired connection with the target device, and the interface is a serial interface.
By providing the interface as a serial interface, wired communication with the target device can be achieved.
The control unit is specifically configured to acquire display control information from other devices or from the target device through the communication unit, and to control the display unit to display the feature information based on the display control information.
By acquiring the display control information, the control unit can display the corresponding feature information, so that the target device can be recognized by visual positioning and its pose information can be obtained.
Where the display control information is device identification information of the target device, controlling the display unit to display the feature information based on the display control information includes: detecting whether there is associated feature information corresponding to the device identification information; if such associated feature information exists, controlling the display unit to display the associated feature information bound to the device identification information; if it does not exist, selecting candidate feature information from a set of candidate feature information to be bound, controlling the display unit to display the selected candidate feature information, and storing the selected candidate feature information as associated feature information bound to the device identification information.
By displaying the associated feature information, the visual positioning equipment can confirm, by recognizing that information, that the device bound to it is running, and can obtain its pose information. Likewise, once a correspondence between the selected candidate feature information and the device identification information has been established, the visual positioning equipment can obtain the pose information of the target device and confirm that it is running by recognizing the selected candidate feature information.
Where the display control information is preset feature information, controlling the display unit to display the feature information based on the display control information includes: controlling the display unit to display the preset feature information.
In this case the feature information displayed by the display unit is simply the preset feature information carried in the display control information, so that the visual positioning equipment can recognize the target device and obtain its pose information.
The display unit is a light emitting diode array.
By providing the display unit as a light emitting diode array, the feature information can be displayed through the on/off pattern of the diodes so that it can be recognized.
A second aspect of the present application provides a robot comprising a robot body and a positioning assistance device, where the positioning assistance device is arranged on the robot body and is the device described in the first aspect.
A third aspect of the application provides a visual positioning system comprising a target device and visual positioning equipment. The target device is provided with the positioning assistance device described in the first aspect. The visual positioning equipment is used for performing visual positioning according to the feature information displayed by the positioning assistance device, so as to determine the pose information of the target device.
According to the above scheme, because the display unit displays the feature information, the visual positioning equipment can directly recognize the positioning assistance device and thereby obtain the pose information of the target device. This facilitates visual positioning of various types of target devices without training the visual positioning equipment on each target device, and reduces training cost. Moreover, the communication unit enables information interaction between the target device and other devices, so that a target device without its own communication function gains the ability to communicate with the outside.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of an embodiment of a visual positioning system of the present application;
FIG. 2 is a first schematic framework diagram of an embodiment of the positioning assistance device of the present application;
FIG. 3 is a second schematic framework diagram of an embodiment of the positioning assistance device of the present application;
FIG. 4 is a schematic framework diagram of an embodiment of the robot of the present application.
Detailed Description
The following describes in detail the embodiments of the present application with reference to the drawings attached hereto.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, interfaces, techniques, etc. in order to provide a thorough understanding of the present application.
The term "and/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the objects before and after it are in an "or" relationship. Further, the term "plurality" herein means two or more. The term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B and C" may mean including any one or more elements selected from the set consisting of A, B and C.
Visual positioning refers to detecting and locating a robot based on images acquired by a camera, so as to obtain pose information of the robot. In the present application, visual positioning may be performed using visual positioning equipment. The visual positioning equipment includes a calibrated camera, which serves as the image acquisition device, and may also include an image analysis device that analyzes the acquired images to obtain the pose information of the target device. It can be understood that the number of cameras is not limited and may be one or several.
In the application, the positioning assistance device may be arranged on the target device, so that the visual positioning equipment detects and locates the positioning assistance device, obtains the pose information of the positioning assistance device, and uses it as the pose information of the target device. The visual positioning equipment therefore does not need to recognize the target device itself, and does not need to spend a large amount of time in advance training a recognition model for it, so the target robot can be quickly connected to the visual positioning system.
Referring to fig. 1, fig. 1 is a schematic diagram of an embodiment of a visual positioning system of the present application. In the disclosed embodiment, the visual positioning system 10 includes a target device 11, a positioning assistance device 12, and visual positioning equipment 13. The target device 11 may be a robot, for example a movable smart cart, a transfer robot, or the like. The positioning assistance device 12 may be arranged on the target device 11 and used to display feature information, so that the visual positioning equipment 13 can recognize the positioning assistance device 12 and locate it. The feature information may be any information that can be visually recognized, for example a figure or a character. Generally, in order to perform visual positioning from the feature information, a visual positioning model may be trained in advance with images containing different feature information; the visual positioning equipment 13 then uses the trained model to recognize the feature information displayed by the positioning assistance device 12 and uses it for positioning, obtaining the pose information of the target device 11.
The positioning assistance device 12 may, for example, be arranged on top of the target device 11 to reduce the probability that it is occluded, which makes it easier for the visual positioning equipment 13 to obtain the pose information of the target device 11 from the positioning assistance device 12. The visual positioning equipment 13 recognizes the feature information displayed by the positioning assistance device 12 using visual positioning technology to determine the pose information of the target device 11. The visual positioning equipment 13 includes, for example, several calibrated cameras 131 and an image analysis device 132; the image analysis device 132 is, for example, a computer capable of analyzing the images captured by the cameras to obtain the pose information of the target device 11.
Visual positioning technology uses image capturing devices such as cameras to capture images of a target object, recognizes the feature information about the target object in the captured images, computes the three-dimensional position and rotation of the target object in the camera coordinate system from that feature information, and from these obtains the relative pose of the image capturing device with respect to the target object and finally the pose information of the target object relative to the ground. Therefore, the visual positioning equipment 13 can capture an image of the feature information displayed by the positioning assistance device 12, recognize the feature information in the image, determine the pose information of the positioning assistance device 12 from it, and, because the positioning assistance device 12 is mounted on the target device 11, obtain the pose information of the target device 11. The positioning assistance device is further described below.
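To make this positioning step concrete, the following sketch (not part of the patent; the function names, pattern geometry and camera intrinsics are illustrative assumptions) shows how the pose of a displayed feature pattern could be recovered from a calibrated camera image with a standard perspective-n-point solver such as OpenCV's solvePnP:

```python
# Minimal pose-estimation sketch (not from the patent): given the pixel
# coordinates of a recognized feature pattern and the known geometry of that
# pattern on the positioning aid, recover the aid's pose in the camera frame.
import numpy as np
import cv2

def estimate_pose(image_points, pattern_points, camera_matrix, dist_coeffs):
    """Return the rotation (3x3) and translation (3x1) of the pattern in the
    camera frame. pattern_points are the pattern corners in metres in the
    pattern's own coordinate system; image_points are the matching pixel
    coordinates produced by the recognizer."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(pattern_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP failed: pattern not localized")
    rotation, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 matrix
    return rotation, tvec

# Illustrative setup: a 10 cm square pattern and a calibrated camera.
pattern = [(-0.05, -0.05, 0.0), (0.05, -0.05, 0.0),
           (0.05, 0.05, 0.0), (-0.05, 0.05, 0.0)]
K = np.array([[900.0, 0.0, 640.0], [0.0, 900.0, 360.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume distortion was removed by calibration
# image_points would come from detecting the displayed pattern in the image.
```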
Referring to fig. 2, fig. 2 is a first schematic framework diagram of an embodiment of the positioning assistance device of the present application. In the disclosed embodiment, the positioning assistance device 20 includes a display unit 21, a control unit 22, and a communication unit 23.
The display unit 21 is used to display the feature information, which can be used in visual positioning to determine the pose information of the target device. Because the feature information is information the visual positioning equipment has already been trained on, the equipment can recognize it using visual positioning technology and so determine the pose of the target device. The feature information is, for example, a specific two-dimensional pattern such as a cross or a square. It can be understood that there may be several pieces of feature information, and the display unit 21 may accordingly display several of them. In one disclosed embodiment, the display unit 21 is a light emitting diode (LED) array; a specific pattern serving as the feature information is formed by switching some of the LEDs on and others off, or by lighting all of them. The LED array may be, for example, a 5 × 5 array or any other arrangement; the arrangement is not limited and can be adjusted as needed. In another disclosed embodiment, the display unit 21 may instead be a display screen, such as an LED display screen or a liquid crystal display screen.
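As a purely illustrative sketch of how such patterns might be represented, the following code models two 5 × 5 feature patterns as on/off matrices and drives them through a stand-in LED driver callback; the patterns and the set_led interface are assumptions, not something defined by the patent:

```python
# Illustrative only: how a control unit might encode feature patterns for a
# 5x5 LED array. The patterns and the set_led driver call are assumptions.
CROSS = [
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]

SQUARE = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

def show_pattern(pattern, set_led):
    """Switch each LED on or off; set_led(row, col, on) stands in for
    whatever GPIO or driver interface the actual hardware provides."""
    for r, row in enumerate(pattern):
        for c, on in enumerate(row):
            set_led(r, c, bool(on))

# Example: render the pattern to the console instead of real hardware.
show_pattern(CROSS, lambda r, c, on: print("#" if on else ".",
                                           end="\n" if c == 4 else ""))
```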
By using the display unit 21 to display the feature information, the visual positioning equipment can directly recognize the positioning assistance device 20 and thereby obtain the pose information of the target device, without being trained on the target device itself. The types of target device are therefore not limited, and various types of target device can quickly be made recognizable to visual positioning.
In the positioning assistance device 20 of the disclosed embodiment, the control unit 22 is connected to the display unit 21 and controls it to display the feature information. The control unit 22 is, for example, a control circuit containing a processor and capable of issuing control instructions. After receiving a control instruction from the control unit 22, the display unit 21 displays the corresponding feature information.
The communication unit 23 is connected to the control unit 22 and transmits information between the target device and other devices, such as the visual positioning equipment or other target devices; for example, the communication unit 23 may enable information interaction between the target device and the visual positioning equipment.
Therefore, because the display unit 21 displays the feature information, the visual positioning equipment can directly recognize the positioning assistance device 20 and obtain the pose information of the target device, which facilitates visual positioning of various types of target devices without training the visual positioning equipment on each of them and reduces training cost. Furthermore, the communication unit 23 of the positioning assistance device 20 enables information interaction between the target device and other devices, so that a target device without its own communication function gains the ability to communicate with the outside.
Referring to fig. 3, fig. 3 is a second schematic framework diagram of the positioning assistance device according to an embodiment of the present application. This embodiment is a further extension of the positioning assistance device embodiment above. Specifically, the positioning assistance device 30 includes a display unit 31, a control unit 32, and a communication unit 33.
Specifically, the communication unit 33 includes a first communication module 331 and a second communication module 332. The first communication module 331 is used for wireless connection with other devices and is, for example, a mobile hotspot module. The visual positioning equipment can connect wirelessly to the communication unit 33 and interact with the target device through the positioning assistance device 30; likewise, each target device can interact with other target devices through its own first communication module. The second communication module 332 is used for connection with the target device. For example, the second communication module 332 is provided with an interface so that the target device can be wired to the communication unit 33 through it and exchange information with the visual positioning equipment and with the positioning assistance device 30. The interface of the second communication module 332 is, for example, a serial interface.
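A minimal sketch of such a relay is shown below, assuming the first communication module is reachable over an IP socket and the second over a serial port; the port names, baud rate and message framing are illustrative assumptions rather than anything specified by the patent:

```python
# Illustrative sketch (not from the patent): a communication unit that relays
# messages between a wireless link to the visual positioning equipment and a
# wired serial link to the target device.
import socket
import serial  # pyserial

def relay_once(host="0.0.0.0", port=9000, tty="/dev/ttyUSB0", baud=115200):
    target = serial.Serial(tty, baud, timeout=1)       # second module: wired serial link
    with socket.create_server((host, port)) as srv:    # first module: wireless/IP link
        conn, _ = srv.accept()
        with conn:
            msg = conn.recv(1024)                       # e.g. display control information
            target.write(msg + b"\n")                   # forward to the target device
            reply = target.readline()                   # e.g. device identification information
            conn.sendall(reply)
    target.close()
```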
It can be seen that, through the communication unit 33 of the positioning assistance device 30, the target device can be connected with other devices and exchange information with them. A target device without its own communication function therefore gains the ability to communicate with the outside and can be connected to the visual positioning and recognition system more easily.
In one disclosed embodiment, the control unit 32 acquires display control information from other devices or from the target device through the communication unit 33. Display control information is control information that the visual positioning equipment or the target device sends to the positioning assistance device 30 to make it display specific feature information. Specifically, the visual positioning equipment or the target device sends the display control information to the communication unit 33 of the positioning assistance device 30, and the control unit 32 acquires it through the communication unit 33.
After acquiring the display control information, the control unit 32 controls the display unit 31 to display feature information based on it. For example, the control unit 32 may parse the display control information, obtain the feature information it corresponds to, and control the display unit 31 to display that feature information. Alternatively, the display control information may itself be the feature information, that is, preset feature information; in that case, after acquiring it, the control unit 32 directly controls the display unit 31 to display the preset feature information. By acquiring the display control information, the control unit 32 displays the corresponding feature information, so that the target device can be recognized by visual positioning and its pose information obtained.
In one disclosed embodiment, the display control information is device identification information of the target device. Device identification information is information that identifies a device, such as its serial number. After acquiring the device identification information of the target device, the control unit 32 can determine, by parsing it, information such as the specific type of the target device connected to the positioning assistance device 30.
When the display control information is device identification information of the target device, the control unit 32 controls the display unit 31 to display the feature information based on the display control information through the following steps.
Step 1: the control unit 32 detects whether there is associated feature information corresponding to the device identification information.
The positioning assistance device 30 may store in advance the device identification information of robots that have already been connected to the visual positioning system, and a specific piece of associated feature information may be assigned to each connected device. Associated feature information is feature information that corresponds to a particular target device. For example, if device No. 1 has been connected to the visual positioning system, the associated feature information assigned to it may be a first piece of feature information; thereafter, whenever the visual positioning equipment recognizes that feature information, it can confirm that device No. 1 is running. By parsing the acquired device identification information, the control unit 32 can therefore determine whether the device has already been connected to the visual positioning system and whether associated feature information corresponding to it exists; that is, it detects whether there is associated feature information corresponding to the device identification information.
Step 2: if such associated feature information exists, the display unit 31 is controlled to display the associated feature information bound to the device identification information.
If the control unit 32 detects that associated feature information corresponding to the device identification information exists, the device has already been connected to the visual positioning system. The control unit 32 then controls the display unit 31 to display the associated feature information bound to the device identification information, so that the visual positioning equipment can confirm from the displayed feature information that this device is running.
By displaying the associated feature information, the visual positioning equipment can confirm, by recognizing it, that the device bound to that information is operating.
Step 3: if no such associated feature information exists, candidate feature information is selected from a set of candidate feature information to be bound, the display unit 31 is controlled to display the selected candidate feature information, and the selected candidate feature information is stored as associated feature information bound to the device identification information.
If the control unit 32 detects that there is no associated feature information corresponding to the device identification information, the device has not previously been connected to the visual positioning system and is a newly added device. The control unit 32 then selects one piece of candidate feature information from the set of candidate feature information to be bound and controls the display unit 31 to display it. Candidate feature information to be bound is feature information that currently corresponds to no device identification information. At the same time, the control unit 32 stores the selected candidate feature information as associated feature information bound to the device identification information. Once this correspondence is established, the selected candidate feature information is the associated feature information of the device, and the control unit 32 may remove it from the set of candidate feature information to be bound.
By assigning candidate feature information to a target device that has no associated feature information yet, the visual positioning equipment can obtain the pose information of the target device by recognizing the selected candidate feature information. Moreover, once the correspondence between the selected candidate feature information and the device identification information has been established, the visual positioning equipment can confirm that the corresponding target device is running by recognizing that feature information.
It can be understood that the entity that assigns a specific piece of associated feature information to each connected device may be the control unit 32 or the visual positioning equipment. Whichever of them assigns the information, it may send the relevant information to the other, so that both sides hold the same correspondence between device identification information and associated feature information. In particular, after the control unit 32 stores the selected candidate feature information as associated feature information bound to the device identification information, it may send the relevant information to the visual positioning equipment through the first communication module 331 of the communication unit 33, so that the visual positioning equipment can establish the same correspondence between the selected candidate feature information and the device identification information.
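The binding flow of steps 1 to 3 can be summarised in the following sketch; the dictionary, list and callback names are illustrative assumptions and do not describe the patent's actual firmware:

```python
# Illustrative sketch of the binding logic in steps 1-3 above. Real firmware
# would persist the bindings and notify the visual positioning equipment over
# the first communication module.
bound = {}                                            # device identification info -> associated feature info
candidates = ["pattern_A", "pattern_B", "pattern_C"]  # feature info still to be bound

def handle_device_id(device_id, display, notify_peer):
    """Step 1: look up the device; step 2: reuse its bound pattern;
    step 3: otherwise bind and display a fresh candidate pattern."""
    if device_id in bound:                  # steps 1 and 2
        pattern = bound[device_id]
    else:                                   # step 3: newly added device
        if not candidates:
            raise RuntimeError("no unbound feature information left")
        pattern = candidates.pop(0)
        bound[device_id] = pattern
        notify_peer(device_id, pattern)     # keep the visual positioning side in sync
    display(pattern)
    return pattern

# Example usage with stand-in display/notify callbacks.
handle_device_id("robot-001", display=print, notify_peer=lambda d, p: None)
handle_device_id("robot-001", display=print, notify_peer=lambda d, p: None)  # reuses the binding
```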
With the above arrangement, because the display unit 31 displays the feature information, the visual positioning equipment can directly recognize the positioning assistance device 30 and thereby obtain the pose information of the target device. In addition, the communication unit 33 of the positioning assistance device 30 enables information interaction between the target device and other devices.
Referring to fig. 4, fig. 4 is a schematic framework diagram of a robot according to an embodiment of the present application. The robot 40 of this embodiment includes a robot body 41 and a positioning assistance device 42, where the positioning assistance device 42 may be the device described in the embodiments above. The positioning assistance device 42 is arranged on the robot body 41, for example on top of the robot, so that the feature information it displays does not become unrecognizable to the visual positioning equipment when the robot rotates or moves. The robot body 41 is, for example, a cart, a transfer robot, a robotic dog, or the like.
Therefore, by arranging the positioning assistance device 42 on the robot body 41, the robot body 41 does not need to be registered with the relevant visual positioning system in advance: its pose can be obtained through the positioning assistance device 42 alone, so the robot body 41 can quickly be connected to the visual positioning system, and the communication function of the positioning assistance device 42 allows the robot body 41 to communicate with other robots or with the visual positioning equipment.
The foregoing description of the embodiments is intended to highlight the differences between them; for the same or similar parts, the embodiments may be referred to one another, and for brevity these are not described again herein.
The above embodiments are merely examples and are not intended to limit the scope of the present disclosure; all equivalent modifications of structure or process made using the contents of the specification and drawings of the present disclosure, or applications thereof, direct or indirect, in other related technical fields, are likewise included within the scope of protection of the present disclosure.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a unit or a component may be combined or integrated with another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.

Claims (8)

1. A positioning assistance device, for being disposed on a target device to be positioned, comprising:
a display unit for displaying feature information, wherein the feature information can be used in visual positioning to determine pose information of the target device;
a control unit connected with the display unit and configured to control the display unit to display the feature information; and
a communication unit connected with the control unit and configured to transmit information between the target device and another device;
wherein the control unit is specifically configured to acquire display control information of the other device through the communication unit and to control the display unit to display the feature information based on the display control information, the other device being visual positioning equipment;
and wherein, in a case where the display control information is device identification information of the target device, the control unit controlling the display unit to display the feature information based on the display control information comprises:
detecting whether associated feature information corresponding to the device identification information exists;
if such associated feature information exists, controlling the display unit to display the associated feature information bound to the device identification information;
if no such associated feature information exists, selecting candidate feature information from a set of candidate feature information to be bound, controlling the display unit to display the selected candidate feature information, and storing the selected candidate feature information as associated feature information bound to the device identification information.
2. The apparatus according to claim 1, wherein the communication unit comprises a first communication module and a second communication module, the first communication module is configured to connect with the other device, and the second communication module is configured to connect with the target device.
3. The apparatus of claim 2, wherein the first communication module is configured to wirelessly connect with the other device; the second communication module is provided with an interface for wired connection with the target device.
4. The apparatus of claim 3, wherein the interface is a serial interface.
5. The apparatus according to claim 1, wherein in a case where the display control information is preset feature information, the control unit performs control of the display unit to display the feature information based on the display control information, including:
and controlling the display unit to display the preset characteristic information.
6. The device according to any one of claims 1-5, wherein the display unit is an array of light emitting diodes.
7. A robot comprising a robot body and a positioning assistance device, wherein the positioning assistance device is provided on the robot body and is the device according to any one of claims 1 to 6.
8. A visual positioning system, comprising:
a target device provided with the positioning assistance device of any one of claims 1 to 6;
and visual positioning equipment for performing visual positioning according to the feature information displayed by the positioning assistance device, so as to determine pose information of the target device.
CN202010719595.1A 2020-07-23 2020-07-23 Positioning assistance device, robot and visual positioning system Active CN111844041B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010719595.1A CN111844041B (en) 2020-07-23 2020-07-23 Positioning assistance device, robot and visual positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010719595.1A CN111844041B (en) 2020-07-23 2020-07-23 Positioning assistance device, robot and visual positioning system

Publications (2)

Publication Number Publication Date
CN111844041A (en) 2020-10-30
CN111844041B (en) 2021-11-09

Family

ID=72949358

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010719595.1A Active CN111844041B (en) 2020-07-23 2020-07-23 Positioning assistance device, robot and visual positioning system

Country Status (1)

Country Link
CN (1) CN111844041B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105573316A (en) * 2015-12-01 2016-05-11 武汉科技大学 Autonomous-formation mobile swarm robot
CN107076557A (en) * 2016-06-07 2017-08-18 深圳市大疆创新科技有限公司 Mobile robot recognition positioning method, device, system and mobile robot
CN110774294A (en) * 2019-11-15 2020-02-11 国家电网有限公司 Positioning and laminating system and method for ultrasonic partial discharge probe of GIS detection robot
CN212326284U (en) * 2018-11-28 2021-01-12 尚科宁家运营有限公司 Optical beacon

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9727963B2 (en) * 2015-09-18 2017-08-08 Auris Surgical Robotics, Inc. Navigation of tubular networks

Also Published As

Publication number Publication date
CN111844041A (en) 2020-10-30

Similar Documents

Publication Publication Date Title
US10614569B2 (en) Mobile processing device system for patient monitoring data acquisition
CN106826822B (en) A kind of vision positioning and mechanical arm crawl implementation method based on ROS system
JP3749369B2 (en) Hand pointing device
CN105518560A (en) Location-based control method and apparatus, mobile machine and robot
CN104487014A (en) Efficient and interactive bleeding detection in a surgical system
CN111383266B (en) Object tracking system and object tracking method
US9479228B2 (en) Unit device and wireless power supply information providing system
KR101239532B1 (en) Apparatus and method for recognizing position of robot
CN102184053A (en) Novel projector unit
CN205787823U (en) A kind of control device coordinating visual identity to apply
JPH1163927A (en) Head position and posture measuring device, and operation monitoring device
JPWO2016163564A1 (en) Information processing apparatus, information processing system, position notification method, and program
WO2013025011A1 (en) Method and system for body tracking for recognizing gestures in a space
CN111844041B (en) Positioning assistance device, robot and visual positioning system
JP6445935B2 (en) Work support device, work support method, and work support program
US11376736B2 (en) Activity area restriction method, robot control apparatus, and notifier
CN111975776A (en) Robot movement tracking system and method based on deep learning and Kalman filtering
JP2002288637A (en) Environmental information forming method
CN112528699B (en) Method and system for obtaining identification information of devices or users thereof in a scene
CN202093496U (en) Novel projector
WO2020244480A1 (en) Relative positioning device, and corresponding relative positioning method
JP2015188284A (en) Terminal, server, work support system and work support method
CN113813168A (en) Meridian point follow-up identification method and device, equipment and storage medium
CN113807204A (en) Human body meridian recognition method and device, equipment and storage medium
TWI689743B (en) Object positioning system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant