CN111157003A - Indoor mobile robot position tracking detection method - Google Patents

Indoor mobile robot position tracking detection method

Info

Publication number
CN111157003A
Authority
CN
China
Prior art keywords
mobile robot
tracker
image
infrared lamp
optical signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911398929.3A
Other languages
Chinese (zh)
Inventor
熊开胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Lyuchuang Detection Technology Service Co ltd
Original Assignee
Suzhou Lyuchuang Detection Technology Service Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Lyuchuang Detection Technology Service Co ltd filed Critical Suzhou Lyuchuang Detection Technology Service Co ltd
Priority to CN201911398929.3A
Publication of CN111157003A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a position tracking detection method for an indoor mobile robot, comprising the following steps: 1) installing a tracker on the mobile robot, the tracker emitting an optical signal and serving as the tracking point of the mobile robot; 2) placing an optical signal filter in front of the camera lens, the filter blocking all optical signals other than the signal emitted by the tracker; 3) the camera receives the optical signal emitted by the tracker, photographs the scene in which the mobile robot is located, and generates an image; 4) transmitting the image to a processor, which processes the image to obtain the coordinates of the center of the infrared lamp bead; 5) repeating steps 3) and 4) and transmitting the center coordinates obtained at successive times to a screen, thereby locating the position coordinates of the mobile robot in real time. Compared with the prior art, the method effectively reduces the amount of image data to be computed, avoids the recognition difficulties of ordinary image tracking, and achieves real-time positioning of the robot.

Description

Indoor mobile robot position tracking detection method
Technical Field
The invention relates to a tracking detection method, in particular to a position tracking detection method for an indoor mobile robot.
Background
Compared with traditional industrial robots, mobile robots offer high working efficiency, simple drive and control, and flexible, convenient operation, and are therefore widely used in many fields. In the civil field, mobile robots can replace humans in heavy or repetitive tasks such as substation equipment inspection, security patrols in shopping malls, and warehouse logistics distribution. In the military field, unmanned combat platforms, bomb-disposal and explosion-proof robots, and similar systems are increasingly widely deployed. As intelligent technology matures, mobile robots of all kinds will therefore find ever broader application.
Trajectory tracking control is a fundamental problem in mobile robot research and a core element of intelligent technology, so improving the trajectory tracking performance of mobile robots has both theoretical significance and practical value for raising the level of robot automation. Existing approaches rely on ordinary image tracking, which imposes a heavy computational load, and the load grows further when data from several cameras must be stitched together at the image seams. Moreover, when the target pattern is occluded, the algorithm has difficulty recognizing it. Overall, the computation is heavy, recognition is unreliable, the hardware requirements are high, and real-time positioning cannot be achieved.
Disclosure of Invention
In view of this, the invention provides an indoor mobile robot position tracking detection method that effectively reduces the amount of data to be computed, avoids recognition difficulties, and achieves real-time positioning of the robot.
To this end, the invention provides a position tracking detection method for an indoor mobile robot, comprising the following steps:
1) installing a tracker on the mobile robot, the tracker emitting an optical signal and serving as the tracking point of the mobile robot;
2) placing an optical signal filter in front of the camera lens, the filter blocking all optical signals other than the signal emitted by the tracker;
3) the camera receives the optical signal emitted by the tracker, photographs the scene in which the mobile robot is located, and generates an image;
4) transmitting the image to a processor, which processes the image to obtain the coordinates of the center of the infrared lamp bead;
5) repeating steps 3) and 4) and transmitting the center coordinates obtained at successive times to a screen, thereby locating the position coordinates of the mobile robot in real time.
Further, the tracker is an infrared lamp bead.
Further, the light signal emitted by the infrared lamp bead is infrared light.
Further, the infrared light has a center wavelength of 850 nm and a half bandwidth of 20 nm.
Further, the optical signal filter is an 850 nm narrow-band filter.
Further, in the image only the infrared lamp bead appears as light at a wavelength of 850 nm, and all other regions are black.
The method provided by the invention comprises the five steps set out above. Concretely, an 850 nm narrow-band filter is placed in front of the camera lens to block unwanted light, so that only light with a center wavelength of 850 nm and a half bandwidth of 20 nm reaches the camera, and the light emitted by an 850 nm infrared lamp bead serves as the tracking point. The image captured by the camera therefore contains only the light of the 850 nm infrared lamp bead, with everything else black, so the algorithm can quickly locate the center of the bright spot and the position coordinates of the mobile robot can be determined in real time. Moreover, because the bright spot is small, the probability of it being occluded by objects in the room is also reduced.
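As an illustration of the image-processing step, the following is a minimal sketch, assuming Python with OpenCV and a grayscale frame from the filtered camera; the function name and threshold value are illustrative choices, not part of the patented method.

```python
import cv2

def find_spot_center(gray_frame, threshold=200):
    """Return the (x, y) pixel coordinates of the bright spot's center, or None.

    Because the 850 nm narrow-band filter leaves everything except the
    infrared lamp bead black, a fixed threshold plus image moments is
    enough to locate the tracking point.
    """
    # Keep only the pixels bright enough to belong to the lamp bead.
    _, mask = cv2.threshold(gray_frame, threshold, 255, cv2.THRESH_BINARY)

    # Image moments of the binary mask give the centroid of the bright pixels.
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:          # no bright pixels: the bead is not visible
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

A moment-based centroid needs only one pass over the mostly black image and is insensitive to the exact shape of the bright spot.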
Therefore, compared with the prior art, the indoor mobile robot position tracking detection method provided by the invention effectively reduces the amount of image data to be computed, avoids the recognition difficulties of ordinary image tracking, and achieves real-time positioning of the robot.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a schematic structural diagram of a robot in an indoor mobile robot position tracking and detecting method according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Embodiment one:
Referring to fig. 1, a position tracking detection method for an indoor mobile robot according to an embodiment of the invention comprises the following steps:
1) installing a tracker on the mobile robot, the tracker emitting an optical signal and serving as the tracking point of the mobile robot;
2) placing an optical signal filter in front of the camera lens, the filter blocking all optical signals other than the signal emitted by the tracker;
3) the camera receives the optical signal emitted by the tracker, photographs the scene in which the mobile robot is located, and generates an image;
4) transmitting the image to a processor, which processes the image to obtain the coordinates of the center of the infrared lamp bead;
5) repeating steps 3) and 4) and transmitting the center coordinates obtained at successive times to a screen, thereby locating the position coordinates of the mobile robot in real time.
As described above, an 850 nm narrow-band filter placed in front of the camera lens blocks unwanted light, so that only light with a center wavelength of 850 nm and a half bandwidth of 20 nm reaches the camera, and the light emitted by an 850 nm infrared lamp bead serves as the tracking point. In the captured image only this lamp bead appears as a bright spot on an otherwise black background, so the algorithm can quickly locate the center of the spot and the position coordinates of the mobile robot can be determined in real time; because the spot is small, the probability of it being occluded by objects in the room is also reduced.
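A minimal real-time loop corresponding to steps 3) to 5) might look as follows; this is a sketch assuming Python with OpenCV, a camera at index 0 already fitted with the 850 nm filter, and the find_spot_center helper sketched earlier. The camera index and the on-screen display are illustrative, not prescribed by the invention.

```python
import cv2

def track(camera_index=0):
    cap = cv2.VideoCapture(camera_index)        # camera fitted with the 850 nm filter
    try:
        while True:
            ok, frame = cap.read()              # step 3): photograph the scene
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            center = find_spot_center(gray)     # step 4): locate the lamp bead
            if center is not None:
                x, y = center
                # step 5): report the coordinates, here drawn onto the screen
                cv2.circle(frame, (int(x), int(y)), 8, (0, 0, 255), 2)
                cv2.putText(frame, f"({x:.1f}, {y:.1f})", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
            cv2.imshow("robot position", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to stop tracking
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```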
Therefore, compared with the prior art, the indoor mobile robot position tracking detection method provided by the invention effectively reduces the amount of image data to be computed, avoids the recognition difficulties of ordinary image tracking, and achieves real-time positioning of the robot.
Embodiment two:
Referring to fig. 1, a second embodiment of the indoor mobile robot position tracking detection method is shown. As an improvement on the above embodiment, this embodiment further adopts the following technical solution: the tracker is an infrared lamp bead, the light it emits is infrared, and the infrared light has a center wavelength of 850 nm and a half bandwidth of 20 nm. The infrared lamp bead is small, which reduces the probability of it being occluded by objects in the room, enlarges the range over which the camera can receive its signal, and thereby makes tracking control of the robot easier.
Embodiment three:
Referring to fig. 1, a third embodiment of the indoor mobile robot position tracking detection method is shown. As an improvement on the above embodiments, this embodiment further adopts the following technical solution: the optical signal filter is an 850 nm narrow-band filter, so that in the image only the infrared lamp bead appears as light at a wavelength of 850 nm and all other regions are black. Because the 850 nm narrow-band filter removes natural light, the camera receives only the infrared light emitted by the lamp bead on the mobile robot; in the captured image the lamp bead is the only bright region against a black background, so interference from the surroundings and the amount of image data to be processed are both reduced.
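To make the saving concrete, the sketch below (assuming Python with NumPy and a grayscale frame from the filtered camera; the function name and threshold are illustrative) locates the bead by examining only the handful of non-black pixels, which is why the computation stays small.

```python
import numpy as np

def spot_center_from_dark_frame(gray_frame, threshold=200):
    """Centroid of the bright pixels in a frame that is black except for the bead."""
    ys, xs = np.nonzero(gray_frame > threshold)   # typically only a few pixels remain
    if xs.size == 0:
        return None                               # bead occluded or out of view
    # Averaging the coordinates of the few bright pixels gives the spot center.
    return (float(xs.mean()), float(ys.mean()))
```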
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (6)

1. A position tracking detection method for an indoor mobile robot, characterized by comprising the following steps:
1) installing a tracker on the mobile robot, the tracker emitting an optical signal and serving as the tracking point of the mobile robot;
2) placing an optical signal filter in front of the camera lens, the filter blocking all optical signals other than the signal emitted by the tracker;
3) the camera receives the optical signal emitted by the tracker, photographs the scene in which the mobile robot is located, and generates an image;
4) transmitting the image to a processor, which processes the image to obtain the coordinates of the center of the infrared lamp bead;
5) repeating steps 3) and 4) and transmitting the center coordinates obtained at successive times to a screen, thereby locating the position coordinates of the mobile robot in real time.
2. The method of claim 1, wherein the tracker is an infrared lamp bead.
3. The method of claim 2, wherein the light signal emitted by the infrared lamp bead is infrared light.
4. The method of claim 3, wherein the infrared light has a center wavelength of 850 nm and a half bandwidth of 20 nm.
5. The method of claim 1, wherein the optical signal filter is an 850 nm narrow-band filter.
6. The method of claim 5, wherein in the image only the infrared lamp bead appears as light at a wavelength of 850 nm, and all other regions are black.
CN201911398929.3A 2019-12-30 2019-12-30 Indoor mobile robot position tracking detection method Pending CN111157003A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911398929.3A CN111157003A (en) 2019-12-30 2019-12-30 Indoor mobile robot position tracking detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911398929.3A CN111157003A (en) 2019-12-30 2019-12-30 Indoor mobile robot position tracking detection method

Publications (1)

Publication Number Publication Date
CN111157003A (en) 2020-05-15

Family

ID=70559593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911398929.3A Pending CN111157003A (en) 2019-12-30 2019-12-30 Indoor mobile robot position tracking detection method

Country Status (1)

Country Link
CN (1) CN111157003A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102033222A (en) * 2010-11-17 2011-04-27 吉林大学 Large-scale multiple-object ultrasonic tracking and locating system and method
CN105973226A (en) * 2016-06-21 2016-09-28 昆山穿山甲机器人有限公司 Indoor robot locating and navigating system and method
CN106826821A (en) * 2017-01-16 2017-06-13 深圳前海勇艺达机器人有限公司 The method and system that robot auto-returned based on image vision guiding charges
US20180365840A1 (en) * 2017-06-19 2018-12-20 Inuitive Ltd. Optical module and a method for objects' tracking under poor light conditions
CN107797560A (en) * 2017-11-28 2018-03-13 广州中国科学院先进技术研究所 A kind of visual identifying system and method for robotic tracking
CN110298334A (en) * 2019-07-05 2019-10-01 齐鲁工业大学 Tracking robot multi-targets recognition device based on infrared image processing

Similar Documents

Publication Publication Date Title
Alzugaray et al. Asynchronous corner detection and tracking for event cameras in real time
CN110442120B (en) Method for controlling robot to move in different scenes, robot and terminal equipment
CN107170011B (en) robot vision tracking method and system
WO2019096902A1 (en) System and method for real-time large image homography processing
CN103345644A (en) Method and device for detecting online-training targets
TWI726278B (en) Driving detection method, vehicle and driving processing device
CN105486288A (en) Machine-vision-based vision servo alignment system
Suarez et al. Using the Kinect for search and rescue robotics
Dietsche et al. Powerline tracking with event cameras
Iaboni et al. Event camera based real-time detection and tracking of indoor ground robots
KR100970121B1 (en) Method, system, and computer-readable recording medium for performing image matching adaptively according to various conditions
WO2021196969A1 (en) Positioning method and apparatus, device, and medium
CN107808109B (en) Two-dimensional code image identification method and mobile terminal
CN111964680A (en) Real-time positioning method of inspection robot
CN112917467A (en) Robot positioning and map building method and device and terminal equipment
CN112784675B (en) Target detection method and device, storage medium and terminal
CN111157003A (en) Indoor mobile robot position tracking detection method
Chen et al. Efficient and lightweight grape and picking point synchronous detection model based on key point detection
Li et al. Object detection based on color and shape features for service robot in semi-structured indoor environment
Martins et al. Real-time generic ball recognition in RoboCup domain
Linder et al. Towards accurate 3D person detection and localization from RGB-D in cluttered environments
Das et al. Vision based object tracking by mobile robot
Meng et al. Prob-slam: real-time visual slam based on probabilistic graph optimization
US20160041036A1 (en) Method of non-contact control using a polarizing pen and system incorporating same
Lin et al. A monocular target pose estimation system based on an infrared camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination