CN113341977A - Mobile robot based on passive infrared label location - Google Patents

Info

Publication number
CN113341977A
Authority
CN
China
Prior art keywords
passive infrared
tag
positioning
infrared
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110641333.2A
Other languages
Chinese (zh)
Inventor
黄威
吴迪
赵文泉
张琪昌
郑挺
刘列
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FJ Dynamics Technology Co Ltd
Original Assignee
FJ Dynamics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FJ Dynamics Technology Co Ltd filed Critical FJ Dynamics Technology Co Ltd
Priority to CN202110641333.2A priority Critical patent/CN113341977A/en
Publication of CN113341977A publication Critical patent/CN113341977A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An embodiment of the invention discloses a mobile robot based on passive infrared tag positioning, comprising: a tag module comprising a plurality of passive infrared tags distributed at different positions; an acquisition module for identifying and acquiring image information of the passive infrared tags; a positioning module for calculating the position and posture of the acquisition module from the image information; and a motion module for controlling the moving direction and posture of the robot according to the position and posture of the acquisition module. The mobile robot identifies the passive infrared tags with a camera and calculates its own relative position in real time, which addresses the large differences in performance and cost among prior-art robot positioning technologies and achieves low positioning cost, accurate positioning, good flexibility and good adaptability to different scenes.

Description

Mobile robot based on passive infrared label location
Technical Field
The embodiments of the invention relate to positioning technology, and in particular to a mobile robot based on passive infrared tag positioning.
Background
In recent years, with the continuous development of science and technology and the need for industrial upgrading, more and more intelligent mobile robots have entered industrial fields (such as inspection and smart factories) and civil fields (such as service robots and disinfection robots), freeing people from dangerous, heavy and repetitive work, reducing labor costs for enterprises and improving production efficiency.
Existing robot navigation technologies are limited by their application scenarios and differ greatly in performance and cost. For example, lidar-based positioning is accurate but expensive, and positioning becomes difficult when there are many moving objects or the scene is large; purely vision-based positioning is low-cost, but the scheme is complex and constrained by illumination conditions; magnetic navigation is also relatively cheap, but installing the magnetic nails or magnetic strips easily damages the original floor and the construction is complicated; positioning based on ultra-wideband wireless carrier communication is expensive, requires many base stations especially over large areas, and is easily interfered with by other signals, which affects the operating accuracy of the mobile robot.
Disclosure of Invention
The invention provides a mobile robot based on passive infrared tag positioning, which aims to achieve low positioning cost, accurate positioning, good flexibility and good adaptability to different scenes.
The embodiment of the invention provides a mobile robot based on passive infrared label positioning, which comprises:
the tag module comprises a plurality of passive infrared tags distributed at different positions;
the acquisition module is used for identifying and acquiring the image information of the passive infrared tag;
the positioning module is used for calculating the position and the posture of the acquisition module according to the image information;
and the motion module is used for controlling the moving direction and the posture of the robot according to the position and the posture of the acquisition module.
Optionally, the plurality of passive infrared tags are distributed at a plurality of positions according to a preset rule, and each passive infrared tag includes an individual ID number.
Optionally, the passive infrared tag is connected to a reflective tag attachment carrier, on which the passive infrared tag is disposed.
Optionally, the passive infrared tag includes a coordinate point, a check point and a coding information point, the coordinate point is used to determine the coordinate system of the passive infrared tag, the check point is used to determine whether the identified coordinate point is correct, and the coding information point is used to identify the individual ID number of the passive infrared tag.
Optionally, the acquisition module includes an acquisition camera, and the acquisition camera is used for identifying and acquiring the image information of the passive infrared tag.
Optionally, the acquisition module further includes an infrared light supplementing device, an infrared narrowband optical filter and an infrared light-transmitting sheet. The infrared light supplementing device is connected to the acquisition camera and is used for supplying supplementary infrared light toward the passive infrared tag; the infrared narrowband filter is arranged in the acquisition camera and passes only infrared light within its designed band; and the infrared light-transmitting sheet is arranged on the outer surface of the infrared light supplementing device and is used for filtering out visible light.
Optionally, the infrared light supplementing device is consistent with the infrared narrowband filter and the infrared light transmitting sheet in design wavelength.
Optionally, the positioning module further obtains the coordinates of the acquisition module by identifying the ID number of the passive infrared tag and combining the coordinates of the passive infrared tag.
Optionally, the positioning module and the motion module are further configured to:
performing image distortion removal on the image information;
extracting reflective points from the undistorted image information;
clustering the reflective points;
identifying the reflective point clusters;
calculating the position and the posture of the acquisition module according to the reflective point clusters;
and determining the position of the robot according to the position and the posture of the acquisition module.
Optionally, the positioning module is disposed at a top position of the mobile robot.
An embodiment of the invention discloses a mobile robot based on passive infrared tag positioning, comprising: a tag module comprising a plurality of passive infrared tags distributed at different positions; an acquisition module for identifying and acquiring image information of the passive infrared tags; a positioning module for calculating the position and posture of the acquisition module from the image information; and a motion module for controlling the moving direction and posture of the robot according to the position and posture of the acquisition module. The mobile robot identifies the passive infrared tags with a camera and calculates its own relative position in real time, which addresses the large differences in performance and cost among prior-art robot positioning technologies and achieves low positioning cost, accurate positioning, good flexibility and good adaptability to different scenes.
Drawings
Fig. 1 is a schematic connection diagram of a mobile robot based on passive infrared tag positioning according to an embodiment of the present invention;
fig. 2 is an example of a passive infrared tag according to an embodiment of the present invention;
fig. 3 is a diagram of the process of identifying the position of the robot using a passive infrared tag according to a first embodiment of the present invention;
fig. 4 is a schematic structural diagram of an acquisition module according to a first embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating a position of a positioning module according to an embodiment of the present invention;
FIG. 6 is a flowchart of a method according to a second embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
Furthermore, the terms "first," "second," and the like may be used herein to describe various orientations, actions, steps, elements, or the like, but the orientations, actions, steps, or elements are not limited by these terms. These terms are only used to distinguish one direction, action, step or element from another direction, action, step or element. For example, a first module may be termed a second module, and, similarly, a second module may be termed a first module, without departing from the scope of the present application. The first module and the second module are both modules, but they are not the same module. The terms "first", "second", etc. are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Example one
Fig. 1 is a schematic connection diagram of a mobile robot based on passive infrared tag positioning according to an embodiment of the present invention. The mobile robot is suitable for situations in which the position of a camera needs to be determined. Specifically, the mobile robot based on passive infrared tag positioning according to this embodiment includes: a tag module 1, an acquisition module 2, a positioning module 3 and a motion module 4.
The tag module 1 comprises a plurality of passive infrared tags distributed at different positions.
In this embodiment, the plurality of passive infrared tags are distributed at a plurality of positions according to a preset rule, and each passive infrared tag carries an individual ID number. Referring to fig. 2, which shows an example of a passive infrared tag in this embodiment, the black solid dots mark positions where reflective points are attached, and the hollow circles mark positions where no reflective point is attached. The passive infrared tag comprises coordinate points, a check point and coded information points. The coordinate points (211, 212, 213) are used to determine the coordinate system of the passive infrared tag, in which the coordinates of the reflective points are defined. The check point (220) is used to confirm whether the identified coordinate points are correct. The coded information points (230, 231, 232) are used to resolve the ID number of the passive infrared tag through a specific coding scheme. In different instances of the passive infrared tag, the coded information points (230, 231, 232) and the hollow circles are distributed according to this coding scheme.
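The patent does not specify the coding scheme used by the coded information points. Purely as an illustrative sketch, assuming each of the three coded positions contributes one bit (reflective dot present = 1) read in a fixed order, the ID of a tag could be decoded as follows (Python; the layout, tolerance and bit order are all hypothetical):

```python
# Hypothetical decoder for a passive infrared tag of the kind shown in fig. 2.
# Assumptions (not from the patent): three fixed coded-information positions,
# each contributing one bit (dot present = 1), read in a fixed order.

def decode_tag_id(detected_points, coded_positions, tol=0.01):
    """detected_points: list of (x, y) reflective-dot centers in the tag frame.
    coded_positions: the known coded-information locations (230, 231, 232).
    Returns the integer ID encoded by which coded positions carry a dot."""
    def has_dot(pos):
        return any((px - pos[0]) ** 2 + (py - pos[1]) ** 2 <= tol ** 2
                   for px, py in detected_points)

    tag_id = 0
    for bit, pos in enumerate(coded_positions):
        if has_dot(pos):
            tag_id |= 1 << bit
    return tag_id

# Example: dots found near the 1st and 3rd coded positions -> ID 0b101 = 5.
coded = [(0.02, 0.00), (0.04, 0.00), (0.06, 0.00)]   # hypothetical layout
points = [(0.0201, 0.0002), (0.0598, -0.0001)]
print(decode_tag_id(points, coded))                  # prints 5
```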
Referring to fig. 3, fig. 3 shows the process of identifying the position of the robot using a passive infrared tag 401 in this embodiment. The passive infrared tag 401 is connected to a reflective tag attachment carrier 402 and is disposed on that carrier 402. The passive infrared tags 401 are distributed at different positions: a plurality of tags 401 with different codes are attached above the working area, the camera 403 collects images of the area above it, and the position and posture of the current camera 403 are then obtained through a recognition algorithm. In general, the passive infrared tag 401 is made of a reflective material; the infrared light supplementing device 22 in the acquisition module 2 is connected to a power supply and actively emits light, and the camera 403 acquires the position of the passive infrared tag 401. The surface of the passive infrared tag 401 is coated with a reflective material, and the infrared light is emitted by the infrared light supplementing device 22 in the acquisition module 2. The reflective tag attachment carrier 402 may be a wall, a bracket, a ceiling or the like, and is not limited in this embodiment, as long as it satisfies the conditions for attaching reflective tags. In an alternative embodiment, the passive infrared tag 401 may also be connected to the reflective tag attachment carrier in several ways, such as on a planar base, and the user may adapt the mounting as needed.
In this embodiment, the acquisition module 2 is based on an infrared camera. The working principle of the infrared camera 21 is that an infrared lamp emits infrared light onto an object, and the diffusely reflected infrared light is received by the camera to form a video image. The infrared camera identifies and collects image information of the passive infrared tags, where the image information contains the position information and number information of the passive infrared tags; illustratively, it contains the specific three-dimensional coordinates and the specific number of a passive infrared tag, so that the specific position and posture of the infrared camera can be accurately calculated. During operation, the acquisition module 2 collects the image information within the camera field of view and transmits it to the recognition module for further processing.
The acquisition module 2 comprises an acquisition camera, which is used for identifying and acquiring the image information of the passive infrared tag. The acquisition module 2 further comprises an infrared light supplementing device 22, an infrared narrowband optical filter 23 and an infrared light-transmitting sheet 24. The infrared light supplementing device 22 is connected to the acquisition camera and supplies supplementary infrared light toward the passive infrared tag; the infrared narrowband filter 23 is arranged in the acquisition camera and passes only infrared light within its designed band; and the infrared light-transmitting sheet 24 is arranged on the outer surface of the infrared light supplementing device and filters out visible light.
In the present embodiment, referring to fig. 4, fig. 4 is a schematic structural diagram of the acquisition module 2, in which the acquisition camera is an infrared camera 21. The infrared light supplementing devices 22 are arranged at the two ends of the infrared camera 21 and supply supplementary infrared light to the passive infrared tags; each infrared light supplementing device 22 may include an infrared fill light, an infrared light-transmitting sheet and the like. The infrared narrowband filter 23 is disposed in the infrared camera 21. A narrowband filter is a subdivision of the band-pass filter and is defined in the same way: it passes the optical signal within a specific waveband and blocks the signal on both sides of that band. The passband of the infrared narrowband filter 23 is relatively narrow, generally less than 5% of the central wavelength, and mainly serves to limit the range of infrared light reaching the sensor. The infrared light-transmitting sheet 24 is disposed on the outer surface of the infrared light supplementing device and filters out visible light. In an alternative embodiment, other infrared light supplementing devices may also be included, which is not specifically limited in this embodiment. The infrared wavelength used by the infrared light supplementing device 22 should be the same as the design wavelength of the infrared narrowband filter 23, so that the acquisition camera 21 only captures light at that infrared wavelength. The infrared wavelength used by the infrared light supplementing device 22 is also consistent with the design wavelength of the infrared light-transmitting sheet 24, so that only that infrared wavelength is transmitted.
The positioning module 3 is used for calculating the position and the posture of the acquisition module 2 according to the image information.
In this embodiment, after receiving the data acquired by the acquisition module 2, the positioning module 3 identifies the ID of the passive infrared tag currently in view and, by combining the coordinates of the reflective points of the passive infrared tag with the corresponding pixel coordinates in the acquired data, obtains the coordinates of the current camera in the coordinate system of the passive infrared tag through a dedicated positioning calculation algorithm, thereby achieving positioning. The positioning module 3 may be an MCU or a CPU processor. An MCU (Micro Control Unit), also called a single-chip microcomputer, appropriately reduces the frequency and specification of the central processing unit (CPU) and integrates peripherals such as memory, a counter (Timer), USB, an A/D converter, a UART, PLC and DMA interfaces, and even an LCD driver circuit, on a single chip to form a chip-level computer, enabling different combinations of control for different applications. A recognition algorithm integrated in the MCU computes the position and posture of the acquisition module 2 from the image information. The positioning module 3 further obtains the coordinates of the acquisition module 2 by identifying the ID number of the passive infrared tag and combining it with the coordinates of that passive infrared tag.
Referring to fig. 5, fig. 5 is a schematic diagram of the position of the positioning module in this embodiment; the positioning module 601 is disposed at the top of the mobile robot 602. Placing the positioning module 601 on top of the mobile robot 602 makes it easy to identify the passive infrared tags mounted overhead and to accurately obtain the position of the mobile robot 602.
And the motion module 4 is used for controlling the moving direction and the posture of the robot according to the position and the posture of the acquisition module.
In this embodiment, the motion module 4 is a robot processing module which, like the positioning module 3, may be an MCU or a CPU processor. The motion module 4 generates control instructions for the moving direction and posture of the robot and calculates the relative position of the mobile robot in real time, so as to precisely control the movement of the robot.
The embodiment of the invention discloses a mobile robot based on passive infrared tag positioning, comprising: a tag module comprising a plurality of passive infrared tags distributed at different positions; an acquisition module for identifying and acquiring image information of one or more of the passive infrared tags; a positioning module for calculating the position and posture of the acquisition module from the image information; and a motion module for controlling the moving direction and posture of the robot according to the position and posture of the acquisition module. The mobile robot identifies the passive infrared tags with a camera and calculates its own relative position in real time, which addresses the large differences in performance and cost among prior-art robot positioning technologies and achieves low positioning cost, accurate positioning, good flexibility and good adaptability to different scenes.
Example two
The second embodiment of the present invention describes the mobile robot based on passive infrared tag positioning in further detail on the basis of the first embodiment. The mobile robot is suitable for situations in which the position of a camera needs to be determined. Specifically, the mobile robot based on passive infrared tag positioning according to the second embodiment of the present invention includes: a tag module 1, an acquisition module 2, a positioning module 3 and a motion module 4.
The tag module 1 comprises a plurality of passive infrared tags distributed at different positions.
In this embodiment, the plurality of passive infrared tags are distributed at a plurality of positions according to a preset rule, and each passive infrared tag carries an individual ID number. In the example of fig. 2, the black solid dots mark positions where reflective points are attached, and the hollow circles mark positions where no reflective point is attached. The passive infrared tag comprises coordinate points, a check point and coded information points. The coordinate points (211, 212, 213) are used to determine the coordinate system of the passive infrared tag, in which the coordinates of the reflective points are defined. The check point (220) is used to confirm whether the identified coordinate points are correct. The coded information points (230, 231, 232) are used to resolve the ID number of the passive infrared tag through a specific coding scheme. In different instances of the passive infrared tag, the coded information points (230, 231, 232) and the hollow circles are distributed according to this coding scheme.
The passive infrared tag 401 is connected to a reflective tag attachment carrier 402 and is disposed on that carrier 402. The passive infrared tags 401 are distributed at different positions: a plurality of tags 401 with different codes are attached above the working area, the camera 403 collects images of the area above it, and the position and posture of the current camera 403 are then obtained through a recognition algorithm. In general, the passive infrared tag 401 is made of a reflective material and does not emit light itself; the infrared light supplementing device 22 in the acquisition module 2 is connected to a power supply and actively emits light, and the camera 403 acquires the position of the passive infrared tag 401. The surface of the passive infrared tag 401 is coated with a reflective material, and the infrared light is emitted by the infrared light supplementing device 22 in the acquisition module 2. The reflective tag attachment carrier 402 may be a wall, a bracket, a ceiling or the like, and is not limited in this embodiment, as long as it satisfies the conditions for attaching reflective tags. In an alternative embodiment, the passive infrared tag 401 may also be connected to the reflective tag attachment carrier in various ways, such as a hook-surface base or a planar base. The hook-surface base has a surface with a certain curvature and is connected to the object through a hook, which reduces the stressed area of the base and gives a more attractive appearance; the planar base has a larger stressed area than the hook-surface base and is suitable for cases where the pulling force on the object is higher. The user may adjust the mounting according to the actual situation and requirements.
In this embodiment, the acquisition module 2 is based on an infrared camera. The working principle of the infrared camera 21 is that an infrared lamp emits infrared light onto an object, and the diffusely reflected infrared light is received by the camera to form a video image. The infrared camera identifies and collects image information of the passive infrared tags, where the image information contains the position information and number information of the passive infrared tags; illustratively, it contains the specific three-dimensional coordinates and the specific number of a passive infrared tag, so that the specific position and posture of the infrared camera can be accurately calculated. During operation, the acquisition module 2 collects the image information within the camera field of view and transmits it to the recognition module for further processing.
The acquisition module 2 comprises an acquisition camera, which is used for identifying and acquiring the image information of the passive infrared tag. The acquisition module 2 further comprises an infrared light supplementing device 22, an infrared narrowband optical filter 23 and an infrared light-transmitting sheet 24. The infrared light supplementing device 22 is connected to the acquisition camera and supplies supplementary infrared light toward the passive infrared tag; the infrared narrowband filter 23 is arranged in the acquisition camera and passes only infrared light within its designed band; and the infrared light-transmitting sheet 24 is arranged on the outer surface of the infrared light supplementing device and filters out visible light.
In the present embodiment, the acquisition camera is an infrared camera 21. The infrared light supplementing devices 22 are arranged at the two ends of the infrared camera 21 and supply supplementary infrared light to the passive infrared tags; each infrared light supplementing device 22 may include an infrared fill light, an infrared light-transmitting sheet and the like. The infrared narrowband filter 23 is disposed in the infrared camera 21. A narrowband filter is a subdivision of the band-pass filter and is defined in the same way: it passes the optical signal within a specific waveband and blocks the signal on both sides of that band. The passband of the infrared narrowband filter 23 is relatively narrow, generally less than 5% of the central wavelength, and mainly serves to limit the range of infrared light reaching the sensor. The infrared light-transmitting sheet 24 is disposed on the outer surface of the infrared light supplementing device and filters out visible light while retaining the infrared light to be identified. In an alternative embodiment, other infrared light supplementing devices may also be included, which is not specifically limited in this embodiment. The infrared wavelength used by the infrared light supplementing device 22 should be the same as the design wavelength of the infrared narrowband filter 23, so that the acquisition camera 21 only captures light at that infrared wavelength. The infrared wavelength used by the infrared light supplementing device 22 is also consistent with the design wavelength of the infrared light-transmitting sheet 24, so that only that infrared wavelength is transmitted.
The positioning module 3 is used for calculating the position and the posture of the acquisition module 2 according to the image information.
In this embodiment, after receiving the data acquired by the acquisition module 2, the positioning module 3 identifies the ID of the passive infrared tag currently in view and, by combining the coordinates of the reflective points of the passive infrared tag with the corresponding pixel coordinates in the acquired data, obtains the coordinates of the current camera in the coordinate system of the passive infrared tag through a dedicated positioning calculation algorithm, thereby achieving positioning. The positioning module 3 may be an MCU or a CPU processor. An MCU (Micro Control Unit), also called a single-chip microcomputer, appropriately reduces the frequency and specification of the central processing unit (CPU) and integrates peripherals such as memory, a counter (Timer), USB, an A/D converter, a UART, PLC and DMA interfaces, and even an LCD driver circuit, on a single chip to form a chip-level computer, enabling different combinations of control for different applications. A recognition algorithm integrated in the MCU computes the position and posture of the acquisition module 2 from the image information. The positioning module 3 further obtains the coordinates of the acquisition module 2 by identifying the ID number of the passive infrared tag and combining it with the coordinates of that passive infrared tag.
The positioning module 601 is disposed at the top of the mobile robot 602. Placing the positioning module 601 on top of the mobile robot 602 makes it easy to identify the passive infrared tags mounted overhead and to accurately obtain the position of the mobile robot 602.
And the motion module 4 is used for controlling the moving direction and the posture of the robot according to the position and the posture of the acquisition module.
In this embodiment, the motion module 4 is a robot processing module which, like the positioning module 3, may be an MCU or a CPU processor. The motion module 4 generates control instructions for the moving direction and posture of the robot and calculates the relative position of the mobile robot in real time, so as to precisely control the movement of the robot.
In the present embodiment, referring to fig. 6, the positioning module 3 and the motion module 4 further carry out the following method steps:
and step 600, performing image distortion removal on the image information.
In this embodiment, image distortion is introduced by limited lens manufacturing accuracy and deviations in the assembly process, which distort the original image. Lens distortion is divided into radial distortion and tangential distortion. Because distortion in a camera module is unavoidable, image undistortion is necessary to guarantee imaging quality: for each pixel position in the undistorted image, the corresponding position in the distorted image is found and its pixel value is computed by bilinear interpolation, yielding the undistorted image. This guarantees image quality and improves recognition accuracy.
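As a minimal sketch of this undistortion step (assuming an OpenCV-based implementation, which the patent does not mandate; the intrinsic matrix K and distortion coefficients below are hypothetical calibration results), the per-pixel lookup with bilinear interpolation described above can be written as:

```python
# Undistortion sketch: map each output pixel back into the distorted image and
# resample with bilinear interpolation. K and dist are assumed calibration values.
import cv2
import numpy as np

K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])            # hypothetical 3x3 intrinsic matrix
dist = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])    # hypothetical radial/tangential coefficients

img = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)
h, w = img.shape

# Precompute the undistorted->distorted pixel mapping, then remap bilinearly.
map1, map2 = cv2.initUndistortRectifyMap(K, dist, None, K, (w, h), cv2.CV_32FC1)
undistorted = cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)
```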
Step 610, extracting reflective points from the undistorted image information.
In this embodiment, extracting the reflective points from the undistorted image information includes performing gray-level threshold segmentation, image binarization, connected-component search and connected-component center calculation on the undistorted image. Specifically, the image is processed to extract the reflective points it contains, and the extraction methods include, but are not limited to, gray-level threshold segmentation, image binarization, connected-component search and calculation of the center points of the connected components.
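A minimal sketch of this extraction step, following exactly the operations named above (threshold segmentation, binarization, connected-component search, centroid calculation); the threshold and minimum-area values are illustrative assumptions:

```python
# Reflective-point extraction sketch: threshold -> binarize -> connected components -> centroids.
import cv2
import numpy as np

def extract_reflective_points(undistorted, threshold=200, min_area=5):
    """undistorted: single-channel IR image. Returns an Nx2 array of (u, v) dot centers."""
    # Gray-level threshold segmentation and binarization.
    _, binary = cv2.threshold(undistorted, threshold, 255, cv2.THRESH_BINARY)
    # Connected-component search with per-component statistics and centroids.
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(binary, connectivity=8)
    points = []
    for i in range(1, num):                         # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:  # reject single-pixel noise
            points.append(centroids[i])             # (u, v) center of the reflective dot
    return np.array(points, dtype=np.float32)
```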
Step 620, clustering the reflective points.
In this embodiment, clustering the reflective points includes segmenting the extracted reflective points with a clustering algorithm, where the clustering algorithm includes a density clustering method and a nearest-neighbor clustering method. Specifically, the reflective points obtained by the extraction step are grouped into clusters by a clustering algorithm, which includes, but is not limited to, density clustering, nearest-neighbor clustering and the like.
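As one concrete instance of the density clustering mentioned above (the patent does not fix the algorithm or its parameters; DBSCAN and the eps/min_samples values are assumptions), the extracted dot centers can be grouped per tag as follows:

```python
# Density-clustering sketch: group reflective points into per-tag clusters with DBSCAN.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_reflective_points(points, eps=30.0, min_samples=3):
    """points: Nx2 array of (u, v) reflective-dot centers in pixels.
    Returns {cluster_label: member points}; DBSCAN noise (label -1) is discarded."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    clusters = {}
    for label in set(labels):
        if label == -1:
            continue                      # isolated reflections are treated as noise
        clusters[label] = points[labels == label]
    return clusters
```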
Step 630, identifying the reflective point clusters.
In this embodiment, identifying the reflective point clusters includes decoding each cluster produced by the clustering step according to the corresponding coding rule to obtain the ID number of the infrared reflective tag. Specifically, the cluster identification part interprets the clusters produced by the clustering step according to the coding rule of the infrared reflective tags, and the identification result is the ID number of the corresponding tag.
Step 640, calculating the position and the posture of the acquisition module according to the reflective point clusters.
Calculating the position and the posture of the acquisition module 2 according to the reflective point clusters includes: evaluating the reflective point clusters against the following calculation formula to obtain the position and the posture of the acquisition module 2. The calculation formula is:
$$ s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} $$
where u and v are the pixel coordinates of a reflective point in the image; K is the intrinsic (internal reference) matrix of the acquisition camera, with dimensions 3 × 3; R and t are respectively the posture (rotation) and position (translation) of the acquisition module 2 relative to the passive infrared tag, where R is a 3 × 3 matrix and t is a 3 × 1 vector; $X_w$, $Y_w$ and $Z_w$ are the position coordinates of the reflective point of the passive infrared tag in the world coordinate system; and s is the projective scale factor. In this embodiment, the identification and positioning method based on passive infrared tags has low application cost, accurate positioning and good flexibility; paths are easy to change or extend and the system is easy to maintain, so that it strikes a good balance between cost and performance when put into use on the market and offers high cost-effectiveness.
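A minimal sketch of this pose computation, assuming a standard PnP solver is used to recover R and t from the matched tag points; the patent's own positioning calculation algorithm is not specified, so this only illustrates one way to solve the projection formula above:

```python
# Pose-from-tag sketch: solve the projection formula above for R and t with PnP (assumed approach).
import cv2
import numpy as np

def camera_pose_from_tag(object_points, image_points, K, dist=None):
    """object_points: Nx3 reflective-point coordinates (X_w, Y_w, Z_w) in the tag/world frame.
    image_points: Nx2 matching pixel coordinates (u, v); pass dist=None if already undistorted.
    Returns (R_cam, t_cam): rotation and translation of the acquisition camera in the tag frame."""
    ok, rvec, tvec = cv2.solvePnP(object_points.astype(np.float64),
                                  image_points.astype(np.float64),
                                  K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)        # rotation vector -> 3x3 rotation matrix (tag -> camera)
    # Invert the transform to express the camera pose in the tag coordinate system.
    R_cam = R.T
    t_cam = -R.T @ tvec
    return R_cam, t_cam
```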
Step 650, determining the position of the robot according to the position and the posture of the acquisition module.
In the embodiment, the motion module 4 generates a control instruction for controlling the moving direction and posture of the robot, and calculates the relative position of the mobile robot in real time, thereby accurately controlling the movement of the robot.
The embodiment of the invention discloses a mobile robot based on passive infrared tag positioning, comprising: a tag module comprising a plurality of passive infrared tags distributed at different positions; an acquisition module for identifying and acquiring image information of one or more of the passive infrared tags; a positioning module for calculating the position and posture of the acquisition module from the image information; and a motion module for controlling the moving direction and posture of the robot according to the position and posture of the acquisition module. The mobile robot identifies the passive infrared tags with a camera and calculates its own relative position in real time, which addresses the large differences in performance and cost among prior-art robot positioning technologies and achieves low positioning cost, accurate positioning, good flexibility and good adaptability to different scenes.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A mobile robot based on passive infrared tag positioning, comprising:
the tag module comprises a plurality of passive infrared tags distributed at different positions;
the acquisition module is used for identifying and acquiring the image information of the passive infrared tag;
the positioning module is used for calculating the position and the posture of the acquisition module according to the image information;
and the motion module is used for controlling the moving direction and the posture of the robot according to the position and the posture of the acquisition module.
2. The mobile robot based on passive infrared tag positioning as claimed in claim 1, wherein the plurality of passive infrared tags are distributed at a plurality of places according to a preset rule, and each passive infrared tag includes a separate ID number.
3. The mobile robot based on passive infrared tag positioning as claimed in claim 2, wherein the passive infrared tag is connected with a reflective tag sticker carrier on which the passive infrared tag is disposed.
4. A mobile robot based on passive infrared tag positioning as claimed in claim 3, characterized in that the passive infrared tag comprises coordinate points for determining the coordinate system of the passive infrared tag, check points for confirming whether the identified coordinate points are correct or not, and encoded information points for identifying the individual ID number of the passive infrared tag.
5. The mobile robot based on passive infrared tag positioning as claimed in claim 1, wherein the collection module comprises a collection camera for identifying and collecting image information of the passive infrared tag.
6. The mobile robot based on the passive infrared tag positioning as claimed in claim 5, wherein the collection module further comprises an infrared light supplement device, an infrared narrowband optical filter and an infrared light transmission sheet, and the infrared light supplement device is connected with the collection camera and is used for performing infrared light supplement on the passive infrared tag; the infrared narrowband filter is arranged in the acquisition camera and used for filtering infrared light, and the infrared light transmission piece is arranged on the outer surface of the infrared light supplementing device and used for filtering visible light.
7. The mobile robot based on passive infrared tag positioning as claimed in claim 6, wherein the infrared light supplement device is consistent with the design wavelengths of the infrared narrowband filter and the infrared light transmission sheet.
8. The mobile robot based on passive infrared tag positioning as claimed in claim 2, wherein the positioning module further obtains the coordinates of the acquisition module by recognizing an ID number of the passive infrared tag and combining the coordinates of the passive infrared tag.
9. A mobile robot based on passive infrared tag positioning as claimed in claim 1, wherein the positioning module and the motion module are further configured to:
performing image distortion removal on the image information;
extracting reflective points from the undistorted image information;
clustering the reflective points;
identifying the reflective point clusters;
calculating the position and the posture of the acquisition module according to the reflective point clusters;
and determining the position of the robot according to the position and the posture of the acquisition module.
10. The mobile robot based on passive infrared tag positioning as claimed in claim 1, wherein the positioning module is disposed at a top position of the mobile robot.
CN202110641333.2A 2021-06-09 2021-06-09 Mobile robot based on passive infrared label location Pending CN113341977A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110641333.2A CN113341977A (en) 2021-06-09 2021-06-09 Mobile robot based on passive infrared label location

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110641333.2A CN113341977A (en) 2021-06-09 2021-06-09 Mobile robot based on passive infrared label location

Publications (1)

Publication Number Publication Date
CN113341977A true CN113341977A (en) 2021-09-03

Family

ID=77476178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110641333.2A Pending CN113341977A (en) 2021-06-09 2021-06-09 Mobile robot based on passive infrared label location

Country Status (1)

Country Link
CN (1) CN113341977A (en)

Citations (12)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1770821A (en) * 2004-11-02 2006-05-10 林永全 Infrared lighting system
CN102773862A (en) * 2012-07-31 2012-11-14 山东大学 Quick and accurate locating system used for indoor mobile robot and working method thereof
CN103529622A (en) * 2013-11-01 2014-01-22 四川中盾金卫光电科技有限公司 Infrared camera
CN106297551A (en) * 2016-09-14 2017-01-04 哈工大机器人集团上海有限公司 A kind of road sign for determining robot location and coding checkout method thereof
US20190163197A1 (en) * 2016-09-14 2019-05-30 Hit Robot Group Shanghai Co.,Ltd. Road sign for determining position of robot, device, and method for distinguishing labels with different functions in road sign
CN108022265A (en) * 2016-11-01 2018-05-11 狒特科技(北京)有限公司 Infrared camera pose determines method, equipment and system
CN207978044U (en) * 2018-04-02 2018-10-16 北京小米移动软件有限公司 Camera
CN109767602A (en) * 2019-03-14 2019-05-17 钧捷智能(深圳)有限公司 A kind of round-the-clock driver fatigue monitor system camera
CN211878163U (en) * 2019-07-30 2020-11-06 深圳市普渡科技有限公司 Positioning identifier and system
CN111818245A (en) * 2020-07-03 2020-10-23 江苏集萃智能光电系统研究所有限公司 Visual sensor optical optimization device and correction method for outdoor complex environment
CN213292181U (en) * 2020-09-16 2021-05-28 东风汽车集团有限公司 Multifunctional camera module in vehicle
CN112013850A (en) * 2020-10-16 2020-12-01 北京猎户星空科技有限公司 Positioning method, positioning device, self-moving equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王爱民 et al., vol. 1, China Railway Publishing House Co., Ltd., pages 151-152 *

Similar Documents

Publication Publication Date Title
CN106092090B (en) Infrared road sign for positioning indoor mobile robot and use method thereof
CN111989544B (en) System and method for indoor vehicle navigation based on optical target
WO2022257807A1 (en) Positioning apparatus based on passive infrared tag
Krajník et al. A practical multirobot localization system
US10757307B2 (en) Lighting devices configurable for generating a visual signature
US20140198206A1 (en) System and Method for Estimating the Position and Orientation of an Object using Optical Beacons
CN108288289B (en) LED visual detection method and system for visible light positioning
CN1934459A (en) Wireless location and identification system and method
CN112907625B (en) Target following method and system applied to quadruped bionic robot
CN113607158A (en) Visual identification matching positioning method and system for flat light source based on visible light communication
CN111090074A (en) Indoor visible light positioning method and equipment based on machine learning
CN110705540A (en) Animal remedy production pointer instrument image identification method and device based on RFID and deep learning
CN113341977A (en) Mobile robot based on passive infrared label location
CN212206103U (en) Indoor positioning device
Llorca et al. Assistive pedestrian crossings by means of stereo localization and rfid anonymous disability identification
CN111044055A (en) Indoor positioning method and indoor positioning device
US20140132500A1 (en) Method and apparatus for recognizing location of moving object in real time
CN117058766B (en) Motion capture system and method based on active light stroboscopic effect
Kartashov et al. Fast artificial landmark detection for indoor mobile robots
CN110703195A (en) Indoor visible light passive positioning method based on spatial filter
EP3963418B1 (en) Industrial vehicle with feature-based localization and navigation
CN218727961U (en) Label layout array structure based on RFID positioning navigation
KR102154043B1 (en) Vision module for position recognition of moving object and position recognition method for moving object
Holtzblatt Can you intentionally design a product that is Cool?(keynote)
CN111966134A (en) Tracking control equipment and method based on high-precision positioning technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210903)