CN111947659A - Acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for mobile robot

Info

Publication number: CN111947659A
Application number: CN202010644082.9A
Authority: CN (China)
Prior art keywords: mobile robot; ultrasonic; ultra-wideband; signals
Other languages: Chinese (zh)
Other versions: CN111947659B (en)
Inventors: 杨萃 (Yang Cui); 韦岗 (Wei Gang)
Current assignee: South China University of Technology (SCUT)
Original assignee: South China University of Technology (SCUT)
Application filed by South China University of Technology (SCUT); priority to CN202010644082.9A
Application granted; publication of CN111947659B
Legal status: Granted; currently Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20: Instruments for performing navigational calculations
    • G01C 21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/18: Position-fixing by co-ordinating two or more direction or position line determinations or distance determinations, using ultrasonic, sonic or infrasonic waves
    • Y02D 30/70: Reducing energy consumption in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for mobile robots. A plurality of acousto-optic-electric units are arranged around an intelligent mobile robot to form a distributed information-acquisition network. The ultrasonic, image and ultra-wideband signals received by these units are processed cooperatively to gather more information about the environment and the system working target; the sound, light and electricity information is then deeply fused and processed cooperatively to improve the accuracy of positioning, recognition and path planning. Finally, all data are transmitted to the cloud for fusion processing, which lowers the performance requirements on the mobile robot's own processing module, relaxes the original system's constraints on power consumption, size and weight, and increases the available computing capacity. The invention can be applied outdoors where larger positioning distances and ranges are required, and is expected to serve applications demanding millimeter-level positioning accuracy, such as intelligent nursing robots.

Description

Acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for mobile robot
Technical Field
The invention relates to the technical field of positioning and navigation, and in particular to an acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for a mobile robot.
Background
With the rapid development of artificial intelligence, wireless communication and Internet-of-Things technologies, indoor and outdoor intelligent mobile robots are more and more widely applied: outdoor intelligent mobile robots such as unmanned vehicles, unmanned aerial vehicles and unmanned ships; indoor intelligent mobile robots such as sweeping robots and nursing robots; and robots that move both indoors and outdoors, such as pet robots that accompany and care for the elderly. These robots will greatly change people's lifestyles, and intelligent, reliable positioning and navigation is one of the key technologies for their further development.
The existing indoor and outdoor intelligent navigation methods have the following main problems:
First, existing robot systems rely on information collected by the robot itself, with no external devices providing additional information about the current environment, so the sensing range is very limited. For example, an indoor sweeping robot obtains environmental information only from its own camera or other ranging modules; likewise, in the hotel service robot disclosed in patent CN108789427, the environment sensing module is mounted on the robot body and no other module supplies additional information for reconstructing the surroundings. Outdoor navigation systems combine on-vehicle equipment such as cameras with satellite navigation systems such as GPS, but the positioning accuracy is hard to push below the meter level, and positioning fails altogether on stretches with poor satellite signals such as tunnels and underpasses. The high-precision area positioning and navigation system disclosed in patent CN108089204 navigates by means of distributed base stations; it is a radio navigation approach that must cover a large application area, and base station construction is costly.
Secondly, positioning and navigation are realized by a single means, or by independent technical means, so the accuracy is not high. Most positioning systems are navigation systems based on image processing: the mobile intelligent robot carries a camera to acquire image information, then segments and recognizes the images to judge the current environment and realize automatic navigation. When lighting is poor, images are blurred, or the image content is ambiguous, performance degrades sharply. For example, a real ball and its reflection in a mirror cannot easily be distinguished and recognized from images alone.
Indoor navigation equipment also uses laser, infrared or ultrasonic signals for positioning, but the processing module carried on the robot has limited capability: image information and ranging information are processed independently and cannot be fused, so the positioning accuracy remains low. In addition, most existing intelligent mobile robots process data on their own processing modules; given the system's constraints on power consumption, volume and weight, processing capability is hard to increase, so more data cannot be fused to further improve positioning and navigation performance.
In various positioning techniques, the position of a target is usually obtained by a time-difference method, so in essence the ranging accuracy depends on the accuracy of time measurement. The speed of ultrasonic waves is roughly a millionth of that of electromagnetic waves, so for the same distance, ultrasonic ranging places far looser requirements on timing accuracy; it therefore has great advantages in short-range high-precision measurement and can reach millimeter level. Ultra-wideband positioning technology offers centimeter-level positioning accuracy and fast positioning, and is widely applied in smart factories, inspection, security monitoring, smart cities, warehousing and logistics, smart healthcare and other intelligent applications; although its positioning accuracy is lower than that of ultrasound, it positions quickly and covers a larger range. In summary, if several kinds of positioning and ranging information can be fused, positioning accuracy can be improved.
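To make the timing argument concrete, consider a small editorial illustration (not part of the patent text): the ranging error equals the propagation speed times the timing error, so the same clock uncertainty costs roughly a million times more range error for electromagnetic waves than for ultrasound.

```python
# Illustrative only: ranging error = propagation speed x timing error.
C_SOUND = 343.0   # speed of sound in air, m/s (approximate, 20 degrees C)
C_LIGHT = 3.0e8   # speed of electromagnetic waves, m/s (approximate)

def range_error_m(speed_mps: float, timing_error_s: float) -> float:
    """Distance error caused by a given time-measurement error."""
    return speed_mps * timing_error_s

dt = 1e-6  # a 1-microsecond timing error
print(f"ultrasonic: {range_error_m(C_SOUND, dt) * 1e3:.3f} mm")  # ~0.343 mm
print(f"UWB / EM:   {range_error_m(C_LIGHT, dt):.0f} m")         # ~300 m
```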
Disclosure of Invention
The invention aims to solve the problems that the information acquired by existing mobile robot navigation systems is processed independently with a low degree of fusion, that no auxiliary equipment cooperates to acquire a larger amount of information, and that processing capability is insufficient, and provides an acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for mobile robots.
The purpose of the invention can be achieved by adopting the following technical scheme:
a mobile robot-oriented acoustic-optical-electric multi-mode distribution cooperative positioning and navigation system is provided, wherein the working target of the mobile robot is the working task which needs to be completed by the mobile robot, and can be a point target, a line target or a surface target. For example, if the mobile robot is a transfer robot, the work target thereof is a point target, i.e., an object to be transferred; if the mobile robot is an automobile which automatically runs, the working target of the mobile robot is a line target, namely a road surface which correctly runs; if the mobile robot is a sweeping robot, the working target is a surface target, namely an area needing to be swept. The acoustic-optical-electric multi-mode distribution collaborative positioning and navigation system for the mobile robot positions and navigates the mobile robot aiming at the working target point, line or plane of the mobile robot.
The mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system comprises the mobile robot, a distributed acousto-optic-electric array and a cloud processing center. The distributed acousto-optic-electric array is arranged around the mobile robot; images, polarized images, ultrasonic data and ultra-wideband signals are obtained by the mobile robot and the surrounding array and fused in the cloud processing center, which uses an ultrasonic ranging algorithm, array signal processing theory, a deep learning algorithm and a computer vision algorithm to position and recognize the mobile robot and the system working target and to plan paths for the mobile robot.
The mobile robot comprises a mechanical device, an intelligent processing unit, a control unit, a communication unit, an LED light device, an omnidirectional ultrasonic transmitting device, a camera array, an ultra-wideband transmitting module, a functional module and a power supply. The distributed acousto-optic-electric array is composed of a plurality of acousto-optic-electric units arranged within the moving range of the mobile robot; the units are laid out in the shape required by the specific application scene to form an array.
The mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system has multiple working modes, including an independent computing mode and a cloud processing high-precision positioning mode. In the independent computing mode, the mobile robot realizes positioning by computing over the ultrasonic, image and ultra-wideband signals itself. In the cloud processing high-precision positioning mode, the ultrasonic, image and ultra-wideband signals acquired by the distributed acousto-optic-electric array, together with the images and polarized images acquired by the mobile robot, are all sent to the cloud processing center for processing to obtain high-precision positioning and navigation information.
The mechanical device in the mobile robot can travel and steer under the control of the control unit. The intelligent processing unit generates access information, digital wideband ultrasonic signals and LED lamp signals, and in the robot's independent computing mode runs the signal processing and fusion algorithms to obtain the robot's positioning and navigation information. The control unit is connected with the intelligent processing unit and, according to its instructions, controls the LED light device, the omnidirectional ultrasonic transmitting device, the ultra-wideband transmitting module and the mechanical device respectively. The communication unit handles the mobile robot's communication with the cloud processing center and the distributed acousto-optic-electric array: in the independent computing mode it communicates with the distributed acousto-optic-electric array using existing wireless communication technology and receives the ultrasound, images, polarized images and ultra-wideband signals collected by the array; in the cloud processing high-precision positioning mode it communicates with the cloud processing center and receives instructions and positioning and navigation information from it. The LED light device consists of an LED lamp array and emits LED light around the mobile robot; the color and lighting sequence of the light are controlled by the control unit. The omnidirectional ultrasonic transmitting device on the mobile robot comprises an ultrasonic transducer array of several ultrasonic transducers and a digital-to-analog conversion module, and can transmit ultrasonic signals in all directions around the robot: the digital-to-analog conversion module converts digital wideband ultrasonic signals into analog signals, and the transducers in the array convert the electrical signals into acoustic signals. The camera array is an array of several cameras mounted on the mobile robot that acquires optical images in all directions; it includes polarization cameras, forming a polarization camera array that acquires polarized images. The ultra-wideband transmitting module transmits ultra-wideband electrical signals. The power supply powers each module of the mobile robot.
Each acousto-optic-electric unit in the distributed acousto-optic-electric array comprises an omnidirectional ultrasonic receiving device, a camera array, an ultra-wideband receiving module and a communication module. The omnidirectional ultrasonic receiving device can receive ultrasonic signals from all directions and comprises an ultrasonic transducer array and an analog-to-digital conversion module: the transducer array converts the received ultrasonic signals into electrical signals, and the analog-to-digital conversion module converts these into digital signals and sends them to the communication module. The camera array comprises several cameras that acquire images in all directions, including polarization cameras that acquire polarized images. The ultra-wideband receiving module is responsible for receiving ultra-wideband signals. The communication module is responsible for communicating with the cloud processing center and the mobile robot: in the mobile robot's independent computing mode it sends the received ultrasonic signals, images, polarized images and ultra-wideband signals to the mobile robot, and in the cloud processing high-precision positioning mode it sends them to the cloud processing center.
The cloud processing center is connected with the communication unit of the mobile robot and with the communication module in each acousto-optic-electric unit. It receives the images and polarized images sent by the mobile robot, receives the ultrasound, images, polarized images and ultra-wideband signals sent by the acousto-optic-electric units, fuses and processes the received information, intelligently calculates the position information of the mobile robot, plans a reasonable path, and sends the path planning information to the mobile robot. When several mobile robots are present in the mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system, the cloud processing center assigns different LED codes and different wideband ultrasonic signal codes to different mobile robots.
When several mobile robots are present in the system, different robots carry different LED codes and different wideband ultrasonic signal codes that identify them. Specifically, the light emitted by different robots' LED light devices is coded differently, which makes the robots easy to tell apart in the captured images and polarized images. In other words, the LED light device acts as each robot's label: the cloud processing center assigns an LED code to every mobile robot, the robot's intelligent processing unit generates the corresponding control signals from the LED code, and the control unit makes the LED light device emit light of different colors in different lighting sequences.
Similarly, the ultrasonic signals emitted by different robots' omnidirectional ultrasonic transmitting devices carry different wideband ultrasonic signal codes, each code specifying the parameters of the ultrasonic signal used, which makes the robots distinguishable in the ranging algorithm. In other words, different mobile robots transmit wideband ultrasonic signals with different parameters: the cloud processing center assigns a wideband ultrasonic signal code for each ultrasonic transducer of every mobile robot and sends it to the robot; the robot's intelligent processing unit sets the bandwidth, duration and waveform parameters of the wideband ultrasonic signal according to the code, generates the corresponding digital wideband ultrasonic signal, and sends it through the control unit to the omnidirectional ultrasonic transmitting device.
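As a minimal sketch of how such identity codes might be represented (all field names and values below are hypothetical; the patent prescribes only that each robot gets a distinct LED code and a distinct wideband ultrasonic signal code):

```python
from dataclasses import dataclass

@dataclass
class RobotCodes:
    """Identity codes the cloud processing center might assign to one robot."""
    led_code: int            # selects LED colors and lighting sequence
    led_sequence: list[str]  # colors lit, in order
    f0_hz: float             # chirp start frequency of the wideband ultrasonic signal
    f1_hz: float             # chirp stop frequency
    duration_s: float        # chirp duration T

# Hypothetical codebook, keyed by robot id, mirroring the embodiments below:
codebook = {
    1: RobotCodes(1, ["red", "yellow", "blue"], 20e3, 30e3, 1.0),
    2: RobotCodes(2, ["red", "red", "red"],     25e3, 28e3, 1.0),
}
```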
Further, in the cloud processing high-precision positioning mode, the mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system works through the following steps:
T1, initialization: the mobile robot is started; its intelligent processing unit generates access information, which the communication unit sends to the cloud processing center; the cloud processing center assigns an LED code and an ultrasonic wideband code to the newly started mobile robot and sends them to it through the communication unit.
T2, mobile robot initialization: the intelligent processing unit of the mobile robot generates the control signal of the LED light device and the digital wideband ultrasonic signal corresponding to the LED code and the ultrasonic wideband code, lights the LED light device through the control unit, and sends the digital wideband ultrasonic signal through the control unit to the corresponding ultrasonic transducers in the omnidirectional ultrasonic transmitting device. The control unit also makes the ultra-wideband transmitting module transmit ultra-wideband signals.
T3, the mobile robot acquires images and polarized images: the camera array on the mobile robot acquires images and polarized images and sends them to the cloud processing center through the robot's communication unit.
T4, the distributed acousto-optic-electric array acquires ultrasound, images, polarized images and ultra-wideband signals: in each acousto-optic-electric unit, the omnidirectional ultrasonic receiving device receives ultrasonic signals, the camera array acquires images and polarized images, and the ultra-wideband receiving module receives ultra-wideband signals; the data are sent to the cloud processing center through the unit's communication module.
T5, data fusion and processing: the cloud processing center receives the signals sent by the mobile robot and by the distributed acousto-optic-electric array, runs the relevant processing programs, calculates the position of the mobile robot and the position of the system working target, completes path planning, and sends the path planning result to the mobile robot.
T6, operation of the mobile robot: if the communication unit of the mobile robot receives a task-finished instruction from the cloud processing center, the robot shuts down; otherwise the communication unit receives the path plan and the control unit drives the mechanical device along the planned coordinates. Steps T3-T6 are repeated continuously while the mechanical device moves, so the robot's path plan is continuously updated; this acquisition-fusion-actuation cycle is sketched in code below.
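The T3-T6 cycle can be summarized in a short control-loop sketch (an editorial illustration; every callable below is a hypothetical placeholder for the modules named above, not an API defined by the patent):

```python
from typing import Callable, Iterable, Optional, Sequence

Waypoints = Sequence[tuple[float, float]]

def cloud_positioning_loop(
    capture_robot_images: Callable[[], object],                                # T3
    collect_array_data: Callable[[], Iterable[object]],                        # T4
    fuse_and_plan: Callable[[object, Iterable[object]], Optional[Waypoints]],  # T5
    follow_path: Callable[[Waypoints], None],                                  # T6
) -> None:
    """Repeat T3-T6 until the cloud center signals that the task is finished.

    fuse_and_plan is taken to return None when a finish instruction arrives,
    and the planned waypoint coordinates otherwise."""
    while True:
        images = capture_robot_images()           # robot images + polarized images
        array_data = collect_array_data()         # ultrasound, images, UWB from each unit
        path = fuse_and_plan(images, array_data)  # cloud-side fusion and path planning
        if path is None:                          # finish instruction: shut down
            break
        follow_path(path)                         # drive the mechanical device
```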
Further, in the independent computing mode of the mobile robot, the mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system works through the following steps:
r1, initialization: the mobile robot is started, an intelligent processing unit of the mobile robot generates a control signal of a corresponding LED optical device and a corresponding digital broadband ultrasonic signal, the control unit controls the LED optical device to light, the control unit sends the digital broadband ultrasonic signal to a corresponding ultrasonic transducer in the omnidirectional ultrasonic transmitting device, and the control unit controls the ultra-wideband transmitting module to transmit the ultra-wideband signal.
R2, the mobile robot acquires images and polarized images: the camera array on the mobile robot acquires images and polarized images.
R3, the distributed acousto-optic-electric array acquires ultrasound, images and polarized images: in each acousto-optic-electric unit, the omnidirectional ultrasonic receiving device receives ultrasonic signals, the camera array acquires images and polarized images, and the ultra-wideband receiving module receives ultra-wideband signals; the data are sent to the mobile robot through the unit's communication module.
R4, data fusion and processing: the mobile robot runs the relevant processing programs on its own images and polarized images and on the signals sent by the distributed acousto-optic-electric array, calculates its own position and the position of the system working target, and completes path planning. Steps R2-R4 are repeated, continuously updating the mobile robot's path plan.
Compared with the prior art, the invention has the following advantages and effects:
(1) Existing mobile robots judge and identify the surrounding environment, and compute their trajectories, only from images or ranging data acquired by on-board devices: an indoor sweeping robot carries only an infrared device or a camera, and an outdoor driverless automobile carries only cameras and ranging devices. The present positioning and navigation system arranges distributed acousto-optic-electric arrays around the mobile robot to acquire more information, which improves positioning and navigation precision and stability. For an indoor sweeping robot, for example, several acousto-optic-electric units can be placed on the ceiling, the floor and other indoor positions, and the area to be cleaned (the system working target) and the position of the sweeping robot are determined from richer image information, ultrasonic ranging information and ultra-wideband positioning information. For outdoor driverless driving, units can be arranged on both sides of the road (for example on street lamps and trees) to acquire more images and ultrasonic ranging information and help the vehicle determine its driving trajectory (the system working target: the driving route).
(2) Existing mobile robots mostly rely on camera images for positioning and path planning, and performance drops sharply in poor lighting, with blurred images, or with ambiguous image semantics. The present invention adopts distributed sensing units, which enlarges the field of view of information perception, and fuses acoustic, optical and electrical positioning information across multiple dimensions, greatly improving positioning accuracy. It can meet millimeter-level positioning demands; for example, a future nursing robot could fetch water for the person being cared for.
(3) The invention identifies mobile robots by LED light codes and wideband ultrasonic signal codes, which facilitates recognizing each mobile robot's identity.
(4) The mobile robots in the positioning and navigation system may take different forms and have different working targets, and the distributed acousto-optic-electric arrays can be reused across these different robots, further reducing the cost of intelligent living. For example, a sweeping robot and a nursing robot arranged in the same indoor space have different working targets, but all the acousto-optic-electric units arranged in the room can be reused, so one positioning and navigation system can serve all the smart home devices.
Drawings
FIG. 1 is a schematic diagram of the acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system oriented to a nursing robot in an embodiment of the invention;
FIG. 2 is a block diagram of the structure of a mobile robot in an embodiment of the present invention;
FIG. 3 is a block diagram of the structure of an acousto-optic-electric unit of the distributed acousto-optic-electric array in an embodiment of the invention;
FIG. 4 is a flowchart of the operation of the nursing-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system in an embodiment of the present invention;
FIG. 5 is a schematic diagram of the acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for a pet robot dog and an autonomous automobile in a residential community in an embodiment of the invention;
FIG. 6 is a flowchart of the operation of the acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for a pet robot dog and an autonomous automobile in a residential community in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
This embodiment takes an indoor nursing robot as an example. Fig. 1 is a schematic diagram of the mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system disclosed in this embodiment, where the mobile robot is a nursing robot and a distributed acousto-optic-electric array is arranged indoors. In this embodiment the system includes seven acousto-optic-electric units 101-107, arranged at the peripheral ceiling, a central pendant and the floor; they can be placed anywhere within the mobile robot's range of motion as required. There may be one or more mobile robots in the room; in the space shown there could, for example, be a sweeping robot or another mobile robot besides the nursing robot. In this embodiment it is assumed that the working target of the nursing robot 108 is the cup 109 on the table, i.e., the person being cared for needs to be given water, so the target of the next task is a point target, the cup. As can be seen from fig. 1, the camera of the nursing robot 108 is blocked by the obstacle 110, so the robot cannot observe the cup, cannot derive a correct path, and could easily get stuck in a narrow area. The other acousto-optic-electric units 101-107 arranged indoors, however, can all acquire the relative position of the cup and the nursing robot. The acquired data are transmitted to the cloud processing center 111, which receives the image information acquired by the nursing robot together with the images, polarized images, ultrasonic information and ultra-wideband signals acquired by the acousto-optic-electric units, and obtains the correct path through comprehensive analysis. The nursing robot, the acousto-optic-electric units and the cloud processing center 111 are wirelessly interconnected.
As shown in fig. 2, the nursing robot includes a mechanical device, an intelligent processing unit, a control unit, a communication unit, an LED light device, an omnidirectional ultrasonic transmitting device, a camera array, an ultra-wideband transmitting module, a functional module and a power supply. The mechanical device can walk and steer under the control of the control unit. The intelligent processing unit generates the digital wideband ultrasonic signal and the LED lamp signal from the wideband ultrasonic signal code and the LED code respectively; in the nursing robot's independent computing mode it also processes ultrasonic signals, images, polarized images and ultra-wideband signals to obtain the robot's own positioning and navigation information. The control unit is connected with the intelligent processing unit and, according to its instructions, controls the LED light device, the omnidirectional ultrasonic transmitting device, the ultra-wideband transmitting module and the mechanical device respectively. The communication unit lets the nursing robot communicate with the cloud processing center and the acousto-optic-electric array using existing wireless communication technology. The LED light device consists of an LED lamp array and emits LED light around the nursing robot, with the color and lighting sequence of the light controlled by the control unit. The omnidirectional ultrasonic transmitting device on the nursing robot comprises an ultrasonic transducer array of several transducers and a digital-to-analog conversion module, and can transmit ultrasonic signals in all directions around the nursing robot: the digital-to-analog conversion module converts digital wideband ultrasonic signals into analog signals, and the transducers in the array convert the electrical signals into acoustic signals. The camera array is an array of several cameras mounted on the nursing robot that acquires optical images in all directions; it includes polarization cameras, forming a polarization camera array that acquires polarized images. The ultra-wideband transmitting module transmits ultra-wideband signals. The functional module implements the nursing functions, and the power supply powers the nursing robot.
As shown in fig. 3, each acousto-optic-electric unit includes an omnidirectional ultrasonic receiving device, a camera array, an ultra-wideband receiving module and a communication module. The omnidirectional ultrasonic receiving device can receive ultrasonic signals from all directions and consists of an ultrasonic transducer array and an analog-to-digital conversion module: the transducer array converts the received ultrasonic signals into electrical signals, and the analog-to-digital conversion module converts these into digital signals and sends them to the communication module. The camera array comprises several cameras that acquire images in all directions, including polarization cameras that acquire polarized images. The communication module is responsible for communicating with the cloud processing center and the mobile robot: in the mobile robot's independent computing mode it sends the received ultrasonic signals, images, polarized images and ultra-wideband signals to the mobile robot, and in the cloud processing high-precision positioning mode it sends them to the cloud processing center.
This embodiment adopts the cloud processing high-precision positioning mode. The cloud processing center assigns a distinct LED code to each robot; here it assigns LED code 1 to the nursing robot, whose intelligent processing unit generates the LED signal from the received code. In this embodiment LED code 1 corresponds to lights of three colors (red, yellow and blue). If other robots are present in the space, for example a sweeping robot, the cloud processing center assigns it an LED code different from the nursing robot's, for example code 2, corresponding to three red lights. The two different mobile robots are thus labeled and can easily be told apart in subsequently captured images and polarized images.
In this embodiment, the cloud processing center assigns wideband ultrasonic signal code 1 to the nursing robot, and the robot's intelligent processing unit generates chirp signal 1 accordingly: a linear frequency-modulated signal from f0 = 20 kHz to f1 = 30 kHz, with duration T = 1 s and sampling rate fs = 100 kHz. The ultrasonic transducers of the nursing robot transmit

$$x(n) = \cos\!\left(\omega_0 n + \tfrac{1}{2} k n^2 + \varphi_0\right), \quad n = 0, 1, \dots, N-1,$$

where $\omega_0 = 2\pi f_0 / f_s$ is the starting angular frequency of the chirp, $f_s$ is the sampling rate, $k = 2\pi (f_1 - f_0) / (f_s N)$ is the frequency-modulation rate, $N = f_s T$ is the number of samples of the digital signal, and $\varphi_0$ is the initial phase.
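A minimal sketch of generating this digital chirp (illustrative; the patent specifies the formula, not an implementation):

```python
import numpy as np

def digital_chirp(f0: float, f1: float, T: float, fs: float, phi0: float = 0.0) -> np.ndarray:
    """x(n) = cos(w0*n + 0.5*k*n^2 + phi0), a digital linear FM signal."""
    N = int(fs * T)                       # N = fs * T samples
    n = np.arange(N)
    w0 = 2 * np.pi * f0 / fs              # starting angular frequency, rad/sample
    k = 2 * np.pi * (f1 - f0) / (fs * N)  # frequency-modulation rate, rad/sample^2
    return np.cos(w0 * n + 0.5 * k * n ** 2 + phi0)

# Chirp signal 1 of this embodiment: 20 kHz to 30 kHz, T = 1 s, fs = 100 kHz.
x1 = digital_chirp(20e3, 30e3, 1.0, 100e3)
```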
If other mobile robots exist in the space, for example a sweeping robot, the cloud processing center assigns them different wideband ultrasonic codes, so that the different mobile robots are identified and the signals emitted by each robot can be extracted from the ultrasonic signals subsequently received by the acousto-optic-electric units.
As shown in fig. 4, the acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for the nursing robot works in the cloud processing high-precision positioning mode through the following steps:
T1, initialization: the nursing robot is started; its intelligent processing unit generates access information and sends it to the cloud processing center through the communication unit; the cloud processing center assigns an LED code and an ultrasonic wideband code to the newly started nursing robot and sends them back through the communication unit. The nursing robot obtains LED code 1 and ultrasonic wideband code 1.
T2, nursing robot initialization: the intelligent processing unit of the nursing robot generates the control signal of the LED light device and the digital wideband ultrasonic signal corresponding to the LED code and the ultrasonic wideband code, lights the LED light device according to the LED code through the control unit, and sends the digital wideband ultrasonic signal through the control unit to the corresponding ultrasonic transducers in the omnidirectional ultrasonic transmitting device; the ultra-wideband transmitting module transmits ultra-wideband signals. In this embodiment LED code 1 corresponds to red, yellow and blue lights, so the nursing robot lights red, yellow and blue LEDs. From the received ultrasonic wideband code 1, the intelligent processing unit generates the corresponding linear frequency-modulated signal from f0 = 20 kHz to f1 = 30 kHz and sends it to the ultrasonic transducers in the omnidirectional ultrasonic transmitting device for transmission. In this embodiment, all ultrasonic transducers in the same omnidirectional ultrasonic transmitting device transmit the same signal.
T3, the nursing robot acquires images and polarized images: the camera array on the nursing robot acquires images and polarized images and sends them to the cloud processing center through the communication unit on the nursing robot.
T4, the distributed acousto-optic-electric array acquires ultrasound, images and polarized images: in each acousto-optic-electric unit, the omnidirectional ultrasonic receiving device receives ultrasonic signals, the camera array acquires images and polarized images, and the ultra-wideband receiving module receives ultra-wideband signals; the data are sent to the cloud processing center through the unit's communication module.
T5, data fusion and processing: the cloud processing center receives the signals sent by the nursing robot and by the distributed acousto-optic-electric array, runs the relevant processing programs, calculates the position of the mobile robot and the position of the system working target, completes path planning, and sends the plan to the nursing robot. Within the room, the several acousto-optic-electric units form a receiving array for the ultrasonic signals, so the cloud processing center can solve for the accurate position of the nursing robot using the theory and algorithms of array signal processing. Combining this with the images shot by the multiple cameras, it segments and recognizes the nursing robot, the system working target and other obstacles with computer vision techniques, constructs a three-dimensional map of the robot's surroundings, and fuses the image and ultrasonic information to obtain a correctly planned path; the acoustic part of this computation is illustrated in the sketch below.
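As an editorial illustration of how the array-processing step might recover position from ultrasonic arrival-time differences (the patent invokes array signal processing theory but fixes no particular algorithm), here is a standard linearized TDOA least-squares solver; all positions and values are invented for the example:

```python
import numpy as np

C_SOUND = 343.0  # assumed speed of sound in air, m/s

def tdoa_position(receivers: np.ndarray, tdoas: np.ndarray) -> np.ndarray:
    """Linearized least-squares TDOA solver.

    receivers: (M, 3) known unit positions; tdoas: (M-1,) delays t_i - t_0.
    Unknowns are the source position p and rho = ||p - r0||; receiver i
    contributes the linear equation
        2*(r0 - ri).p - 2*d_i*rho = d_i^2 - ||ri||^2 + ||r0||^2,
    where d_i = C_SOUND * tdoas[i-1]."""
    r0 = receivers[0]
    d = C_SOUND * tdoas
    A = np.column_stack([2.0 * (r0 - receivers[1:]), -2.0 * d])
    b = d ** 2 - np.sum(receivers[1:] ** 2, axis=1) + r0 @ r0
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]  # discard the auxiliary unknown rho

# Synthetic check: five ceiling/floor units, one robot, noiseless timings.
units = np.array([[0, 0, 3], [4, 0, 3], [0, 4, 3], [4, 4, 3], [2, 2, 0]], float)
p_true = np.array([1.0, 2.5, 1.0])
toa = np.linalg.norm(units - p_true, axis=1) / C_SOUND
print(tdoa_position(units, toa[1:] - toa[0]))  # ~= [1.0, 2.5, 1.0]
```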
T6, operation of the nursing robot: if the nursing robot receives a task-finished instruction from the cloud processing center, it shuts down; otherwise its communication unit receives the path plan and the control unit drives the mechanical device along the planned coordinates. Steps T3-T6 are repeated continuously while the mechanical device moves, so the nursing robot's path plan is continuously updated.
Example two
This embodiment is directed to outdoor mobile robots and takes a pet robot dog and an autonomous automobile in a residential community as an example, as shown in fig. 5. In this embodiment the mobile robots are the pet robot dog and the autonomous automobile, and their working targets are the drivable road surfaces, i.e., line targets. Several acousto-optic-electric units are arranged within the robots' range of motion to form a distributed acousto-optic-electric array; they can be placed on the community's trees, street lamps, railings, green belts, road surfaces and so on, such as units 501-506 in fig. 5, and their positions and number can be adjusted as required (only six units are drawn for illustration). Mobile robots within the community can share the deployed acousto-optic-electric units, as the pet robot dog 507 and the autonomous automobile 508 do in this embodiment. As shown in the figure, the pet robot dog 507 needs to cross the road; planning the route correctly with only its own camera array is difficult, but the several acousto-optic-electric units in the community can cooperate to provide more road information. The ultrasound, images, polarized images and ultra-wideband signals collected by the units are transmitted wirelessly to the cloud processing center 509, and the images and polarized images collected by the pet robot dog and the autonomous automobile are also transmitted there by wireless communication. In the cloud processing center, the ultrasonic, image and polarized-image information is fused through an ultrasonic ranging algorithm, array signal processing theory, a deep learning algorithm and a computer vision algorithm; paths are planned and sent to the pet robot dog and the autonomous automobile in real time, realizing cooperative positioning and navigation.
The pet robot dog and the autonomous automobile each comprise a mechanical device, an intelligent processing unit, a control unit, a communication unit, an LED light device, an omnidirectional ultrasonic transmitting device, a camera array, an ultra-wideband transmitting module, functional modules and a power supply. The functional modules of the pet robot dog implement its functions of accompanying its owner, such as load carrying and conversation; the functional modules of the autonomous automobile are those already installed on existing automobiles.
In this embodiment, the mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system works in the cloud processing high-precision positioning mode.
Because the system contains both the pet robot dog and the autonomous automobile, the cloud processing center assigns them different LED codes: in this embodiment LED code 1, assigned to the pet robot dog, represents one red light, and LED code 2, assigned to the autonomous automobile, represents three red lights. The two different mobile robots are thus labeled and can easily be identified in subsequently captured images.
In this embodiment, the cloud processing center assigns wideband ultrasonic signal code 1 to the pet robot dog and wideband ultrasonic signal code 2 to the autonomous automobile. The intelligent processing unit of the pet robot dog generates chirp signal 1 according to code 1: a linear frequency-modulated signal from f0 = 20 kHz to f1 = 22 kHz, with duration T = 1 s and sampling rate fs = 60 kHz. The pet robot dog transmits

$$x_1(n) = \cos\!\left(\omega_0 n + \tfrac{1}{2} k n^2 + \varphi_0\right), \quad n = 0, 1, \dots, N-1,$$

where $\omega_0 = 2\pi f_0 / f_s$ is the starting angular frequency of the chirp, $f_s$ is the sampling rate, $k = 2\pi (f_1 - f_0) / (f_s N)$ is the frequency-modulation rate, $N = f_s T$ is the number of samples of the digital signal, and $\varphi_0$ is the initial phase. The intelligent processing unit of the autonomous automobile generates the corresponding chirp signal 2 according to the assigned wideband ultrasonic signal code 2: a linear frequency-modulated signal from f0 = 25 kHz to f1 = 28 kHz, again with duration T = 1 s and sampling rate fs = 60 kHz. The autonomous automobile transmits

$$x_2(n) = \cos\!\left(\omega_0' n + \tfrac{1}{2} k' n^2 + \varphi_0'\right), \quad n = 0, 1, \dots, N-1,$$

with $\omega_0'$, $k'$ and $\varphi_0'$ defined in the same way from its own parameters.
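Because the two robots occupy disjoint chirp bands, a receiving unit can attribute an incoming signal by correlating it against each reference chirp. The sketch below illustrates this matched-filter idea (editorial only; the chirps are shortened to 0.1 s to keep the demo fast):

```python
import numpy as np

def digital_chirp(f0, f1, T, fs, phi0=0.0):
    """Digital linear FM signal, as in the transmit formulas above."""
    N = int(fs * T)
    n = np.arange(N)
    return np.cos(2 * np.pi * f0 / fs * n
                  + 0.5 * (2 * np.pi * (f1 - f0) / (fs * N)) * n ** 2 + phi0)

fs, T = 60e3, 0.1                         # T shortened from 1 s for the demo
x_dog = digital_chirp(20e3, 22e3, T, fs)  # pet robot dog's band
x_car = digital_chirp(25e3, 28e3, T, fs)  # autonomous automobile's band

def match_strength(received: np.ndarray, reference: np.ndarray) -> float:
    """Peak cross-correlation magnitude against a reference chirp."""
    return float(np.abs(np.correlate(received, reference, mode="valid")).max())

rng = np.random.default_rng(0)
rx = np.concatenate([np.zeros(400), x_dog, np.zeros(400)])  # delayed dog chirp
rx += 0.5 * rng.standard_normal(rx.size)                    # plus noise
print(match_strength(rx, x_dog) > match_strength(rx, x_car))  # True: dog's code
```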
As shown in fig. 6, the acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system for the pet robot dog and the autonomous automobile works in the cloud processing high-precision positioning mode through the following steps:
r1, initialization: the pet machine dog enters a community range, a communication unit of the pet machine dog sends access information to a cloud processing center, and the cloud processing center distributes an LED code 1 and an ultrasonic broadband code 1 for the entering pet machine dog and sends the LED code and the ultrasonic broadband code to the pet machine dog through the communication unit. The LED code that pet machine dog obtained is 1, and the supersound wide band code is 1. And then, the automatic driving automobile enters a cell range, the communication unit of the automatic driving automobile sends access information to the cloud processing center, and the cloud processing center distributes the LED code 2 and the ultrasonic broadband code 2 for the entering automatic driving automobile and sends the LED code and the ultrasonic broadband code 2 to the automatic driving automobile through the communication unit. The LED code acquired by the automatic driving automobile is 2, and the ultrasonic broadband code is 2.
R2, mobile robot initialization: the intelligent processing unit of the pet robot dog generates the control signal of the LED light device and the digital wideband ultrasonic signal corresponding to its LED code and ultrasonic wideband code, lights the LED light device according to the LED code through the control unit, and sends the digital wideband ultrasonic signal through the control unit to the corresponding ultrasonic transducers in the omnidirectional ultrasonic transmitting device. In this embodiment LED code 1 corresponds to one red light, so the pet robot dog lights one red LED; from ultrasonic wideband code 1 its intelligent processing unit generates the linear frequency-modulated signal from f0 = 20 kHz to f1 = 22 kHz and sends it to the ultrasonic transducers for transmission. The autonomous automobile likewise generates its LED control signal and digital wideband ultrasonic signal from LED code 2 and ultrasonic wideband code 2: LED code 2 corresponds to three red lights, so the autonomous automobile lights three red LEDs, and its intelligent processing unit generates the linear frequency-modulated signal from f0 = 25 kHz to f1 = 28 kHz and sends it to the ultrasonic transducers in its omnidirectional ultrasonic transmitting device for transmission. In this embodiment, all ultrasonic transducers in the same omnidirectional ultrasonic transmitting device transmit the same signal. The ultra-wideband transmitting modules transmit ultra-wideband signals.
R3, the mobile robots acquire images and polarized images: the camera arrays on the pet robot dog and the autonomous automobile acquire images and polarized images and send them to the cloud processing center through their respective communication units.
R4, the distributed acousto-optic-electric array acquires ultrasound, images and polarized images: in each acousto-optic-electric unit, the omnidirectional ultrasonic receiving device receives ultrasonic signals, the camera array acquires images and polarized images, and the ultra-wideband receiving module receives ultra-wideband signals; the data are sent to the cloud processing center through the unit's communication module.
R5, data fusion and processing: the cloud processing center receives the signals sent by the pet robot dog and the autonomous automobile and by the distributed acousto-optic-electric array, runs the relevant processing programs, calculates the positions of the pet robot dog and the autonomous automobile, completes path planning, and sends the plans to the pet robot dog and the autonomous automobile.
R6, operation of the pet robot dog and the autonomous automobile: if the pet robot dog and the autonomous automobile receive a finish instruction from the cloud processing center, they shut down; otherwise their communication units receive the path plans and the control units drive the mechanical devices along the planned coordinates. Steps R3-R6 are repeated continuously while the mechanical devices move, and the path plans of the pet robot dog and the autonomous automobile are continuously updated.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to them; any change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention is an equivalent replacement and is included within the protection scope of the present invention.

Claims (6)

1. An acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system oriented to mobile robots, characterized by comprising one or more mobile robots, a distributed acousto-optic-electric array and a cloud processing center, wherein each mobile robot comprises a mechanical device, an intelligent processing unit, a control unit, a communication unit, an LED (light-emitting diode) light device, an omnidirectional ultrasonic transmitting device, a camera array, an ultra-wideband transmitting module, a functional module and a power supply; the distributed acousto-optic-electric array is formed by a plurality of acousto-optic-electric units arranged within the moving range of the mobile robot, the units being laid out in the shape required by a specific application scene to form an array; the cloud processing center is wirelessly connected with the mobile robot and the acousto-optic-electric units respectively, receives the images and polarized images sent by the mobile robot, receives the ultrasound, images, polarized images and ultra-wideband signals sent by the acousto-optic-electric units, fuses and processes the received information, intelligently calculates the position information of the mobile robot, plans a reasonable path, and sends the path planning information to the mobile robot;
the mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system has an independent computing mode and a cloud processing high-precision positioning mode: in the independent computing mode the mobile robot realizes positioning by computing over ultrasonic, image and ultra-wideband signals; in the cloud processing high-precision positioning mode the ultrasonic, image and ultra-wideband signals acquired by the distributed acousto-optic-electric array and the images and polarized images acquired by the mobile robot are all sent to the cloud processing center for processing to obtain high-precision positioning and navigation information.
2. The mobile-robot-oriented acoustic-optical-electric multi-mode distributed cooperative positioning and navigation system according to claim 1, wherein, in the mobile robot, the mechanical device walks and steers under the control of the connected control unit; the intelligent processing unit generates access information, digital wideband ultrasonic signals and LED lamp signals, and in the independent computing mode runs the signal processing and fusion algorithms to obtain the robot's positioning and navigation information; the control unit is connected with the intelligent processing unit and, according to its instructions, controls the LED light device, the omnidirectional ultrasonic transmitting device, the ultra-wideband transmitting module and the mechanical device respectively; the communication unit performs the robot's wireless communication with the cloud processing center and the distributed acousto-optic-electric array, communicating with the array in the independent computing mode to receive the ultrasonic signals, images, polarized images and ultra-wideband signals it collects, and communicating with the cloud processing center in the cloud processing high-precision positioning mode to receive instructions and positioning and navigation information; the LED light device consists of an LED lamp array and emits LED light around the mobile robot, the color and lighting sequence of the light being controlled by the control unit; the omnidirectional ultrasonic transmitting device comprises an ultrasonic transducer array of a plurality of ultrasonic transducers and a digital-to-analog conversion module and transmits ultrasonic signals in all directions around the robot, the digital-to-analog conversion module converting digital wideband ultrasonic signals into analog signals and the transducers in the array converting electrical signals into acoustic signals; the camera array is an array of a plurality of cameras for acquiring optical images in all directions around the mobile robot; the ultra-wideband transmitting module transmits ultra-wideband electrical signals; and the power supply powers all components of the mobile robot.
3. The acoustic-optical-electric multi-mode distributed collaborative positioning and navigation system for mobile robots according to claim 1, wherein each acousto-optic-electric unit in the distributed acousto-optic-electric array comprises an omnidirectional ultrasonic receiving device, a camera array, an ultra-wideband receiving module and a communication module; the omnidirectional ultrasonic receiving device receives ultrasonic signals from all directions and comprises an ultrasonic transducer array and an analog-to-digital conversion module, the ultrasonic transducer array converting the received ultrasonic signals into electric signals and the analog-to-digital conversion module converting the electric signals into digital signals and sending them to the communication module; the camera array is an array formed by a plurality of cameras and acquires images in all directions; the ultra-wideband receiving module receives ultra-wideband signals; the communication module realizes wireless communication between the acousto-optic-electric unit and the cloud processing center and between the acousto-optic-electric unit and the mobile robot respectively: in the independent computing mode of the mobile robot it sends the received ultrasonic signals, images, polarized images and ultra-wideband signals to the mobile robot, and in the cloud processing high-precision positioning mode it sends them to the cloud processing center.
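On the receiving side described in claim 3, a standard way to turn the digitized ultrasound into a range, consistent with the ranging algorithm referred to in claim 4, is matched filtering against the known transmitted code. The sketch below assumes synchronized transmit and receive clocks and uses placeholder signal parameters:

```python
import numpy as np

FS = 192_000     # assumed ADC sample rate, Hz
C_SOUND = 343.0  # speed of sound in air, m/s

# Known transmitted code (placeholder tone burst; in practice the robot's
# assigned wideband ultrasonic code would be the template).
t = np.arange(int(0.005 * FS)) / FS
template = np.sin(2 * np.pi * 40_000 * t)

# Simulated reception: the code arrives 2 ms after transmission, plus noise.
rx = np.zeros(int(0.1 * FS))
delay = int(0.002 * FS)
rx[delay:delay + template.size] += template
rx += 0.1 * np.random.randn(rx.size)

# Matched filter: the cross-correlation peak marks the time of arrival.
corr = np.correlate(rx, template, mode="valid")
toa = np.argmax(np.abs(corr)) / FS        # ~0.002 s
print(f"range = {toa * C_SOUND:.3f} m")   # ~0.686 m, given synchronized clocks
```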
4. The acoustic-optical-electric multi-mode distributed collaborative positioning and navigation system for mobile robots according to claim 1, wherein when a plurality of mobile robots are present in the system, the cloud processing center assigns different LED codes and different wideband ultrasonic signal codes to different mobile robots: the light emitted by the LED optical devices of different mobile robots carries different codes, which facilitates distinguishing the mobile robots in the captured images and polarized images, and the ultrasonic signals emitted by the omnidirectional ultrasonic transmitting devices of different mobile robots carry different wideband ultrasonic signal codes, which specify the parameters of the ultrasonic signals used and facilitate distinguishing the mobile robots in the ranging algorithm.
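One way to realise the per-robot codes of claim 4 (an assumption, since the claim fixes no concrete coding scheme) is to give each robot a unique LED blink pattern for the cameras and a disjoint chirp frequency band for the matched-filter ranging, handed out by the cloud processing center at access time:

```python
from dataclasses import dataclass
from itertools import count

@dataclass(frozen=True)
class RobotCodes:
    robot_id: int
    led_pattern: tuple[int, ...]     # on/off blink sequence the cameras watch for
    chirp_band: tuple[float, float]  # (f0, f1) Hz of the wideband ultrasonic code

_ids = count(1)

def assign_codes() -> RobotCodes:
    """Issue one robot a unique 4-bit LED blink pattern and a disjoint
    4 kHz chirp band so that images and matched filters can tell robots
    apart (illustrative scheme only)."""
    rid = next(_ids)
    led = tuple((rid >> b) & 1 for b in range(4))
    f0 = 30_000.0 + (rid - 1) * 5_000.0  # 1 kHz guard band between robots
    return RobotCodes(rid, led, (f0, f0 + 4_000.0))

fleet = [assign_codes() for _ in range(3)]  # three robots, three distinct codes
```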
5. The acoustic-optical-electric multi-mode distributed collaborative positioning and navigation system for mobile robots according to claim 1, wherein when the system is in the cloud processing high-precision positioning mode, the working steps comprise:
T1, initialization: the mobile robot is started; the intelligent processing unit of the mobile robot generates access information, the communication unit sends the access information to the cloud processing center, and the cloud processing center assigns an LED code and a wideband ultrasonic code to the newly started mobile robot and sends them to the mobile robot through the communication unit;
T2, mobile robot initialization: the intelligent processing unit of the mobile robot generates the corresponding LED optical device control signal and digital broadband ultrasonic signal according to the LED code and the wideband ultrasonic code, lights the LED optical device through the control unit, sends the digital broadband ultrasonic signal to the corresponding ultrasonic transducers in the omnidirectional ultrasonic transmitting device through the control unit, and controls the ultra-wideband transmitting module to transmit an ultra-wideband signal;
T3, image and polarized image acquisition by the mobile robot: the camera array on the mobile robot acquires images and polarized images and sends them to the cloud processing center through the communication unit of the mobile robot;
T4, acquisition of ultrasonic signals, images, polarized images and ultra-wideband signals by the distributed acousto-optic-electric array: in each acousto-optic-electric unit of the array, the omnidirectional ultrasonic receiving device receives the ultrasonic signals, the camera array acquires images and polarized images, and the ultra-wideband receiving module receives the ultra-wideband signals; the acquired signals are sent to the cloud processing center through the communication module of the acousto-optic-electric unit;
T5, data fusion and processing: the cloud processing center receives the signals sent by the mobile robot and by the distributed acousto-optic-electric array, calculates the position of the mobile robot and the position of the system working target, completes the path planning, and sends the path planning result to the mobile robot;
T6, operation of the mobile robot: if the communication unit of the mobile robot receives a task-end command from the cloud processing center, the mobile robot is powered off; otherwise, the communication unit receives the path plan and the control unit controls the mechanical device to move along the planned coordinates; while the mechanical device moves, steps T3-T6 are repeated so that the path planning result of the mobile robot is continuously updated.
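Step T5 does not prescribe the position calculation. Given ranges from one robot's ultrasonic or ultra-wideband signals to several acousto-optic-electric units at known positions, a common fusion core is linear least-squares multilateration, sketched below under that assumption:

```python
import numpy as np

def multilaterate(anchors, ranges):
    """Least-squares position from >=3 anchor positions (N x 2, metres)
    and measured ranges (length N). Linearised by subtracting the first
    anchor's range equation; returns the estimated (x, y)."""
    anchors = np.asarray(anchors, float)
    ranges = np.asarray(ranges, float)
    p0, r0 = anchors[0], ranges[0]
    A = 2 * (anchors[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Acousto-optic-electric units at the corners of a 10 m x 10 m room (illustrative)
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
true = np.array([3.0, 4.0])
ranges = [np.hypot(*(true - a)) for a in anchors]
print(multilaterate(anchors, ranges))  # ~ [3. 4.]
```

With four anchors the linearised system is overdetermined, so individual ranging errors are damped by the least-squares fit.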
6. The acoustic-optical-electric multi-mode distributed collaborative positioning and navigation system for mobile robots according to claim 1, wherein when the system is in the mobile robot independent computing mode, the working steps comprise:
R1, initialization: the mobile robot is started; the intelligent processing unit of the mobile robot generates the corresponding LED optical device control signal and digital broadband ultrasonic signal, lights the LED optical device through the control unit, sends the digital broadband ultrasonic signal to the corresponding ultrasonic transducers in the omnidirectional ultrasonic transmitting device through the control unit, and controls the ultra-wideband transmitting module to transmit an ultra-wideband signal;
R2, image and polarized image acquisition by the mobile robot: the camera array on the mobile robot acquires images and polarized images;
R3, acquisition of ultrasonic signals, images, polarized images and ultra-wideband signals by the distributed acousto-optic-electric array: in each acousto-optic-electric unit of the array, the omnidirectional ultrasonic receiving device receives the ultrasonic signals, the camera array acquires images and polarized images, and the ultra-wideband receiving module receives the ultra-wideband signals; the acquired signals are sent to the mobile robot through the communication module of the acousto-optic-electric unit;
R4, data fusion and processing: the mobile robot calculates its own position and the position of the system working target from the images and polarized images it acquired and from the signals sent by the distributed acousto-optic-electric array, and completes the path planning; steps R2-R4 are repeated so that the path planning result of the mobile robot is continuously updated.
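The independent computing mode of claim 6 is, in effect, an on-board sense-fuse-plan loop. A minimal sketch of steps R2-R4, with every interface (`capture_images`, `latest_measurements`, `fuse_and_localize`, `plan_path`, `follow`) a hypothetical stub rather than anything fixed by the patent:

```python
import time

def independent_mode_loop(robot, array_units, goal, period=0.1):
    """Run steps R2-R4 on board: acquire images (R2), collect the
    distributed array's ultrasonic/image/UWB data (R3), then fuse,
    localize and re-plan (R4), repeating until the task ends."""
    while not robot.task_finished():
        images = robot.capture_images()                          # R2
        remote = [u.latest_measurements() for u in array_units]  # R3
        pose = robot.fuse_and_localize(images, remote)           # R4: position
        path = robot.plan_path(pose, goal)                       # R4: planning
        robot.follow(path)
        time.sleep(period)  # loop period before repeating R2-R4
```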
CN202010644082.9A 2020-07-07 2020-07-07 Acoustic-optical-electric multi-mode distribution cooperative positioning and navigation system for mobile robot Active CN111947659B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010644082.9A CN111947659B (en) 2020-07-07 2020-07-07 Acoustic-optical-electric multi-mode distribution cooperative positioning and navigation system for mobile robot

Publications (2)

Publication Number Publication Date
CN111947659A (en) 2020-11-17
CN111947659B (en) 2022-05-24

Family

ID=73341803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010644082.9A Active CN111947659B (en) 2020-07-07 2020-07-07 Acoustic-optical-electric multi-mode distribution cooperative positioning and navigation system for mobile robot

Country Status (1)

Country Link
CN (1) CN111947659B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104390643A (en) * 2014-11-24 2015-03-04 上海美琦浦悦通讯科技有限公司 Method for realizing indoor positioning based on multi-information fusion
CN105116378A (en) * 2015-09-30 2015-12-02 长沙开山斧智能科技有限公司 Wireless and ultrasonic composite location system and location method for wireless and ultrasonic composite location system
CN106647766A (en) * 2017-01-13 2017-05-10 广东工业大学 Robot cruise method and system based on complex environment UWB-vision interaction
GB201703647D0 (en) * 2017-03-07 2017-04-19 Sonitor Technologies As Ultrasound position-determination system
CN107356256A (en) * 2017-07-05 2017-11-17 中国矿业大学 A kind of indoor high-accuracy position system and method for multi-source data mixing
CN108810133A (en) * 2018-06-08 2018-11-13 深圳勇艺达机器人有限公司 A kind of intelligent robot localization method and positioning system based on UWB and TDOA algorithms
CN109129507A (en) * 2018-09-10 2019-01-04 北京联合大学 A kind of medium intelligent introduction robot and explanation method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XU JINHONG: "Research on robot-navigation-grade large-area indoor positioning based on multi-sensor fusion", China Master's Theses Full-text Database *
HAN HAOMIN: "Design and implementation of a welcome-robot navigation system based on multi-sensor information fusion", China Master's Theses Full-text Database *

Also Published As

Publication number Publication date
CN111947659B (en) 2022-05-24

Similar Documents

Publication Publication Date Title
US10771935B2 (en) Device locating using angle of arrival measurements
US10444751B2 (en) Surveying system
US6496755B2 (en) Autonomous multi-platform robot system
CN108536145A (en) A kind of robot system intelligently followed using machine vision and operation method
CN108762255A (en) A kind of indoor intelligent mobile robot and control method
WO2014068406A2 (en) Device for optically scanning and measuring an environment
CA2628657A1 (en) Landmark navigation for vehicles using blinking optical beacons
US11105886B2 (en) Three-dimensional asset tracking using radio frequency-enabled nodes
CN214520204U (en) Port area intelligent inspection robot based on depth camera and laser radar
CN106646335A (en) Positioning device based on intelligent street lamp communication network and using method thereof
WO2020156523A1 (en) Intelligent power wireless charging system for electric wheelchairs, robotic arm and ranging sensor thereof
CN208027170U (en) A kind of power-line patrolling unmanned plane and system
CN104953709A (en) Intelligent patrol robot of transformer substation
CN109471124A (en) Indoor Global localization system and method based on line laser rotary scanning
WO2021243696A1 (en) Vehicle navigation positioning method and apparatus, and base station, system and readable storage medium
CN109571470A (en) A kind of robot
CN109029423A (en) Substation's indoor mobile robot navigation positioning system and its navigation locating method
KR100581086B1 (en) Method and apparatus for mobile robot localization using led of rfid tag
CN111947659B (en) Acoustic-optical-electric multi-mode distribution cooperative positioning and navigation system for mobile robot
CN211317332U (en) AGV positioning system based on ultra wide band and vision two-dimensional code navigation technology
CN115248039A (en) Multi-robot-multi-person cooperation control method, device and system
CN112327868A (en) Intelligent robot automatic navigation system
CN110440812A (en) A kind of interior unmanned plane high-precision three-dimensional positioning navigation device
KR100590210B1 (en) Method for mobile robot localization and navigation using RFID, and System for thereof
Zeng et al. Study on inspection robot for substation based on ultra-wide-band wireless localization system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant