CN108536145A - Robot system for intelligent following using machine vision and operation method thereof - Google Patents
Robot system for intelligent following using machine vision and operation method thereof
- Publication number
- CN108536145A CN108536145A CN201810317154.1A CN201810317154A CN108536145A CN 108536145 A CN108536145 A CN 108536145A CN 201810317154 A CN201810317154 A CN 201810317154A CN 108536145 A CN108536145 A CN 108536145A
- Authority
- CN
- China
- Prior art keywords
- target
- image
- module
- sensor
- machine vision
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic singals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Acoustics & Sound (AREA)
- Electromagnetism (AREA)
- Manipulator (AREA)
Abstract
The invention discloses a robot system for intelligent following using machine vision and an operation method thereof. A main control module is connected by signal to a camera unit, a target detection module, a motion module, an intelligent early-warning system and a sensor detection module respectively, and the camera unit is divided into a left camera and a right camera whose relative positions are fixed. The operation steps include image acquisition; target determination; image pairing and camera calibration; target search and image arrangement; coordinate establishment; relative-position determination and obstacle detection; path planning; and motion tracking, in which the drive system follows the target as it moves. The invention enables a robotic device to determine the relative position between the target and the following device and thereby plan an optimal path. Manufacturing cost is low, and multiple sensors are combined for obstacle detection, so obstacles in various situations can be successfully avoided.
Description
【Technical field】
The present invention relates to the technical field of autonomously following robots, and in particular to a robot system for intelligent following using machine vision and an operation method thereof.
【Background technology】
At present, one of the technical difficulties of autonomous following is the mutual positioning between the target and the following device. Positioning is the prerequisite for the following device to complete complex tasks such as path planning and autonomous navigation, and is a research hotspot in the field of mobile robotics.
Existing positioning technologies can be divided, by implementation, into satellite positioning (GPS, BeiDou, Galileo, etc.); radio electromagnetic-wave positioning (including infrared, Bluetooth, WiFi, RFID, UWB, mobile communication signals, etc.); ultrasonic positioning; laser radar positioning; and machine vision positioning. Among these, satellite positioning, and GPS in particular, is currently the most widely used outdoor positioning technology; its advantages are the large effective coverage of the satellites and the free navigation signal. However, when a GPS receiver operates indoors, the signal is greatly attenuated by buildings and the positioning accuracy is low, so it is not suitable for indoor positioning. Radio electromagnetic-wave positioning requires selecting a suitable signal for the environment and designing the system in advance according to the signal characteristics, so it is not easy to integrate into other systems. Ultrasonic positioning can reach centimeter-level accuracy, but ultrasonic waves attenuate noticeably during propagation, which limits the effective positioning range, and the cost is relatively high. In recent years, with the rise of autonomous driving, laser radar positioning and machine vision technologies have emerged. Both are based on visible light and perform environment perception and target recognition to realize positioning. Laser radar positioning and recognition are more accurate than machine vision, but its components are more expensive and its power consumption is higher.
【Invention content】
The purpose of the present invention is to solve the problems in the prior art by proposing a robot system for intelligent following using machine vision and an operation method thereof, which enable a robotic device to determine the relative position between the target and the following device and thereby plan an optimal path. Manufacturing cost is low, and multiple sensors are combined for obstacle detection, so obstacles in various situations can be successfully avoided.
To achieve the above object, the present invention proposes a robot system for intelligent following using machine vision, including:
a main control module, for processing and relaying the information transmitted by each module of the system;
cameras, for acquiring images of the external environment;
a target detection module, for screening the images returned by the cameras and determining the target image;
a motion module, for moving to track the target;
an intelligent early-warning system, for processing the parameter information acquired by the sensor detection module and transmitting the information to the main control module for decision-making;
a sensor detection module, for detecting the presence of obstacles;
wherein the main control module is connected by signal to the cameras, the target detection module, the motion module, the intelligent early-warning system and the sensor detection module respectively, and the cameras are divided into a left camera and a right camera whose relative positions are fixed.
Preferably, the system further includes an extended function module interface for adding obstacle detection sensors; the extended function module interface can be connected to at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor, and is connected by signal to the main control module. The extended function module interface may be connected to an infrared sensor, an ultrasonic sensor and a laser radar sensor simultaneously.
Preferably, the motion module includes a chassis, ground wheels and a power motor; the power motor is mounted on the chassis, and the ground wheels are connected to the rotary shaft of the power motor through a transmission.
Preferably, the motion module includes a chassis and jointed support legs capable of walking; the support legs are mounted on the chassis.
Preferably, the motion module includes a chassis, buoyancy airbags and propellers; the buoyancy airbags and the propellers are respectively mounted on the lower part of the chassis.
Preferably, the sensor detection module includes at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor, and may include all three kinds of sensors.
To achieve the above object, the present invention also proposes an operation method for the robot system for intelligent following using machine vision, including the following steps:
S1, image acquisition: the left and right cameras acquire images simultaneously;
S2, target determination: in the left image, the target is selected by manual operation;
S3, image pairing and camera calibration: the images acquired simultaneously by the left and right cameras are paired to obtain corresponding image pairs, and the calibration parameters are output;
S4, target search and image arrangement: the target is searched in the left image and its pixel coordinates are output; the binocular vision module is invoked using the left and right images;
S5, coordinate establishment: the three-dimensional coordinates of the target and the surrounding environment are established;
S6, relative-position determination and obstacle detection: the angle and distance information between the target and the device are derived from the three-dimensional coordinates, and the positions of obstacles are detected and determined;
S7, path planning: comprehensive analysis yields the optimal path for tracking the target;
S8, motion tracking: the drive system follows the target as it moves.
Preferably, the obstacle detection includes:
S61, ranging: the distance between the obstacle and the device is measured;
S62, data processing: the values measured by the different sensors are fused to obtain a unified value (a minimal sketch of one possible fusion is given after these steps);
S63, obstacle position determination: the presence of obstacles is established and a conclusion is obtained.
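The patent does not fix how the per-sensor values are unified in step S62, so the following is only a minimal illustrative sketch in Python: each sensor is assumed to report a distance in meters together with a hand-picked confidence weight, and all names, weights and thresholds are hypothetical.

```python
# Illustrative sketch of S61-S63 only; weights and thresholds are assumed values,
# not parameters specified by the patent.

def fuse_distances(readings, max_range_m=5.0):
    """S62: combine (distance_m, weight) pairs from different sensors into one value."""
    valid = [(d, w) for d, w in readings if 0.0 < d <= max_range_m]
    if not valid:
        return None                                   # no sensor reports an obstacle in range
    total_weight = sum(w for _, w in valid)
    return sum(d * w for d, w in valid) / total_weight

def obstacle_ahead(readings, stop_distance_m=0.8):
    """S63: decide whether the fused distance indicates an obstacle on the path."""
    unified = fuse_distances(readings)
    return (unified is not None and unified < stop_distance_m), unified

if __name__ == "__main__":
    # S61: hypothetical ranging results from infrared, ultrasonic and laser radar sensors
    samples = [(0.72, 0.2), (0.75, 0.3), (0.70, 0.5)]
    blocked, distance = obstacle_ahead(samples)
    print(f"unified distance: {distance:.2f} m, obstacle ahead: {blocked}")
```

A weighted average is only one choice; taking the smallest valid distance would be the more conservative option for collision avoidance.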
Preferably, in step S2, the target is determined by manually selecting an object or person image imported into the system in advance.
Preferably, in step S6, obstacle detection uses at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor.
Preferably, in step S6, obstacle detection uses all three kinds of sensors: infrared, ultrasonic and laser radar.
Beneficial effects of the present invention: the invention adopts a positioning technology based on machine vision. In this solution, two cameras whose relative positions are fixed are mounted on the robot; the image of one camera is used to detect and/or track the target object in real time, while the images of both cameras are used to calculate the relative position of the robot with respect to the surrounding environment and the relative three-dimensional information between the target and the robot, so that the best route from the robot to the target can be planned. At the same time, manufacturing cost is low, and multiple sensors are combined for obstacle detection, so obstacles in various situations can be successfully avoided.
The features and advantages of the present invention will be described in detail below through embodiments with reference to the accompanying drawings.
【Description of the drawings】
Fig. 1 is a block diagram of a robot system for intelligent following using machine vision according to the present invention;
Fig. 2 is a flow chart of the steps of the operation method of the robot system for intelligent following using machine vision according to the present invention;
Fig. 3 is a block diagram of the obstacle detection steps.
【Specific implementation mode】
Referring to Fig. 1, a robot system for intelligent following using machine vision according to the present invention includes:
a main control module, for processing and relaying the information transmitted by each module of the system;
cameras, for acquiring images of the external environment;
a target detection module, for screening the images returned by the cameras and determining the target image;
a motion module, for moving to track the target;
an intelligent early-warning system, for processing the parameter information acquired by the sensor detection module and transmitting the information to the main control module for decision-making;
a sensor detection module, for detecting the presence of obstacles.
The main control module is connected by signal to the cameras, the target detection module, the motion module, the intelligent early-warning system and the sensor detection module respectively, and the cameras are divided into a left camera and a right camera whose relative positions are fixed. The system further includes an extended function module interface for adding obstacle detection sensors; the extended function module interface can be connected to at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor, is connected by signal to the main control module, and may be connected to an infrared sensor, an ultrasonic sensor and a laser radar sensor simultaneously. The motion module includes a chassis, ground wheels and a power motor; the power motor is mounted on the chassis, and the ground wheels are connected to the rotary shaft of the power motor through a transmission. Alternatively, the motion module includes a chassis and jointed support legs capable of walking, the support legs being mounted on the chassis; or the motion module includes a chassis, buoyancy airbags and propellers, the buoyancy airbags and the propellers being respectively mounted on the lower part of the chassis. The sensor detection module includes at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor, and may include all three kinds of sensors.
Referring to Fig. 2 and Fig. 3, the operation method of the robot system for intelligent following using machine vision according to the present invention includes the following steps:
S1, image acquisition: the left and right cameras acquire images simultaneously;
S2, target determination: in the left image, the target is selected by manual operation;
S3, image pairing and camera calibration: the images acquired simultaneously by the left and right cameras are paired to obtain corresponding image pairs, and the calibration parameters are output;
S4, target search and image arrangement: the target is searched in the left image and its pixel coordinates are output; the binocular vision module is invoked using the left and right images;
S5, coordinate establishment: the three-dimensional coordinates of the target and the surrounding environment are established;
S6, relative-position determination and obstacle detection: the angle and distance information between the target and the device are derived from the three-dimensional coordinates, and the positions of obstacles are detected and determined;
S7, path planning: comprehensive analysis yields the optimal path for tracking the target;
S8, motion tracking: the drive system follows the target as it moves.
The obstacle detection includes:
S61, ranging: the distance between the obstacle and the device is measured;
S62, data processing: the values measured by the different sensors are fused to obtain a unified value;
S63, obstacle position determination: the presence of obstacles is established and a conclusion is obtained.
In step S2, the target is determined by manually selecting an object or person image imported into the system in advance. In step S6, obstacle detection uses at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor, and may use all three kinds of sensors.
This intelligent following robot system mainly uses machine vision technology, combined with traditional sensor measurement techniques (such as laser ranging, ultrasonic ranging and infrared detection), to realize intelligent search for and following of the target by the robot.
The concrete scheme adopted by the system is as follows. The robot carries two cameras of the same model whose relative positions are fixed and whose optical axes are parallel (for convenience in the following description they are referred to as the left camera and the right camera, and their images at the same moment as the left image and the right image); the two cameras acquire images synchronously at the same sampling frequency. At the initial moment, the target of interest must be selected manually in the left image as the tracking object; then, in the subsequent sequence of left images, the position coordinates (pixel coordinates) of the target are found by tracking and detection. Binocular vision theory is then applied: features are extracted from the texture information of the left and right images, the features of the two images are matched one by one, the corresponding disparity information is computed, and the three-dimensional coordinates of each point in physical space relative to the robot are recovered, so that a corresponding map is constructed in real time. Combined with the pixel coordinates of the target found in the image, the three-dimensional coordinates of the target of interest relative to the robot are calculated, and from these the relative angle and relative distance between them are obtained to guide the robot toward the target.
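As an illustration of the binocular step described above (not the patent's exact implementation), the sketch below computes a dense disparity map with OpenCV's semi-global block matcher on a rectified image pair and back-projects the target pixel through the pinhole model; the focal length, principal point and baseline are placeholder values that would in practice come from the calibration of step S3.

```python
# Illustrative sketch of the binocular computation; calibration numbers are placeholders.
import math
import cv2
import numpy as np

def disparity_map(left_gray, right_gray):
    """Dense disparity from a rectified left/right pair (SGBM, 64 disparity levels)."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0  # SGBM output is fixed-point x16

def target_pose(u, disparity_px, fx=700.0, cx=320.0, baseline_m=0.12):
    """Back-project the target pixel column u: returns (distance_m, angle_rad) relative to the robot."""
    if disparity_px <= 0:
        return None                              # no valid stereo match at this pixel
    z = fx * baseline_m / disparity_px           # depth along the optical axis
    x = (u - cx) * z / fx                        # lateral offset from the camera axis
    return math.hypot(x, z), math.atan2(x, z)    # relative distance and bearing

if __name__ == "__main__":
    # Hypothetical target at pixel column u = 400 with a disparity of 21 px
    pose = target_pose(400, 21.0)
    if pose:
        dist, angle = pose
        print(f"target about {dist:.2f} m away, bearing {math.degrees(angle):.1f} degrees")
```

With the assumed calibration, a 21-pixel disparity corresponds to a depth of about 4 m; the bearing follows directly from the lateral offset of the pixel from the principal point.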
While realizing target tracking, this scheme can compute the three-dimensional coordinates within a certain angular range in front of the robot. The information obtained is relatively rich, which facilitates the construction of a local map, helps guide the robot around obstacles and plan a reasonable path, and thus realizes real-time tracking of the target. In addition, several traditional sensors are mounted on the robot to assist in handling obstacles in the blind zones of machine vision, improving the stability and reliability of the system. In this robot system, data from the vision module and the traditional sensors are transmitted to the control module in real time; the data are fused and analyzed by the core software algorithm, comprehensive judgment and decision-making are carried out, reasonable motion instructions are issued and transferred to the chassis motor execution module of the robot, and the autonomous following motion of the mobile robot is realized.
In this scheme, the movement speed and direction of the robot can be adjusted in real time according to changes in the position information of the target, so that the robot and the target are kept within a reasonable range of distance and angle. That is, the movement speed of the robot is positively correlated with its distance to the target, and its moving direction is positively correlated with the change in the relative angle between them. The scheme uses multiple sensors and controls the motion of the robot through data fusion; comprehensive decisions are made through data analysis and deep learning algorithms, a smooth trajectory is planned, the influence on motion decisions of any single sensor being disturbed or failing is reduced as far as possible, and obstacles in the direction of travel can be avoided in time, increasing the safety of the robot's motion.
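The positive correlations described in the preceding paragraph can be expressed, for example, as a simple proportional law; the sketch below is one hedged illustration with arbitrary gains, set-point and limits, not the control algorithm claimed by the patent.

```python
# Minimal proportional follow controller; gains, set-point and limits are assumed values.

def follow_command(distance_m, angle_rad,
                   desired_distance_m=1.5, k_v=0.8, k_w=1.5,
                   max_speed=1.2, max_turn=1.0):
    """Map the target's distance and bearing to (forward speed m/s, turn rate rad/s)."""
    # Speed grows with how far the target is beyond the desired following distance.
    speed = max(0.0, min(max_speed, k_v * (distance_m - desired_distance_m)))
    # Turn rate grows with the relative angle, clamped to the platform limit.
    turn = max(-max_turn, min(max_turn, k_w * angle_rad))
    return speed, turn

if __name__ == "__main__":
    print(follow_command(4.0, 0.11))   # target far and slightly to one side -> drive forward and turn
    print(follow_command(1.4, 0.0))    # target closer than the set-point -> hold position
```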
In the robot system for intelligent following using machine vision of the present invention, the mutual positioning between the target and the following device is completed on the basis of machine vision technology, and functions such as path planning are then realized. The core component of the invention is the binocular camera, whose cost is only about 1/20 that of a laser radar; the related machine vision algorithms are now relatively mature and the computing platforms efficient, so high-precision positioning and target recognition can be achieved. At the same time, a detection module combining multiple sensors, together with an efficient and simplified control algorithm, realizes autonomous path planning and can successfully avoid obstacles in various situations. In addition, the motion chassis of the invention uses BLDC brushless DC motors, which have high execution efficiency and low production cost, and a field bus (CAN bus) is used to realize stable communication between the function modules.
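As an illustration of the last point only, a chassis speed command could be put on the CAN bus with the python-can package roughly as follows; the interface name, arbitration ID and payload layout are assumptions made for this sketch, not values given by the patent.

```python
# Hedged sketch of sending a chassis command over CAN with python-can;
# channel, arbitration ID and frame layout are made up for illustration.
import struct
import can

def send_speed_command(bus, speed_mps, turn_radps, arbitration_id=0x200):
    # Pack speed and turn rate as two little-endian floats (8 data bytes, the CAN 2.0 maximum).
    payload = struct.pack("<ff", speed_mps, turn_radps)
    msg = can.Message(arbitration_id=arbitration_id, data=payload, is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    # Assumes a SocketCAN interface named "can0" is available on the host.
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    send_speed_command(bus, 0.8, 0.1)
    bus.shutdown()
```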
The above is merely a preferred embodiment of the present invention and is not intended to limit the scope of protection of the invention; any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (10)
1. A robot system for intelligent following using machine vision, characterized by including:
a main control module, for processing and relaying the information transmitted by each module of the system;
cameras, for acquiring images of the external environment;
a target detection module, for screening the images returned by the cameras and determining the target image;
a motion module, for moving to track the target;
an intelligent early-warning system, for processing the parameter information acquired by the sensor detection module and transmitting the information to the main control module for decision-making;
a sensor detection module, for detecting the presence of obstacles;
wherein the main control module is connected by signal to the cameras, the target detection module, the motion module, the intelligent early-warning system and the sensor detection module respectively, and the cameras are divided into a left camera and a right camera whose relative positions are fixed.
2. The robot system for intelligent following using machine vision according to claim 1, characterized in that: it further includes an extended function module interface for adding obstacle detection sensors; the extended function module interface can be connected to at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor, and is connected by signal to the main control module.
3. The robot system for intelligent following using machine vision according to claim 2, characterized in that: the motion module includes a chassis, ground wheels and a power motor; the power motor is mounted on the chassis, and the ground wheels are connected to the rotary shaft of the power motor through a transmission.
4. The robot system for intelligent following using machine vision according to claim 2, characterized in that: the motion module includes a chassis and jointed support legs capable of walking, the support legs being mounted on the chassis.
5. The robot system for intelligent following using machine vision according to claim 2, characterized in that: the motion module includes a chassis, buoyancy airbags and propellers, the buoyancy airbags and the propellers being respectively mounted on the lower part of the chassis.
6. The robot system for intelligent following using machine vision according to claim 1, characterized in that: the sensor detection module includes at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor.
7. An operation method of a robot system for intelligent following using machine vision, characterized by including the following steps:
S1, image acquisition: the left and right cameras acquire images simultaneously;
S2, target determination: in the left image, the target is selected by manual operation;
S3, image pairing and camera calibration: the images acquired simultaneously by the left and right cameras are paired to obtain corresponding image pairs, and the calibration parameters are output;
S4, target search and image arrangement: the target is searched in the left image and its pixel coordinates are output; the binocular vision module is invoked using the left and right images;
S5, coordinate establishment: the three-dimensional coordinates of the target and the surrounding environment are established;
S6, relative-position determination and obstacle detection: the angle and distance information between the target and the device are derived from the three-dimensional coordinates, and the positions of obstacles are detected and determined;
S7, path planning: comprehensive analysis yields the optimal path for tracking the target;
S8, motion tracking: the drive system follows the target as it moves.
8. The operation method of a robot system for intelligent following using machine vision according to claim 7, characterized in that the obstacle detection includes:
S61, ranging: the distance between the obstacle and the device is measured;
S62, data processing: the values measured by the different sensors are fused to obtain a unified value;
S63, obstacle position determination: the presence of obstacles is established and a conclusion is obtained.
9. The operation method of a robot system for intelligent following using machine vision according to claim 7, characterized in that: in step S2, the target is determined by manually selecting an object or person image imported into the system in advance.
10. The operation method of a robot system for intelligent following using machine vision according to claim 7, characterized in that: in step S6, obstacle detection uses at least one of an infrared sensor, an ultrasonic sensor and a laser radar sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810317154.1A CN108536145A (en) | 2018-04-10 | 2018-04-10 | Robot system for intelligent following using machine vision and operation method thereof
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810317154.1A CN108536145A (en) | 2018-04-10 | 2018-04-10 | Robot system for intelligent following using machine vision and operation method thereof
Publications (1)
Publication Number | Publication Date |
---|---|
CN108536145A true CN108536145A (en) | 2018-09-14 |
Family
ID=63479855
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810317154.1A Pending CN108536145A (en) | 2018-04-10 | 2018-04-10 | A kind of robot system intelligently followed using machine vision and operation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108536145A (en) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109869799A (en) * | 2019-01-02 | 2019-06-11 | 厦门攸信信息技术有限公司 | A kind of mobile robot and its heating method of heating |
CN109947105A (en) * | 2019-03-27 | 2019-06-28 | 科大智能机器人技术有限公司 | A kind of speed regulating method and speed regulation device of automatic tractor |
CN110103257A (en) * | 2019-05-10 | 2019-08-09 | 东方电子股份有限公司 | One kind is based on double line laser radar independent navigation robots |
CN110103237A (en) * | 2019-05-13 | 2019-08-09 | 湖北经济学院 | The follower type robot Fellow of view-based access control model target following |
CN110320523A (en) * | 2019-07-05 | 2019-10-11 | 齐鲁工业大学 | Follow the target locating set and method of robot |
CN110320912A (en) * | 2019-07-10 | 2019-10-11 | 苏州欧博智慧机器人有限公司 | The AGV positioning navigation device and method that laser is merged with vision SLAM |
CN110456791A (en) * | 2019-07-30 | 2019-11-15 | 中国地质大学(武汉) | A kind of leg type mobile robot object ranging and identifying system based on monocular vision |
CN110543177A (en) * | 2019-09-27 | 2019-12-06 | 珠海市一微半导体有限公司 | Robot for walking baby automatically and method for walking baby automatically |
CN111157008A (en) * | 2020-03-05 | 2020-05-15 | 齐鲁工业大学 | Local autonomous navigation system and method based on multidimensional environment information perception |
CN111823228A (en) * | 2020-06-08 | 2020-10-27 | 中国人民解放军战略支援部队航天工程大学 | Indoor following robot system and operation method |
CN112286184A (en) * | 2020-09-30 | 2021-01-29 | 广东唯仁医疗科技有限公司 | Outdoor surveying robot control method and system based on 5G network |
CN112650245A (en) * | 2020-12-22 | 2021-04-13 | 江苏艾雨文承养老机器人有限公司 | Following robot, following system and following method thereof |
CN112859854A (en) * | 2021-01-08 | 2021-05-28 | 姜勇 | Camera system and method of camera robot capable of automatically following camera shooting |
CN113093729A (en) * | 2021-03-10 | 2021-07-09 | 上海工程技术大学 | Intelligent shopping trolley based on vision and laser radar and control method |
CN113352313A (en) * | 2020-03-06 | 2021-09-07 | 思特威(上海)电子科技股份有限公司 | Multi-level sensor decision control system of robot |
CN113552868A (en) * | 2020-04-22 | 2021-10-26 | 西门子股份公司 | Navigation method and navigation device of fire-fighting robot |
CN114102601A (en) * | 2021-12-07 | 2022-03-01 | 深圳市坤易电子有限公司 | Matrix queuing control system of fighting robot |
WO2022173476A1 (en) * | 2021-02-12 | 2022-08-18 | Sony Group Corporation | Auto-calibrating n-configuration volumetric camera capture array |
CN115437299A (en) * | 2022-10-10 | 2022-12-06 | 北京凌天智能装备集团股份有限公司 | Accompanying transportation robot advancing control method and system |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100635826B1 (en) * | 2005-04-25 | 2006-10-19 | 엘지전자 주식회사 | System and method for estimating location of a robot |
CN104960652A (en) * | 2015-06-23 | 2015-10-07 | 山东科技大学 | Underwater operation robot and working method thereof |
US9372095B1 (en) * | 2014-05-08 | 2016-06-21 | Google Inc. | Mobile robots moving on a visual display |
CN105955251A (en) * | 2016-03-11 | 2016-09-21 | 北京克路德人工智能科技有限公司 | Vision following control method of robot and robot |
CN106094875A (en) * | 2016-06-27 | 2016-11-09 | 南京邮电大学 | A kind of target follow-up control method of mobile robot |
CN106444763A (en) * | 2016-10-20 | 2017-02-22 | 泉州市范特西智能科技有限公司 | Intelligent automatic following method based on visual sensor, system and suitcase |
CN106680832A (en) * | 2016-12-30 | 2017-05-17 | 深圳优地科技有限公司 | Obstacle detection method and device of mobile robot and mobile robot |
CN106774324A (en) * | 2016-12-22 | 2017-05-31 | 以恒激光科技(北京)有限公司 | A kind of three-dimensional identification patrol robot of dual camera |
CN106970627A (en) * | 2017-05-17 | 2017-07-21 | 深圳市元时科技有限公司 | A kind of intelligent system for tracking |
CN107422730A (en) * | 2017-06-09 | 2017-12-01 | 武汉市众向科技有限公司 | The AGV transportation systems of view-based access control model guiding and its driving control method |
CN107659918A (en) * | 2017-08-11 | 2018-02-02 | 东北电力大学 | A kind of method and system intelligently followed |
-
2018
- 2018-04-10 CN CN201810317154.1A patent/CN108536145A/en active Pending
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100635826B1 (en) * | 2005-04-25 | 2006-10-19 | 엘지전자 주식회사 | System and method for estimating location of a robot |
US9372095B1 (en) * | 2014-05-08 | 2016-06-21 | Google Inc. | Mobile robots moving on a visual display |
CN104960652A (en) * | 2015-06-23 | 2015-10-07 | 山东科技大学 | Underwater operation robot and working method thereof |
CN105955251A (en) * | 2016-03-11 | 2016-09-21 | 北京克路德人工智能科技有限公司 | Vision following control method of robot and robot |
CN106094875A (en) * | 2016-06-27 | 2016-11-09 | 南京邮电大学 | A kind of target follow-up control method of mobile robot |
CN106444763A (en) * | 2016-10-20 | 2017-02-22 | 泉州市范特西智能科技有限公司 | Intelligent automatic following method based on visual sensor, system and suitcase |
CN106774324A (en) * | 2016-12-22 | 2017-05-31 | 以恒激光科技(北京)有限公司 | A kind of three-dimensional identification patrol robot of dual camera |
CN106680832A (en) * | 2016-12-30 | 2017-05-17 | 深圳优地科技有限公司 | Obstacle detection method and device of mobile robot and mobile robot |
CN106970627A (en) * | 2017-05-17 | 2017-07-21 | 深圳市元时科技有限公司 | A kind of intelligent system for tracking |
CN107422730A (en) * | 2017-06-09 | 2017-12-01 | 武汉市众向科技有限公司 | The AGV transportation systems of view-based access control model guiding and its driving control method |
CN107659918A (en) * | 2017-08-11 | 2018-02-02 | 东北电力大学 | A kind of method and system intelligently followed |
Non-Patent Citations (1)
Title |
---|
林琳 (Lin Lin): "Research on Robot Binocular Vision Positioning Technology", China Master's Theses Full-text Database, Information Science and Technology *
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109869799A (en) * | 2019-01-02 | 2019-06-11 | 厦门攸信信息技术有限公司 | A kind of mobile robot and its heating method of heating |
CN109947105A (en) * | 2019-03-27 | 2019-06-28 | 科大智能机器人技术有限公司 | A kind of speed regulating method and speed regulation device of automatic tractor |
CN110103257A (en) * | 2019-05-10 | 2019-08-09 | 东方电子股份有限公司 | One kind is based on double line laser radar independent navigation robots |
CN110103237A (en) * | 2019-05-13 | 2019-08-09 | 湖北经济学院 | The follower type robot Fellow of view-based access control model target following |
CN110320523B (en) * | 2019-07-05 | 2020-12-11 | 齐鲁工业大学 | Target positioning device and method for following robot |
CN110320523A (en) * | 2019-07-05 | 2019-10-11 | 齐鲁工业大学 | Follow the target locating set and method of robot |
CN110320912A (en) * | 2019-07-10 | 2019-10-11 | 苏州欧博智慧机器人有限公司 | The AGV positioning navigation device and method that laser is merged with vision SLAM |
CN110456791A (en) * | 2019-07-30 | 2019-11-15 | 中国地质大学(武汉) | A kind of leg type mobile robot object ranging and identifying system based on monocular vision |
CN110543177A (en) * | 2019-09-27 | 2019-12-06 | 珠海市一微半导体有限公司 | Robot for walking baby automatically and method for walking baby automatically |
CN111157008A (en) * | 2020-03-05 | 2020-05-15 | 齐鲁工业大学 | Local autonomous navigation system and method based on multidimensional environment information perception |
CN111157008B (en) * | 2020-03-05 | 2022-06-21 | 齐鲁工业大学 | Local autonomous navigation system and method based on multidimensional environment information perception |
CN113352313A (en) * | 2020-03-06 | 2021-09-07 | 思特威(上海)电子科技股份有限公司 | Multi-level sensor decision control system of robot |
CN113552868A (en) * | 2020-04-22 | 2021-10-26 | 西门子股份公司 | Navigation method and navigation device of fire-fighting robot |
CN111823228A (en) * | 2020-06-08 | 2020-10-27 | 中国人民解放军战略支援部队航天工程大学 | Indoor following robot system and operation method |
CN112286184A (en) * | 2020-09-30 | 2021-01-29 | 广东唯仁医疗科技有限公司 | Outdoor surveying robot control method and system based on 5G network |
CN112286184B (en) * | 2020-09-30 | 2023-02-24 | 广东唯仁医疗科技有限公司 | Outdoor surveying robot control method based on 5G network |
CN112650245A (en) * | 2020-12-22 | 2021-04-13 | 江苏艾雨文承养老机器人有限公司 | Following robot, following system and following method thereof |
CN112859854A (en) * | 2021-01-08 | 2021-05-28 | 姜勇 | Camera system and method of camera robot capable of automatically following camera shooting |
WO2022173476A1 (en) * | 2021-02-12 | 2022-08-18 | Sony Group Corporation | Auto-calibrating n-configuration volumetric camera capture array |
CN113093729A (en) * | 2021-03-10 | 2021-07-09 | 上海工程技术大学 | Intelligent shopping trolley based on vision and laser radar and control method |
CN114102601A (en) * | 2021-12-07 | 2022-03-01 | 深圳市坤易电子有限公司 | Matrix queuing control system of fighting robot |
CN115437299A (en) * | 2022-10-10 | 2022-12-06 | 北京凌天智能装备集团股份有限公司 | Accompanying transportation robot advancing control method and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108536145A (en) | Robot system for intelligent following using machine vision and operation method thereof | |
CN110446159B (en) | System and method for accurate positioning and autonomous navigation of indoor unmanned aerial vehicle | |
CN106168805A (en) | The method of robot autonomous walking based on cloud computing | |
CN100468265C (en) | Combined type vision navigation method and device | |
CN109737981B (en) | Unmanned vehicle target searching device and method based on multiple sensors | |
CN105823478A (en) | Autonomous obstacle avoidance navigation information sharing and using method | |
CN109599945A (en) | A kind of autonomous crusing robot cruising inspection system of wisdom power plant and method | |
CN107966989A (en) | A kind of robot autonomous navigation system | |
WO2020199589A1 (en) | Recharging control method for desktop robot | |
CN105629970A (en) | Robot positioning obstacle-avoiding method based on supersonic wave | |
CN105044754A (en) | Mobile platform outdoor positioning method based on multi-sensor fusion | |
CN106384353A (en) | Target positioning method based on RGBD | |
CN108801269A (en) | A kind of interior cloud Algorithms of Robots Navigation System and method | |
CN106066179A (en) | A kind of robot location based on ROS operating system loses method for retrieving and control system | |
CN108844543A (en) | Indoor AGV navigation control method based on UWB positioning and dead reckoning | |
CN106441306A (en) | Intelligent life detecting robot with capabilities of independent positioning and map building | |
CN106898249A (en) | A kind of map structuring system and its construction method for earthquake-stricken area communication failure region | |
CN111077907A (en) | Autonomous positioning method of outdoor unmanned aerial vehicle | |
CN107063242A (en) | Have the positioning navigation device and robot of virtual wall function | |
CN107490377A (en) | Indoor map-free navigation system and navigation method | |
CN114923477A (en) | Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology | |
CN116352722A (en) | Multi-sensor fused mine inspection rescue robot and control method thereof | |
CN107647828A (en) | The sweeping robot of fish-eye camera is installed | |
CN111157008B (en) | Local autonomous navigation system and method based on multidimensional environment information perception | |
Liu et al. | A Review of Sensing Technologies for Indoor Autonomous Mobile Robots |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180914 |