CN103885449A - Intelligent visual tracking wheeled robot based on multiple sensors and control method thereof - Google Patents


Info

Publication number
CN103885449A
CN103885449A (application CN201410136228.3A; granted as CN103885449B)
Authority
CN
China
Prior art keywords
robot body
control terminal
control
adapter
wheel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410136228.3A
Other languages
Chinese (zh)
Other versions
CN103885449B (en
Inventor
曲海成
孟煜
刘万军
Current Assignee
Liaoning Technical University
Original Assignee
Liaoning Technical University
Priority date
Filing date
Publication date
Application filed by Liaoning Technical University
Priority to CN201410136228.3A
Publication of CN103885449A
Application granted
Publication of CN103885449B
Legal status: Active
Anticipated expiration

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an intelligent visual tracking wheeled robot based on multiple sensors, and a control method for the robot. The robot comprises a robot body and a robot control console. The robot body comprises a four-wheel-drive chassis, a camera, vehicle lights, a communication module, a control module, a sensor module and a power module. The sensor module comprises a temperature sensor, a smoke sensor, a light sensor and a distance-measuring sensor. The power module comprises a DC motor, a two-degree-of-freedom servo pan-tilt and a front-wheel steering servo. The communication module comprises an onboard Wi-Fi adapter and an onboard ZigBee adapter. The robot control console comprises a control terminal with a built-in Wi-Fi adapter and a control-terminal ZigBee adapter. The terminal's Wi-Fi adapter communicates wirelessly with the onboard Wi-Fi adapter, and the control-terminal ZigBee adapter with the onboard ZigBee adapter. The robot can dynamically track a moving target, achieving multi-sensor intelligent visual tracking, while the use of separate Wi-Fi and ZigBee channels decouples video data transmission from command communication.

Description

Multi-sensor-based intelligent visual tracking wheeled robot and control method thereof
Technical field
The present invention relates to the field of artificial-intelligence robot vision, and in particular to a multi-sensor-based intelligent visual tracking wheeled robot and a control method thereof.
Background technology
The robot's lower computer uses an Arduino Romeo V2 control board to convert the received instructions and to control the power system and the sensors. The upper computer uses the CamShift tracking algorithm to track a target selected in the vision window. CamShift uses a color-histogram model of the target to convert the image into a color probability distribution, initializes the size and position of a search window, and adaptively adjusts the window's position and size according to the result obtained for the previous frame, thereby locating the centre of the target in the current image. The algorithm has three parts:
1) Color probability map (back projection). The RGB color space is rather sensitive to changes in illumination brightness. To reduce the effect of such changes on tracking, the image is first transformed from RGB into HSV space, and a histogram is taken of the H component. The histogram records the probability, or pixel count, with which each H value occurs; in other words, for a given H value h it gives the probability or number of pixels of that hue. This yields a color probability lookup table. Replacing the value of every pixel in the image by the probability with which its color occurs then gives the color probability distribution. This process is the back projection, and the color probability distribution is a gray-level image.
2) The MeanShift algorithm is a nonparametric method of density-gradient estimation; it locates the target by iteratively searching for the extremum of the probability distribution.
3) Extending MeanShift to a consecutive image sequence gives the CamShift algorithm: MeanShift is run on every frame of the video, and the size and centre of the search window found for one frame serve as the initial values of the MeanShift search window for the next frame. Iterating in this way achieves tracking of the target. Although CamShift solves the problem of tracking the target within the image, within the sight of an intelligent tracking robot it cannot accurately judge the distance to the tracked target, so the tracking result in practice is poor.
Summary of the invention
In view of the problems of the prior art, the invention provides a multi-sensor-based intelligent visual tracking wheeled robot and a control method thereof.
The technical scheme of the invention is as follows:
A multi-sensor-based intelligent visual tracking wheeled robot comprises a robot body and a robot control console.
The robot body comprises a four-wheel-drive chassis, a camera, vehicle lights, a communication module, a control module, a sensor module and a power module. The vehicle lights, communication module, control module, sensor module and power module are all mounted on the four-wheel-drive chassis.
The sensor module comprises a temperature sensor, a smoke sensor, a light sensor and a distance-measuring sensor. The outputs of the temperature sensor, the smoke sensor and the light sensor are connected to different analog-signal interfaces of the control module; the output of the distance-measuring sensor is connected to a digital-signal interface of the control module.
The power module comprises a DC motor, a two-degree-of-freedom servo pan-tilt and a front-wheel steering servo. The camera is mounted on top of the servo pan-tilt; the signal inputs of the DC motor, the servo pan-tilt and the front-wheel steering servo are connected to digital-signal interfaces of the control module; and the front-wheel steering servo is connected to the front wheels of the four-wheel-drive chassis.
The communication module comprises an onboard Wi-Fi adapter and an onboard ZigBee adapter. Both are connected to the control module, and the onboard Wi-Fi adapter is also connected to the output of the camera.
The robot control console comprises a control terminal with a built-in Wi-Fi adapter, and a control-terminal ZigBee adapter. The control-terminal ZigBee adapter is connected to the control terminal; the terminal's Wi-Fi adapter establishes wireless communication with the onboard Wi-Fi adapter, and the control-terminal ZigBee adapter with the onboard ZigBee adapter.
A motor electronic speed controller is connected between the control module and the DC motor.
The control method of the intelligent visual tracking wheeled robot based on multi-sensor cooperative processing comprises the following steps.
Step 1: the video data captured by the camera is sent through the onboard Wi-Fi adapter to the control terminal with the built-in Wi-Fi adapter, and is displayed in real time on the terminal's display;
Step 2: the temperature sensor, smoke sensor, light sensor and distance sensor collect, respectively and in real time, temperature, gas concentration, ambient brightness and target distance information;
Step 3: the control-terminal ZigBee adapter and the onboard ZigBee adapter communicate point to point wirelessly, transmitting the collected temperature, gas concentration, ambient brightness and target distance information;
Step 4: a tracking target and a vision tracking area are selected on the control terminal; the CamShift algorithm extracts the target features in the current frame and records the target features, the position coordinates of the centre of the vision tracking area in the current frame, and the distance between the current tracking target and the robot body measured by the distance-measuring sensor;
Step 5: the two-degree-of-freedom servo pan-tilt is driven up or down to bring the tracking target to the centre of the vision tracking area;
Step 6: while the robot body is moving, the tracking target is tracked by distance and by vision: according to the distance tracking, the control terminal sends forward, backward or stop instructions to the robot body; according to the vision tracking, it sends turn-left, turn-right or straighten instructions;
While the robot body is moving, distance tracking of the robot body is carried out as follows:
1) record the distance D0 between the robot body and the tracking target at the moment the target is selected, and set a minimum movement distance S: only when the tracking target has moved at least S does the robot body follow;
2) record the distance Dn between the robot body and the tracking target at the current moment. If Dn > D0 + S, the control terminal sends a forward instruction to the robot body, and the control module receives the instruction and passes it to the DC motor, driving the body forward; if Dn < D0 - S, the control terminal sends a backward instruction, and the control module passes it to the DC motor, driving the body backward; otherwise the body stops;
While the robot body is moving, vision tracking of the robot body is carried out as follows:
1) match the target features of the previous frame against the current frame captured by the camera, determine the position of the tracking target in the current frame, and redefine the vision tracking area;
2) divide the vision tracking area into three regions: turn-left, straighten and turn-right; the control terminal sends control commands according to the region in which the tracking target lies and the motion state;
When the target lies in the turn-left region and the robot body is moving forward, the control terminal sends a turn-left command to the robot body through the control-terminal ZigBee adapter; the robot body's control module receives the command and passes it to the front-wheel steering servo, which steers the front wheels of the four-wheel-drive chassis to the left;
When the target lies in the turn-left region and the robot body is reversing, the control terminal sends a turn-right command through the control-terminal ZigBee adapter; the control module passes it to the front-wheel steering servo, which steers the front wheels to the right;
When the target lies in the turn-left region and the robot body is stopped, the control terminal sends a turn-left command through the control-terminal ZigBee adapter; the control module passes it to the front-wheel steering servo, which steers the front wheels to the left;
When the target lies in the straighten region and the robot body is moving forward, reversing or stopped, the control terminal sends a straighten command through the control-terminal ZigBee adapter; the control module passes it to the front-wheel steering servo, which straightens the front wheels;
When the target lies in the turn-right region and the robot body is moving forward, the control terminal sends a turn-right command through the control-terminal ZigBee adapter; the control module passes it to the front-wheel steering servo, which steers the front wheels to the right;
When the target lies in the turn-right region and the robot body is reversing, the control terminal sends a turn-left command through the control-terminal ZigBee adapter; the control module passes it to the front-wheel steering servo, which steers the front wheels to the left;
When the target lies in the turn-right region and the robot body is stopped, the control terminal sends a turn-right command through the control-terminal ZigBee adapter; the control module passes it to the front-wheel steering servo, which steers the front wheels to the right;
Step 7: check whether the console has issued an instruction to stop the vision tracking and distance tracking of the target; if so, end tracking; otherwise repeat steps 4 to 6.
Beneficial effects of the invention: the invention achieves dynamic tracking of a moving target and, by combining target distance with video information, realizes multi-sensor intelligent visual tracking, making tracking more sensitive and accurate. Using a Wi-Fi channel and a ZigBee channel together separates video data transmission from command communication; this exploits the high data rate of the Wi-Fi channel and the long transmission range of the ZigBee channel, while avoiding both the limitation that the Socket communication used over Wi-Fi can transmit in only one direction at a time and the drawback that ZigBee's low data rate cannot carry image data. It also solves the problem that a traditional wheeled robot can only pivot in place by differencing wheel speeds: the front steering servo system allows free turning while moving forward. The invention realizes demanding functions such as real-time video transmission, remote-controlled driving, dynamic target tracking, environmental monitoring and environmental photographing; it is an indispensable component for realizing intelligent tracking and driving in intelligent transportation, with broad application prospects.
Description of the drawings
Fig. 1 is a structural block diagram of the multi-sensor-based intelligent visual tracking wheeled robot of an embodiment of the invention;
Fig. 2 is the TTL and Micro USB port connection diagram of the embodiment;
Fig. 3 is the USB-to-TTL circuit schematic of the embodiment;
Fig. 4 is a schematic of the signal conversion by which the control module drives the DC motor in the embodiment;
Fig. 5 is the flowchart of the control method of the intelligent visual tracking wheeled robot based on multi-sensor cooperative processing in the embodiment;
Fig. 6 is the flowchart of distance tracking of the robot body while it is moving, in the embodiment;
Fig. 7 is a schematic of the division of the tracking area in the embodiment;
Fig. 8 is the connection diagram of the control module and its peripheral circuits in the embodiment;
Fig. 9 is a schematic of the multithreaded processing in the embodiment.
Embodiment
Specific embodiments of the invention are described in detail below with reference to the accompanying drawings.
The multi-sensor-based intelligent visual tracking wheeled robot comprises a robot body and a robot control console.
As shown in Fig. 1, the robot body comprises a four-wheel-drive chassis, a camera, vehicle lights, a communication module, a control module, a sensor module and a power module. The vehicle lights are LED lamps; the camera is a Tianmin S607; the control module is an Arduino Romeo V2 board whose chip is the ATmega32U4, connected to its peripheral circuits as shown in Fig. 8. The vehicle lights, communication module, control module, sensor module and power module are all mounted on the four-wheel-drive chassis.
The sensor module comprises a temperature sensor, a smoke sensor, a light sensor and a distance-measuring sensor. The temperature sensor is an LM35; the smoke sensor is an MQ2; the light sensor is a photoresistor; and the distance-measuring sensor is an Arduino URM37 V3.2. The outputs of the temperature sensor, the smoke sensor and the light sensor are connected to different analog-signal interfaces of the control module; the output of the distance-measuring sensor is connected to a digital-signal interface of the control module.
The power module comprises a DC motor, a two-degree-of-freedom servo pan-tilt and a front-wheel steering servo. The DC motor is an RC380 high-speed DC motor; the front-wheel steering servo is an MG995, as are the servos of the pan-tilt. The camera is mounted on top of the servo pan-tilt, and the front-wheel steering servo is connected to the front wheels of the four-wheel-drive chassis. A motor electronic speed controller is connected between the control module and the DC motor: its signal-receiving pin is connected to a digital output pin of the control module, which simulates a PWM pulse signal to control the current that the speed controller delivers to the DC motor, and thereby the motor's speed; the principle is shown in Fig. 4.
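The conversion from a speed command to the PWM signal fed to the electronic speed controller can be sketched as follows. This is an illustrative host-side model only: the 1000-2000 microsecond RC pulse range, the neutral value and the function name are assumptions, not taken from the patent.

```python
def speed_to_pulse_us(speed, neutral=1500, span=500):
    """Map a signed speed command (-100..100) to an ESC pulse width in microseconds.

    RC-style electronic speed controllers typically expect 1000-2000 us pulses
    with 1500 us as neutral; the exact range here is an assumption.
    The command is clamped so out-of-range requests saturate at full throttle.
    """
    speed = max(-100, min(100, speed))
    return int(neutral + span * speed / 100)
```

A forward instruction would then raise the duty cycle above neutral and a backward instruction lower it, matching the PWM scheme of Fig. 4.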
The two-degree-of-freedom servo pan-tilt gives the camera a wide angular field of view: it can be adjusted freely through 180 degrees left to right, and its pitch freedom reaches 135 degrees, so the camera's field of view fully covers the front 180-degree hemisphere. The front steering servo system is the steering power system of the intelligent tracking robot; under the control of the main board its maximum steering angle is 45 degrees to either side, so the robot can turn freely and nimbly while moving, like a real motor vehicle.
The communication module comprises an onboard Wi-Fi adapter and an onboard ZigBee adapter. The onboard Wi-Fi adapter is a modified TP-LINK TL-WR703N: a TTL line is connected to the Micro USB port, so that the TL-WR703N router transmits and receives TTL data through the Micro USB port, as shown in Fig. 2; the USB-to-TTL circuit principle is shown in Fig. 3. The onboard Wi-Fi adapter and the onboard ZigBee adapter are each connected to the control module, and the onboard Wi-Fi adapter is also connected to the output of the camera.
The robot control console comprises a control terminal with a built-in Wi-Fi adapter, and a control-terminal ZigBee adapter. The control-terminal ZigBee adapter is connected to the control terminal; the terminal's Wi-Fi adapter establishes wireless communication with the onboard Wi-Fi adapter, and the control-terminal ZigBee adapter with the onboard ZigBee adapter.
The control method of the intelligent visual tracking wheeled robot based on multi-sensor cooperative processing, shown in Fig. 5, comprises the following steps.
Step 1: the camera captures 640x480 video at 20 frames per second and sends it through the onboard Wi-Fi adapter to the control terminal with the built-in Wi-Fi adapter, where the video is displayed in real time;
Step 2: the temperature sensor, smoke sensor, light sensor and distance sensor collect, respectively and in real time, temperature, gas concentration, ambient brightness and target distance information;
Step 3: the control-terminal ZigBee adapter and the onboard ZigBee adapter communicate point to point wirelessly, transmitting the collected temperature, gas concentration, ambient brightness and target distance information;
The digital and analog sensors are wired to the digital and analog interfaces of the control module; the control module converts the digital or analog signals returned by the sensors into TTL data and sends them through the onboard ZigBee module to the control-terminal ZigBee module, and the control terminal parses the data to obtain the sensor readings.
Step 4: the control terminal judges in real time, from the collected gas-concentration information, whether combustible gas is present in the environment, and displays an alarm on its display when combustible gas is detected; from the collected real-time ambient brightness it decides whether to switch on the vehicle lights, which are switched on when the ambient brightness falls below the minimum set brightness value;
The combustible-gas concentration is handled as follows:
the detected gas-concentration signal is converted into a digital value (0-1023); when the value exceeds 100, an alarm is raised, indicating that toxic gas has been detected.
The photoresistor maps an illumination range of 1-6000 lux onto a digital value (0-1023); when the signal exceeds the threshold of 500, the lights are switched on.
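The two threshold rules above reduce to simple comparisons on the 10-bit ADC readings; a minimal sketch (function names are assumed, the thresholds 100 and 500 are those stated in the patent):

```python
ADC_MAX = 1023  # 10-bit analog-to-digital range, as in the patent

def gas_alarm(adc_value):
    """Raise the toxic-gas alarm when the smoke sensor's ADC reading exceeds 100."""
    return adc_value > 100

def lights_on(adc_value):
    """Switch the vehicle lights on when the light-sensor reading exceeds 500."""
    return adc_value > 500
```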
Step 5: a tracking target and a vision tracking area are selected on the control terminal: the CamShift algorithm extracts the target features in the current frame and records the target features, the position coordinates of the centre of the vision tracking area in the current frame, and the distance between the current tracking target and the robot body measured by the distance-measuring sensor;
Step 5.1: preprocess the current frame by back projection to obtain its color probability distribution;
Step 5.1.1: transform the current frame from RGB space into HSV space;
the RGB color space is rather sensitive to changes in illumination brightness, so to reduce the effect of such changes on tracking, the image must be transformed from RGB into HSV space;
Step 5.1.2: take a histogram of the H component of the HSV space; the histogram records the probability, or pixel count, with which each H value occurs, and serves as the color probability lookup table;
Step 5.1.3: replace the value of every pixel in the current frame by the probability with which its color occurs, obtaining the color probability distribution and completing the back projection; the color probability distribution is the target feature;
the color probability distribution is a gray-level image;
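Steps 5.1.2-5.1.3 can be sketched in a few lines of numpy. This illustrative version (function and parameter names are assumed) histograms the H component over the selected target and then replaces every pixel by the probability of its hue:

```python
import numpy as np

def back_project(h_image, target_mask, bins=180):
    """Back projection of steps 5.1.2-5.1.3.

    h_image: 2-D integer array of H values (0..bins-1);
    target_mask: boolean array marking the selected target pixels.
    Returns the color probability distribution, a gray-level image.
    """
    # Step 5.1.2: histogram of H over the target -> color probability lookup table
    hist = np.bincount(h_image[target_mask], minlength=bins).astype(float)
    hist /= hist.sum()
    # Step 5.1.3: replace each pixel by the probability of its hue
    return hist[h_image]
```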
Step 5.2: from the color probability distribution obtained by back projection, compute the probability that each pixel of the current frame belongs to the tracking target; then, taking these probabilities as weights, perform a weighted solution with the MeanShift algorithm to determine the vision tracking area;
Step 5.2.1: in the color probability distribution I(x, y), set the initial search point (x0, y0), the window radius h and the precision;
Step 5.2.2: for the search window centred on the search point, compute the first-order moments M10 and M01 and the zeroth-order moment M00 of the color probability distribution inside the window:

    M00 = Σx Σy I(x, y)        (1)
    M10 = Σx Σy x I(x, y)      (2)
    M01 = Σx Σy y I(x, y)      (3)

Step 5.2.3: from the first-order moments M10, M01 and the zeroth-order moment M00, compute the centroid (xc, yc) of the search window:

    xc = M10 / M00,  yc = M01 / M00    (4)

Step 5.2.4: adjust the search-window size: the window width is s = 2·sqrt(M00 / 256), and its length is 1.2 s;
Step 5.2.5: move the centre of the search window to the centroid; if the displacement between the window centre and the centroid exceeds the preset fixed threshold ε = 0.1, repeat steps 5.2.2-5.2.4 until the displacement falls below ε, or until the number of iterations reaches the maximum (10); the search window at that point is the vision tracking area;
Step 5.3: apply step 5.2 to every frame of the video, using the size and centre of the previous frame's search window as the initial values of the next frame's search window, thereby tracking the target;
extending the MeanShift algorithm to a consecutive image sequence in this way is precisely the CamShift implementation. CamShift effectively handles deformation and occlusion of the target, makes modest demands on system resources, has low time complexity, and achieves good tracking results against simple backgrounds.
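The moment computation of equations (1)-(4) and the iteration of step 5.2.5 can be sketched as below. This is a simplified numpy version with a fixed window radius (the adaptive resizing of step 5.2.4 is omitted); all names are assumed:

```python
import numpy as np

def meanshift_step(prob, x0, y0, hw, hh):
    """Eqs. (1)-(4): moments of the probability map inside the search window
    centred on (x0, y0), then the window centroid."""
    y1, y2 = max(0, y0 - hh), min(prob.shape[0], y0 + hh + 1)
    x1, x2 = max(0, x0 - hw), min(prob.shape[1], x0 + hw + 1)
    win = prob[y1:y2, x1:x2]
    ys, xs = np.mgrid[y1:y2, x1:x2]
    m00 = win.sum()               # zeroth-order moment, eq. (1)
    m10 = (xs * win).sum()        # first-order moments, eqs. (2)-(3)
    m01 = (ys * win).sum()
    return m10 / m00, m01 / m00   # centroid, eq. (4)

def meanshift(prob, x0, y0, hw=8, hh=8, eps=0.1, max_iter=10):
    """Step 5.2.5: move the window centre to the centroid until the shift
    falls below eps or max_iter iterations are reached."""
    for _ in range(max_iter):
        xc, yc = meanshift_step(prob, int(round(x0)), int(round(y0)), hw, hh)
        if np.hypot(xc - x0, yc - y0) < eps:
            break
        x0, y0 = xc, yc
    return x0, y0
```

Seeding each frame's search with the previous frame's result, per step 5.3, turns this into the CamShift loop.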
Step 6: the two-degree-of-freedom servo pan-tilt is driven up or down to bring the tracking target to the centre of the vision tracking area;
Table 1 Correspondence between control commands and transmitted/received signals

| Command    | Signal type | Signal content               | Receiving device            |
| Forward    | PWM         | duty cycle increased forward | electronic speed controller |
| Backward   | PWM         | duty cycle increased reverse | electronic speed controller |
| Turn left  | PWM         | duty cycle increased forward | steering servo              |
| Turn right | PWM         | duty cycle increased reverse | steering servo              |
| Look up    | PWM         | duty cycle increased forward | pan-tilt servo (pitch)      |
| Look down  | PWM         | duty cycle increased reverse | pan-tilt servo (pitch)      |
| Look left  | PWM         | duty cycle increased forward | pan-tilt servo (pan)        |
| Look right | PWM         | duty cycle increased reverse | pan-tilt servo (pan)        |
Step 7: while the robot body is moving, the tracking target is tracked by distance and by vision: according to the distance tracking, the control terminal sends forward, backward or stop instructions to the robot body; according to the vision tracking, it sends turn-left, turn-right or straighten instructions;
While the robot body is moving, distance tracking is carried out as shown in Fig. 6, specifically:
1) record the distance D0 between the robot body and the tracking target at the moment the target is selected, and set a minimum movement distance S: only when the tracking target has moved at least S does the robot body follow;
2) record the distance Dn between the robot body and the tracking target at the current moment. If Dn > D0 + S, the control terminal sends a forward instruction to the robot body, and the control module receives the instruction and passes it to the DC motor, driving the body forward; if Dn < D0 - S, the control terminal sends a backward instruction, and the control module passes it to the DC motor, driving the body backward; otherwise the body stops;
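The distance-tracking rule of Fig. 6 amounts to a three-way comparison; a minimal sketch (the function and command names are assumed, not from the patent):

```python
def distance_command(d0, dn, s):
    """Distance-tracking rule: D0 is the distance recorded when the target was
    selected, Dn the current distance, S the minimum movement distance.
    The robot follows only once the target has moved at least S."""
    if dn > d0 + s:
        return "forward"
    if dn < d0 - s:
        return "backward"
    return "stop"
```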
While the robot body is moving, vision tracking is carried out as follows:
1) match the target features of the previous frame against the current frame captured by the camera, determine the position of the tracking target in the current frame, and redefine the vision tracking area;
2) divide the vision tracking area into three regions: turn-left, straighten and turn-right, as shown in Fig. 7, where L is the turn-left region, M the straighten region and R the turn-right region; the control terminal sends control commands according to the region in which the tracking target lies and the motion state;
The centre of the vision tracking area in the current frame may fall in any of the regions L, M and R, and the robot has three possible motion states: F (forward), B (backward) and P (stopped). According to the position of the tracking target in the image and the robot's state of travel, the control terminal sends turn commands to the robot as in Table 2:

Table 2 Turn commands

| Region \ State        | F (forward) | B (backward) | P (stopped) |
| L (turn-left region)  | turn left   | turn right   | turn left   |
| M (straighten region) | straighten  | straighten   | straighten  |
| R (turn-right region) | turn right  | turn left    | turn right  |
When the tracking target lies in the left-turn region and the robot body is moving forward, the control terminal sends a left-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the left;
When the tracking target lies in the left-turn region and the robot body is reversing, the control terminal sends a right-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the right;
When the tracking target lies in the left-turn region and the robot body is stopped, the control terminal sends a left-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the left;
When the tracking target lies in the centering region and the robot body is moving forward, reversing, or stopped, the control terminal sends a centering command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which returns the front wheels of the four-wheel-drive chassis to the straight-ahead position;
When the tracking target lies in the right-turn region and the robot body is moving forward, the control terminal sends a right-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the right;
When the tracking target lies in the right-turn region and the robot body is reversing, the control terminal sends a left-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the left;
When the tracking target lies in the right-turn region and the robot body is stopped, the control terminal sends a right-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the right.
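The seven rules above reduce to a small decision table: the commanded wheel direction follows the target's region, except that it is mirrored while the body reverses. A minimal sketch in Python (the function name and string labels are illustrative, not from the patent):

```python
# Sketch of the steering decision in step 7 (hypothetical helper).
# region: "left", "center", or "right"; state: "forward", "reverse", or "stopped".
def steering_command(region: str, state: str) -> str:
    if region == "center":
        return "center"            # return front wheels to straight-ahead
    if state == "reverse":
        # mirror the turn while backing up, so the body still swings toward the target
        return "right" if region == "left" else "left"
    # forward or stopped: steer toward the target's side
    return "left" if region == "left" else "right"
```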
Step 8: the control console decides whether to issue a command that stops the visual tracking and distance tracking of the target; if so, tracking ends; otherwise, steps 5 to 7 are repeated.
The ambient-light sensor senses the ambient light level; when the robot performs intelligent tracking in dim light, it turns the car light on automatically.
If toxic gas is present in the environment, the robot control console displays a toxic-gas warning. The console can also photograph the video feed, and the photos are stored on the console's hard disk.
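Both the automatic headlight and the toxic-gas warning are simple threshold tests on sensor readings. A hedged sketch; the threshold names and values are assumptions, since the patent does not specify them:

```python
# Illustrative thresholds only; real values depend on the sensors used.
LIGHT_ON_THRESHOLD = 120      # arbitrary ADC units: below this, it is "dark"
GAS_ALARM_THRESHOLD = 400     # arbitrary smoke-sensor reading for "toxic gas"

def headlight_on(brightness: int) -> bool:
    """Turn the car light on automatically when ambient light is dim."""
    return brightness < LIGHT_ON_THRESHOLD

def gas_alarm(concentration: int) -> bool:
    """Raise a toxic-gas warning on the console display."""
    return concentration > GAS_ALARM_THRESHOLD
```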
As shown in Figure 4, thread 1 obtains image data over the Wi-Fi wireless link, and thread 2 obtains spatial distance information over the ZigBee wireless link.
The method processes the image sensor and the range sensor jointly, using computer multithreading to handle the multi-source sensor data in parallel: as shown in Figure 9, thread 1 obtains image data over the Wi-Fi wireless link while thread 2 obtains spatial distance information over the ZigBee wireless link, so that the target is tracked from multiple angles using multiple information sources.
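The two-thread acquisition scheme can be sketched with Python's standard threading module; the queues and sample data below merely stand in for the Wi-Fi video stream and the ZigBee link, whose actual interfaces are not specified in the patent:

```python
# Thread 1 pulls video frames over the Wi-Fi channel, thread 2 pulls range
# readings over the ZigBee channel; both feed queues for downstream processing.
import threading
import queue

video_q: "queue.Queue[bytes]" = queue.Queue()
range_q: "queue.Queue[float]" = queue.Queue()

def wifi_video_worker(frames):
    for f in frames:              # stand-in for reading the Wi-Fi video stream
        video_q.put(f)

def zigbee_range_worker(readings):
    for r in readings:            # stand-in for reading the ZigBee link
        range_q.put(r)

t1 = threading.Thread(target=wifi_video_worker, args=([b"frame0", b"frame1"],))
t2 = threading.Thread(target=zigbee_range_worker, args=([1.25, 1.30],))
t1.start(); t2.start()
t1.join(); t2.join()
```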
The invention realizes demanding functions such as real-time video transmission, remote-controlled driving, dynamic target tracking, environmental monitoring, and environment photography. It is an indispensable ingredient of intelligent tracking and intelligent driving in intelligent transportation, and its application prospects are broad, for example military target reconnaissance and tracking, surveillance of dangerous vehicles, and unmanned intelligent-tracking driving.
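The CamShift tracking used in step 5 of the method is built around a mean-shift iteration that repeatedly re-centers the tracking window on the centroid of the target-probability mass inside it. A toy pure-Python illustration of that core iteration (a real implementation would use OpenCV's cv2.CamShift on a hue back-projection; everything here is a simplified sketch):

```python
# Toy mean-shift iteration: re-center the window on the centroid of the
# target-probability mass it contains, until the window stops moving.
def mean_shift(prob, win, iters=10):
    """prob: 2D list of target probabilities; win: (row, col, height, width)."""
    r, c, h, w = win
    for _ in range(iters):
        m = mr = mc = 0.0
        for i in range(r, min(r + h, len(prob))):
            for j in range(c, min(c + w, len(prob[0]))):
                m += prob[i][j]
                mr += i * prob[i][j]
                mc += j * prob[i][j]
        if m == 0:
            break                          # no target mass inside the window
        nr = max(int(round(mr / m - h / 2)), 0)
        nc = max(int(round(mc / m - w / 2)), 0)
        if (nr, nc) == (r, c):
            break                          # converged
        r, c = nr, nc
    return r, c, h, w
```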

Claims (3)

1. An intelligent visual tracking wheeled robot based on multiple sensors, characterized in that it comprises a robot body and a robot control console;
The robot body comprises a four-wheel-drive chassis, a camera, a car light, a communication module, a control module, a sensor module, and a power module; the car light, communication module, control module, sensor module, and power module are all mounted on the four-wheel-drive chassis;
The sensor module comprises a temperature sensor, a smoke sensor, a light sensor, and a distance-measuring sensor; the output terminals of the temperature sensor, the smoke sensor, and the light sensor are connected to different analog signal interfaces of the control module; the output terminal of the distance-measuring sensor is connected to a digital signal interface of the control module;
The power module comprises a DC motor, a two-degree-of-freedom servo pan-tilt, and a front-wheel steering servo; the camera is mounted on top of the servo pan-tilt; the signal input terminals of the DC motor, the servo pan-tilt, and the front-wheel steering servo are connected to digital signal interfaces of the control module, and the front-wheel steering servo is connected to the front wheels of the four-wheel-drive chassis;
The communication module comprises a body Wi-Fi adapter and a body ZigBee adapter; the body Wi-Fi adapter and the body ZigBee adapter are each connected to the control module, and the body Wi-Fi adapter is also connected to the output terminal of the camera;
The robot control console comprises a control terminal with a built-in Wi-Fi adapter and a control terminal ZigBee adapter; the control terminal ZigBee adapter is connected to the control terminal; the Wi-Fi adapter of the control terminal establishes wireless communication with the body Wi-Fi adapter, and the control terminal ZigBee adapter establishes wireless communication with the body ZigBee adapter.
2. The intelligent visual tracking wheeled robot based on multiple sensors according to claim 1, characterized in that a motor electronic speed controller is connected between the control module and the DC motor.
3. A control method of the intelligent visual tracking wheeled robot based on multi-sensor cooperative processing according to claim 1, characterized in that it comprises the following steps:
Step 1: video data collected by the camera is sent through the body Wi-Fi adapter to the control terminal with the built-in Wi-Fi adapter, and the video data is displayed in real time on the display of the control terminal;
Step 2: the temperature sensor, smoke sensor, light sensor, and distance-measuring sensor collect temperature information, gas concentration information, ambient brightness information, and target distance information in real time, respectively;
Step 3: the control terminal ZigBee adapter and the body ZigBee adapter perform point-to-point wireless communication, transmitting the collected temperature, gas concentration, ambient brightness, and target distance information;
Step 4: the control terminal judges in real time, from the collected gas concentration information, whether flammable gas is present in the environment, and shows an alarm on its display when flammable gas is detected; it also decides from the collected ambient brightness information whether to turn on the car light: when the ambient brightness falls below the minimum set brightness value, the car light is turned on;
Step 5: the tracking target and a visual tracking area are selected through the control terminal; the CamShift algorithm is used to extract the target features in the current frame image, and the following are recorded: the target features, the position coordinates of the current tracking target within the visual tracking area of the current frame, and the distance between the tracking target and the robot body measured by the distance-measuring sensor;
Step 6: the two-degree-of-freedom servo pan-tilt is controlled to move up or down so as to adjust the tracking target to the center of the visual tracking area;
Step 7: while the robot body is in motion, the tracking target is tracked both by distance and by vision: based on distance tracking, the control terminal sends forward, backward, or stop commands to the robot body; based on visual tracking, the control terminal sends left-turn, right-turn, or centering commands;
During the motion of the robot body, distance tracking is performed as follows:
1) the distance D0 between the robot body and the tracking target at the moment the target is selected is recorded, and a minimum movement distance S is set, i.e. the robot body follows only after the tracking target has moved at least the distance S;
2) the distance Dn between the robot body and the tracking target at the current moment is recorded; if Dn > D0 + S, the control terminal sends a forward command to the robot body, and the control module receives the command and issues it to the DC motor to drive the body forward; if Dn < D0 − S, the control terminal sends a backward command to the robot body, and the control module receives the command and issues it to the DC motor to move the body backward; otherwise, the body stops moving;
During the motion of the robot body, visual tracking is performed as follows:
1) the target features of the previous frame are matched against the current frame collected by the camera to determine the position of the tracking target in the current frame, and the visual tracking area is redefined;
2) the visual tracking area is divided into three regions: left-turn, centering, and right-turn; the control terminal sends control commands according to the region in which the tracking target lies and the motion state of the robot body;
When the tracking target lies in the left-turn region and the robot body is moving forward, the control terminal sends a left-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the left;
When the tracking target lies in the left-turn region and the robot body is reversing, the control terminal sends a right-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the right;
When the tracking target lies in the left-turn region and the robot body is stopped, the control terminal sends a left-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the left;
When the tracking target lies in the centering region and the robot body is moving forward, reversing, or stopped, the control terminal sends a centering command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which returns the front wheels of the four-wheel-drive chassis to the straight-ahead position;
When the tracking target lies in the right-turn region and the robot body is moving forward, the control terminal sends a right-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the right;
When the tracking target lies in the right-turn region and the robot body is reversing, the control terminal sends a left-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the left;
When the tracking target lies in the right-turn region and the robot body is stopped, the control terminal sends a right-turn command to the robot body through the control terminal ZigBee adapter; the control module of the robot body receives the command and issues it to the front-wheel steering servo, which turns the front wheels of the four-wheel-drive chassis to the right;
Step 8: the control console decides whether to issue a command that stops the visual tracking and distance tracking of the target; if so, tracking ends; otherwise, steps 5 to 7 are repeated.
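The distance-tracking rule in claim 3 amounts to a dead-band controller around the initial distance D0: the body moves only once the measured distance Dn drifts more than S away from it. A minimal sketch (function and parameter names are illustrative, not from the patent):

```python
# Dead-band distance rule: advance when the target moves away by more than S,
# retreat when it comes closer by more than S, otherwise hold position.
def distance_command(d0: float, dn: float, s: float) -> str:
    if dn > d0 + s:
        return "forward"    # target has moved away: advance
    if dn < d0 - s:
        return "backward"   # target has come closer: retreat
    return "stop"           # within the dead band: hold position
```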
CN201410136228.3A 2014-04-04 2014-04-04 Intelligent vision based on multi-sensor cooperation process follows the tracks of the control method of wheeled robot Active CN103885449B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410136228.3A CN103885449B (en) 2014-04-04 2014-04-04 Intelligent vision based on multi-sensor cooperation process follows the tracks of the control method of wheeled robot

Publications (2)

Publication Number Publication Date
CN103885449A true CN103885449A (en) 2014-06-25
CN103885449B CN103885449B (en) 2016-03-23

Family

ID=50954397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410136228.3A Active CN103885449B (en) 2014-04-04 2014-04-04 Intelligent vision based on multi-sensor cooperation process follows the tracks of the control method of wheeled robot

Country Status (1)

Country Link
CN (1) CN103885449B (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104089629A (en) * 2014-06-30 2014-10-08 西北农林科技大学 Test platform of visual navigation electric vehicle
CN104270563A (en) * 2014-08-18 2015-01-07 吴建民 System and method for taking pictures/recording videos by using mobile phone/tablet personal computer under assistance of rotorcraft
CN104881884A (en) * 2015-06-29 2015-09-02 辽宁工程技术大学 Target tracking method based on visual quantum
CN105511474A (en) * 2016-01-18 2016-04-20 安徽工程大学 Independent type vehicle tracking and controlling system
CN105843229A (en) * 2016-05-17 2016-08-10 中外合资沃得重工(中国)有限公司 Unmanned intelligent vehicle and control method
CN106054894A (en) * 2016-07-05 2016-10-26 北京九星智元科技有限公司 Robot accompanying system, robot accompanying method and robot trolley
TWI568261B (en) * 2015-05-22 2017-01-21 北京橙鑫數據科技有限公司 Wisdom tracking camera
CN106383515A (en) * 2016-09-21 2017-02-08 哈尔滨理工大学 Wheel-type moving robot obstacle-avoiding control system based on multi-sensor information fusion
CN106647815A (en) * 2017-01-23 2017-05-10 昆山市工研院智能制造技术有限公司 Intelligent following robot based on multi-sensor information fusion and control method thereof
CN107065693A (en) * 2017-05-27 2017-08-18 安徽沪宁智能科技有限公司 A kind of remote control intelligent fire robot system based on ZigBee
CN107610097A (en) * 2017-08-16 2018-01-19 深圳市天益智网科技有限公司 Instrument localization method, device and terminal device
CN108381554A (en) * 2018-05-22 2018-08-10 中国矿业大学 Vision tracking mobile robot based on WIFI auxiliary positionings and control method
CN108931979A (en) * 2018-06-22 2018-12-04 中国矿业大学 Vision tracking mobile robot and control method based on ultrasonic wave auxiliary positioning
CN109240297A (en) * 2018-09-26 2019-01-18 深算科技(上海)有限公司 A kind of independent navigation robot that view-based access control model follows
CN109500817A (en) * 2018-12-07 2019-03-22 深圳市众智创新科技有限责任公司 The 360 degree of visual pursuit control systems and control method of multi-foot robot
CN110682303A (en) * 2019-10-15 2020-01-14 江苏艾什顿科技有限公司 Intelligent robot training device, control system and method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100008439A (en) * 2008-07-16 2010-01-26 주식회사 쏠리테크 Mobile image robot phone
CN101786272A (en) * 2010-01-05 2010-07-28 深圳先进技术研究院 Multisensory robot used for family intelligent monitoring service
CN201638053U (en) * 2009-12-31 2010-11-17 重庆工商职业学院 Polling robot
CN102253673A (en) * 2011-07-08 2011-11-23 上海合时智能科技有限公司 Household movable security robot based on target identification technique
CN202075626U (en) * 2011-04-14 2011-12-14 山东大学 Multi-sensor system of intelligent space and nurse robot
CN102340894A (en) * 2011-08-26 2012-02-01 东北大学 Wireless-sensor-network-based remote control rescue robot system and control method
CN203012510U (en) * 2013-01-07 2013-06-19 西北农林科技大学 Mountainous region agricultural robot obstacle-avoiding system based on multi-sensor information fusion



Also Published As

Publication number Publication date
CN103885449B (en) 2016-03-23

Similar Documents

Publication Publication Date Title
CN103885449B (en) Control method of an intelligent visual tracking wheeled robot based on multi-sensor cooperative processing
Li et al. Development of a following agricultural machinery automatic navigation system
US11042723B2 (en) Systems and methods for depth map sampling
CN103455822B (en) Crusing robot system in complex industrial environment and plurality of human faces Dynamic Tracking
CN107650908B (en) Unmanned vehicle environment sensing system
CN109270534A (en) A kind of intelligent vehicle laser sensor and camera online calibration method
CN103454919B (en) The control method of the kinetic control system of mobile robot in intelligent space
CN105629970A (en) Robot positioning obstacle-avoiding method based on supersonic wave
CN110082781A (en) Fire source localization method and system based on SLAM technology and image recognition
CN203870474U (en) Automatic navigation patrol robot for visual monitoring
CN108563236B (en) Target tracking method of nano unmanned aerial vehicle based on concentric circle characteristics
CN104049634A (en) Intelligent body fuzzy dynamic obstacle avoidance method based on Camshift algorithm
CN101976079A (en) Intelligent navigation control system and method
CN110837800A (en) Port severe weather-oriented target detection and identification method
CN109352654A (en) A kind of intelligent robot system for tracking and method based on ROS
CN105700528A (en) Autonomous navigation and obstacle avoidance system and method for robot
CN105334347A (en) Particle image velocimetry system and method based on unmanned plane
CA3139421A1 (en) Automatic annotation of object trajectories in multiple dimensions
Ismail et al. Vision-based system for line following mobile robot
CN104820435A (en) Quadrotor moving target tracking system based on smart phone and method thereof
Juang et al. Real-time indoor surveillance based on smartphone and mobile robot
CN113822251B (en) Ground reconnaissance robot gesture control system and control method based on binocular vision
CN105701789A (en) Airplane near-field flight trajectory tracking method
CN212781778U (en) Intelligent vehicle based on vision SLAM
CN104238558A (en) Tracking robot quarter turn detecting method and device based on single camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 123000 Zhonghua Road, Liaoning, Fuxin, China, No. 47

Applicant after: Liaoning Technical University

Address before: 110043 Zhonghua Road, Liaoning, Fuxin, China, No. 47

Applicant before: Liaoning Technical University

COR Change of bibliographic data
C14 Grant of patent or utility model
GR01 Patent grant