CN208034687U - Follow robot - Google Patents

Follow robot

Info

Publication number
CN208034687U
CN208034687U (application CN201820511018.1U)
Authority
CN
China
Prior art keywords
micromainframe
robot
module
information
mobile robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201820511018.1U
Other languages
Chinese (zh)
Inventor
张伟民
汤月娟
李明珠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Polytechnic Huahui Technology Co Ltd
Original Assignee
Beijing Polytechnic Huahui Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Polytechnic Huahui Technology Co Ltd filed Critical Beijing Polytechnic Huahui Technology Co Ltd
Priority to CN201820511018.1U
Application granted
Publication of CN208034687U
Legal status: Active

Landscapes

  • Manipulator (AREA)
  • Toys (AREA)

Abstract

This application discloses a following robot. It comprises a Kinect visual sensor for capturing depth images and is characterized by a micro-host mounted on a wheeled mobile robot. The data receiving end of the micro-host is connected to the data output end of the Kinect visual sensor, and the control command output end of the micro-host is connected to the control command receiving end of the wheeled mobile robot. By mounting the micro-host on the wheeled mobile robot and connecting its data receiving end to the data output end of the Kinect visual sensor, control commands produced by the micro-host's data processing are output to the wheeled mobile robot. This achieves the technical effect of precisely following a target, and thereby solves the problems that a robot cannot follow a specific target and that target acquisition is slow during following.

Description

Follow robot
Technical field
This application relates to the field of robotics, and in particular to a following robot.
Background technology
With the rapid and deep development of the robotics field, human-computer interaction has increasingly become a research hotspot. Intelligent human-body following is a major research topic in the mobile-robot field: the technology can better serve users, improve the intelligence of human-computer interaction, and respond effectively to commands issued by the user.
In the related art there are many following methods applied to robots, for example methods based on ultrasonic processing, infrared processing, and visual image processing. The inventors found that it is difficult to lock onto a specific target with ultrasonic processing, while infrared processing has many limitations, such as short detection range and difficulty in determining a specific target, so that human-body tracking is vulnerable to interference and fails. Visual image processing can mitigate these problems to some extent, but following based on Kinect skeleton recognition suffers from slow bone recognition and difficulty in re-acquiring a target lost midway, so the following performance is poor.
Therefore, a following robot is urgently needed, so as to solve the problems that a robot cannot follow a specific target during following and that target acquisition is slow.
Utility model content
The main purpose of this application is to provide a following robot, so as to solve the problems that a robot cannot follow a specific target during following and that target acquisition is slow.
To achieve the above goal, according to one aspect of this application, a following robot is provided.
The following robot according to this application comprises a Kinect visual sensor for capturing depth images and is characterized by a micro-host mounted on a wheeled mobile robot. The data receiving end of the micro-host is connected to the data output end of the Kinect visual sensor, and the control command output end of the micro-host is connected to the control command receiving end of the wheeled mobile robot.
Further, the data receiving end of the micro-host's image acquisition module is connected to the data output end of the Kinect visual sensor.
Further, the data output end of the micro-host's image acquisition module is connected to the data receiving end of an image processing module that performs image processing.
Further, the effective-information output end of the micro-host's image processing module is connected to the effective-information receiving end of a speech processing module that conducts voice dialogue according to the effective information.
Further, the effective-information output end of the micro-host's image processing module is connected to the effective-information receiving end of an information feedback module that derives control information from the effective information.
Further, the control command output end of the micro-host's information feedback module is connected to the control command receiving end of the wheeled mobile robot's information receiving module.
Further, the control-information output end of the wheeled mobile robot's information receiving module is connected to the control-information receiving end of an information processing module that converts the control information into control signals.
Further, the control-signal output end of the wheeled mobile robot's information processing module is connected to the control-signal receiving end of a control module that controls the drive motors.
Further, the wheeled mobile robot is additionally provided with a power module for supplying electric power.
Further, the wheeled mobile robot is additionally provided with a remote control module for receiving remote-control signals and controlling the drive motors.
In the embodiments of this application, the micro-host is mounted on the wheeled mobile robot and its data receiving end is connected to the data output end of the Kinect visual sensor, so that control commands produced by the micro-host's data processing are output to the wheeled mobile robot. This achieves the technical effect of precisely following a target, and thereby solves the problems that a robot cannot follow a specific target and that target acquisition is slow during following.
Description of the drawings
The accompanying drawings, which form a part of this application, are provided for a further understanding of this application, so that its other features, objects, and advantages become more apparent. The illustrative drawings of this application and their description serve to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a schematic diagram of the module structure of the following robot according to the utility model;
Fig. 2 is a schematic structural diagram of the following robot according to the utility model.
Specific implementation mode
In order that those skilled in the art may better understand the solution of this application, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in this application without creative work shall fall within the scope of protection of this application.
It should be noted that the terms "first", "second", etc. in the description, claims, and drawings of this application are used to distinguish similar objects, not to describe a specific order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments described herein can be implemented. In addition, the terms "comprising" and "having" and any variants thereof are intended to cover non-exclusive inclusion: a process, method, system, product, or device that contains a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units not explicitly listed or inherent to such process, method, product, or device.
In this application, the terms "upper", "lower", "left", "right", "front", "rear", "top", "bottom", "inner", "outer", "middle", "vertical", "horizontal", "transverse", "longitudinal", etc. indicate orientations or positional relationships based on those shown in the drawings. These terms serve primarily to better describe the utility model and its embodiments, and are not intended to require that the indicated device, element, or component have, or be constructed and operated in, a particular orientation.
Moreover, some of the above terms may be used to express meanings other than orientation or positional relationship; for example, the term "upper" may in some cases indicate a certain dependency or connection relationship. For a person of ordinary skill in the art, the specific meaning of these terms in the utility model can be understood according to the circumstances.
In addition, the terms "mounted", "arranged", "provided with", "connected", "linked", and "socketed" shall be understood broadly. For example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; it may be direct, indirect through an intermediary, or internal between two devices, elements, or components. For a person of ordinary skill in the art, the specific meaning of the above terms in the utility model can be understood according to the circumstances.
It should be noted that, in the absence of conflict, the embodiments of this application and the features therein may be combined with each other. This application is described in detail below with reference to the drawings and in conjunction with the embodiments.
As shown in Fig. 1 and Fig. 2, this application relates to a following robot. The following robot comprises a Kinect visual sensor 3 for capturing depth images and is characterized by a micro-host 1 mounted on a wheeled mobile robot 2. The data receiving end of the micro-host 1 is connected to the data output end of the Kinect visual sensor 3, and the control command output end of the micro-host 1 is connected to the control command receiving end of the wheeled mobile robot 2. This specifically discloses a structure and technical solution in which the micro-host 1 controls the robot's following operation.
Preferably, the following robot consists of three parts: the Kinect visual sensor 3, the micro-host 1, and the wheeled mobile robot 2.
Specifically, the Kinect visual sensor 3 is responsible for acquiring real-time images; upon the acquisition command issued by the image acquisition module of the micro-host 1, the image data are transferred to the micro-host 1.
Specifically, the micro-host 1 includes, but is not limited to, an image acquisition module, an image processing module, a speech processing module, and an information feedback module. Specifically, the data receiving end of the image acquisition module of the micro-host 1 is connected to the data output end of the Kinect visual sensor 3; the data output end of the image acquisition module is connected to the data receiving end of the image processing module that performs image processing; the effective-information output end of the image processing module is connected to the effective-information receiving end of the speech processing module that conducts voice dialogue according to the effective information; the effective-information output end of the image processing module is also connected to the effective-information receiving end of the information feedback module that derives control information from the effective information; and the control command output end of the information feedback module of the micro-host 1 is connected to the control command receiving end of the information receiving module of the wheeled mobile robot 2.
Preferably, the wheeled mobile robot 2 includes, but is not limited to, a power module 22, a motor drive module 23, and a remote control module 21. Specifically, the control-information output end of the information receiving module of the wheeled mobile robot 2 is connected to the control-information receiving end of the information processing module that converts the control information into control signals; and the control-signal output end of the information processing module of the wheeled mobile robot 2 is connected to the control-signal receiving end of the control module that controls the drive motors.
Preferably, command information is issued from the micro-host 1 to the wheeled mobile robot 2. When the wheeled mobile robot 2 receives valid command information, it controls the motor drive module 23 to drive the wheeled mobile robot 2 forward, backward, left, or right. The power module 22 is mainly responsible for supplying power to the robot, while the remote control module 21 controls the robot's motion by remote control, likewise enabling forward, backward, left-turn, and right-turn movement.
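The command path just described (the micro-host issues a symbolic command; the robot's motor drive module turns it into wheel motion) can be sketched in code as follows. This is a minimal illustration under assumptions, not the patent's implementation: the class names, the differential-drive wheel-speed mapping, and the 0.5 m/s speed value are all hypothetical.

```python
# Hypothetical sketch of the command dispatch described above: the micro-host
# issues a symbolic command, and the motor drive module (cf. module 23) maps it
# to (left, right) wheel speeds for a differential-drive base.

class MotorDriveModule:
    """Maps symbolic commands to (left, right) wheel speeds in m/s."""

    SPEED = 0.5  # assumed cruise speed, not from the patent

    def execute(self, command):
        mapping = {
            "forward":  ( self.SPEED,  self.SPEED),
            "backward": (-self.SPEED, -self.SPEED),
            "left":     (-self.SPEED,  self.SPEED),  # spin left in place
            "right":    ( self.SPEED, -self.SPEED),  # spin right in place
            "stop":     (0.0, 0.0),
        }
        if command not in mapping:
            raise ValueError(f"invalid command: {command}")
        return mapping[command]


class WheeledMobileRobot:
    """Receives valid command information and drives the motors."""

    def __init__(self):
        self.motor_drive = MotorDriveModule()

    def receive_command(self, command):
        return self.motor_drive.execute(command)


robot = WheeledMobileRobot()
print(robot.receive_command("forward"))  # (0.5, 0.5)
```

A real controller would feed these wheel speeds to motor drivers and reject malformed commands at the information receiving module, as the description's "valid command information" wording suggests.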
Specifically, the micro-host 1 analyzes the image data collected by the Kinect visual sensor 3, derives control commands, and then controls the following movement of the wheeled mobile robot 2. The specific following algorithm flow is as follows: first, a depth image is obtained from the Kinect, and the depth image of the target body is segmented out using a method that combines point features with gradient features. The gray value of the depth image, combined with the horizontal and vertical coordinates of the image, can represent, within a certain spatial range, the coordinates of an object in 3D space. From this, the depth-of-field distance of the target body, i.e., the distance of the person from the camera, can be calculated. The robot then decides, according to the distance between the person and the camera, whether to follow: if the follow condition is met, the information is fed back to the control system and the robot starts following the target; otherwise, the image information is re-acquired and the judgment is repeated.
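The decision loop above (compute the person's distance from the segmented depth image, follow only while a follow condition holds, otherwise re-acquire) might be sketched like this. It is a sketch under stated assumptions: the point/gradient-feature segmentation is replaced by a precomputed boolean mask, the distance is taken as the median depth over that mask, and the 0.8 m to 3.5 m follow window is an assumed threshold, not a value from the patent.

```python
import numpy as np

# Assumed follow window (not from the patent): follow while the person is
# between 0.8 m and 3.5 m from the camera. Kinect depth is in millimeters.
MIN_DIST_MM = 800
MAX_DIST_MM = 3500

def target_distance_mm(depth_image, target_mask):
    """Median depth over the segmented target pixels (robust to depth holes)."""
    depths = depth_image[target_mask & (depth_image > 0)]
    if depths.size == 0:
        return None  # target lost: caller should re-acquire the image
    return float(np.median(depths))

def follow_decision(depth_image, target_mask):
    """Return 'follow', 'stop', or 'reacquire' per the loop described above."""
    d = target_distance_mm(depth_image, target_mask)
    if d is None:
        return "reacquire"
    if MIN_DIST_MM <= d <= MAX_DIST_MM:
        return "follow"
    return "stop"

# Toy example: a 4x4 depth frame with the target at ~2 m in the centre.
depth = np.zeros((4, 4), dtype=np.uint16)
depth[1:3, 1:3] = 2000  # mm
mask = depth > 0
print(follow_decision(depth, mask))  # follow
```

Using the median rather than a single pixel makes the distance estimate tolerant of the zero-valued "holes" typical of Kinect depth frames; a real system would also re-segment the target on every frame before calling `follow_decision`.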
The operating principle of the device is as follows:
Specifically, the above following robot consists mainly of three parts: the Kinect visual sensor 3, the micro-host 1, and the wheeled mobile robot 2. The Kinect visual sensor 3 is responsible for acquiring real-time images; the image acquisition module of the micro-host 1 issues acquisition commands and transfers the images to the micro-host 1; the image processing module then processes the acquired images and extracts effective information. The effective information either drives the speech module to conduct a corresponding voice dialogue, or is transferred to the wheeled mobile robot 2 via the information feedback module. The wheeled mobile robot 2 receives the effective information through its information receiving module, further converts it with the information processing module, and accordingly moves, e.g., forward, backward, left, or right.
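The end-to-end flow above (acquire image, process it, extract effective information, feed it back, convert it into a control signal, drive the motors) can be expressed as a chain of small stages. This is a hypothetical sketch: the function names mirror the module names in the description, but the stub frame, the 3.5 m threshold, and the wheel-speed values are assumptions, and real versions of these stages would wrap Kinect and motor I/O.

```python
# Hypothetical end-to-end pipeline mirroring the modules in the description.
# Each stage is a plain function; I/O with the Kinect and motors is stubbed.

def image_acquisition():
    """Image acquisition module: request a frame from the Kinect (stubbed)."""
    return {"depth_mm": 2000}  # stand-in for a real depth frame

def image_processing(frame):
    """Image processing module: extract effective information from the frame."""
    return {"person_distance_mm": frame["depth_mm"]}

def information_feedback(effective_info, follow_below_mm=3500):
    """Information feedback module: derive control information (assumed rule)."""
    near_enough = effective_info["person_distance_mm"] <= follow_below_mm
    return "forward" if near_enough else "stop"

def information_processing(control_info):
    """Robot-side information processing: convert control info into a signal."""
    return {"forward": (0.5, 0.5), "stop": (0.0, 0.0)}[control_info]

# Micro-host side
frame = image_acquisition()
effective = image_processing(frame)
command = information_feedback(effective)
# Wheeled-robot side
wheel_speeds = information_processing(command)
print(command, wheel_speeds)  # forward (0.5, 0.5)
```

Keeping each module a separate stage with a one-way data flow matches the output-end/receiving-end wiring the claims describe, and makes any single stage (e.g., the segmentation) replaceable without touching the others.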
It can be seen from the above description that this application achieves the following technical effects:
In the embodiments of this application, the micro-host 1 is arranged on the wheeled mobile robot 2 and its data receiving end is connected to the data output end of the Kinect visual sensor 3, so that control commands produced by the data processing of the micro-host 1 are output to the wheeled mobile robot 2. This achieves the technical effect of precisely following the target, and thereby solves the problems that a robot cannot follow a specific target and that target acquisition is slow during following.
The foregoing are merely preferred embodiments of this application and are not intended to limit it; for those skilled in the art, this application may have various modifications and variations. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of this application shall be included within its scope of protection.

Claims (10)

  1. A following robot, comprising a Kinect visual sensor (3) for capturing depth images, characterized by: a micro-host (1) mounted on a wheeled mobile robot (2);
    The data receiving end of the micro-host (1) is connected to the data output end of the Kinect visual sensor (3), and the control command output end of the micro-host (1) is connected to the control command receiving end of the wheeled mobile robot (2).
  2. The following robot according to claim 1, characterized in that the data receiving end of the image acquisition module of the micro-host (1) is connected to the data output end of the Kinect visual sensor (3).
  3. The following robot according to claim 2, characterized in that the data output end of the image acquisition module of the micro-host (1) is connected to the data receiving end of an image processing module that performs image processing.
  4. The following robot according to claim 3, characterized in that the effective-information output end of the image processing module of the micro-host (1) is connected to the effective-information receiving end of a speech processing module for conducting voice dialogue according to the effective information.
  5. The following robot according to claim 4, characterized in that the effective-information output end of the image processing module of the micro-host (1) is connected to the effective-information receiving end of an information feedback module for deriving control information from the effective information.
  6. The following robot according to claim 5, characterized in that the control command output end of the information feedback module of the micro-host (1) is connected to the control command receiving end of the information receiving module of the wheeled mobile robot (2).
  7. The following robot according to claim 6, characterized in that the control-information output end of the information receiving module of the wheeled mobile robot (2) is connected to the control-information receiving end of an information processing module for converting the control information into control signals.
  8. The following robot according to claim 7, characterized in that the control-signal output end of the information processing module of the wheeled mobile robot (2) is connected to the control-signal receiving end of a control module for controlling the drive motors.
  9. The following robot according to claim 1, characterized in that the wheeled mobile robot (2) is additionally provided with a power module (22) for supplying electric power.
  10. The following robot according to claim 1, characterized in that the wheeled mobile robot (2) is additionally provided with a remote control module (21) for receiving remote-control signals and controlling the drive motors.
CN201820511018.1U 2018-04-11 2018-04-11 Follow robot Active CN208034687U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201820511018.1U CN208034687U (en) 2018-04-11 2018-04-11 Follow robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201820511018.1U CN208034687U (en) 2018-04-11 2018-04-11 Follow robot

Publications (1)

Publication Number Publication Date
CN208034687U 2018-11-02

Family

ID=63944883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201820511018.1U Active CN208034687U (en) 2018-04-11 2018-04-11 Follow robot

Country Status (1)

Country Link
CN (1) CN208034687U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108381552A (en) * 2018-04-11 2018-08-10 北京理工华汇智能科技有限公司 Follow robot


Similar Documents

Publication Publication Date Title
CN103955215B (en) Automatic obstacle avoidance trolley based on gesture identification and control device and control method
CN100493856C (en) Moving object capable of recognizing image and moving-object directing system equipped with the same
CN109008119A (en) Luggage case, smart machine and the system of automatically walk
CN111055281A (en) ROS-based autonomous mobile grabbing system and method
CN208930235U (en) A kind of movable self formula charging robot
CN109532522A (en) A kind of unmanned charging system of automobile based on 3D vision technique and its application method
CN108381552A (en) Follow robot
CN103171552A (en) AVM top view based parking support system
CN202512439U (en) Human-robot cooperation system with webcam and wearable sensor
CN105629969A (en) Restaurant service robot
CN106737714A (en) A kind of service robot
KR101795843B1 (en) Following Robot and Its Control method
CN105406556B (en) A kind of ancillary equipment charging system and method
WO2020096170A1 (en) Mobile robot usable as shopping cart
CN110900575A (en) Parallel intelligent robot with automatic guiding function and guiding method thereof
CN106774318A (en) Multiple agent interactive environment is perceived and path planning kinematic system
CN109473168A (en) A kind of medical image robot and its control, medical image recognition methods
CN208034687U (en) Follow robot
CN208000498U (en) Indoor crusing robot trolley
Petit et al. An integrated framework for humanoid embodiment with a BCI
CN106325306B (en) A kind of camera assembly apparatus of robot and its shooting and tracking
CN111966100A (en) Robot
CN218398132U (en) Indoor multifunctional operation robot of transformer substation
CN110920450A (en) Full-automatic charging system of electric automobile
CN107589747A (en) A kind of full drive intelligently guiding shifting apparatus

Legal Events

Date Code Title Description
GR01 Patent grant