CN109955254A - Mobile robot control system and teleoperation control method for the robot end pose - Google Patents
Mobile robot control system and teleoperation control method for the robot end pose
- Publication number: CN109955254A (application CN201910363155.4A)
- Authority: CN (China)
- Prior art keywords: pose, gesture, virtual, mechanical arm, robot
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
Abstract
The present disclosure proposes a mobile robot control system and a teleoperation control method for the robot end pose. For a mobile robot carrying a multi-degree-of-freedom mechanical arm, a mobile robot control system and a teleoperation control method based on wearable binocular vision are provided. The control pipeline is: operator gesture pose → virtual gesture model pose → virtual mechanical arm end pose → multi-degree-of-freedom mechanical arm end pose. By establishing a driving relationship between the operator's gesture pose and the end pose of the multi-degree-of-freedom mechanical arm, continuous control of the arm's end pose is realized. The head-mounted virtual display simultaneously shows the following process of the virtual mechanical arm end and the virtual gesture model, making the control process more intuitive. This solves the problems of the existing mobile reconnaissance robot's vehicle-mounted multi-degree-of-freedom reconnaissance system: the control mode is complicated, and the end pose of the system cannot be controlled intuitively.
Description
Technical field
This disclosure relates to the technical field of remote control of mobile robots, and in particular to a mobile robot control system and a teleoperation control method for the robot end pose.
Background art
The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.
A mobile reconnaissance robot usually consists of a mobile robot body and a vehicle-mounted reconnaissance system. It can carry out a variety of combat tasks such as battlefield close reconnaissance and surveillance, covert raids, targeted clearing, nuclear/biological/chemical handling, and explosive-ordnance disposal. A traditional vehicle-mounted reconnaissance system usually consists of a camera and a two-degree-of-freedom pan-tilt unit, and its control mode generally uses a joystick to supply pitch-angle and yaw-angle commands that steer the pan-tilt unit. For a mobile reconnaissance robot carrying a multi-degree-of-freedom reconnaissance system, the reconnaissance system usually consists of a multi-degree-of-freedom mechanical arm and a reconnaissance camera fixed at the end of the arm. The robot end pose refers to the position and attitude of the robot's end effector in a specified coordinate system. The end effector of a mobile reconnaissance robot is the camera, so the end pose of the reconnaissance robot is determined by the end pose of the multi-degree-of-freedom mechanical arm. The end pose of the arm is usually controlled with buttons, or a joystick combined with buttons; the operator must remember the correspondence between each button and each joint of the vehicle-mounted multi-degree-of-freedom mechanical arm, so this mode of operation is very complex and unintuitive.
In recent years, gesture control of the end pose of vehicle-mounted multi-degree-of-freedom reconnaissance systems has appeared. One common gesture control mode uses a worn data glove or inertial units. Its advantages are a high recognition rate and good stability; its disadvantages are that it can only control the position of the reconnaissance system end, not its attitude, and that the input devices are expensive and inconvenient to wear. Another gesture control mode is vision-based, and can be further divided into control based on image classification and control based on image processing. The former generally combines a visual sensor with pattern-recognition methods to classify the gesture type, and then uses the type information to command motions of the reconnaissance system end pose, such as moving up or down; its disadvantage is that it cannot realize fast and accurate continuous control of the end pose. The latter generally combines a visual sensor with image-processing methods to analyze the motion trajectory of the gesture, and then uses the position information of the trajectory to control the position of the reconnaissance system end; its disadvantage is that it cannot control the attitude of the reconnaissance system end.
Summary of the invention
To solve the above problems, the disclosure proposes a mobile robot control system and a teleoperation control method for the robot end pose. For a mobile robot carrying a multi-degree-of-freedom mechanical arm, it provides a control system and a teleoperation control method based on wearable binocular vision that realize continuous control of the position and attitude of the arm's end through free virtual wearing and unloading, thereby solving the problems that the control mode of the existing mobile reconnaissance robot's vehicle-mounted multi-degree-of-freedom reconnaissance system is complicated and that the end pose of the system cannot be controlled intuitively.
To achieve the above goals, the disclosure adopts the following technical solutions.
One or more embodiments provide a mobile robot control system, comprising a master-end wearable teleoperation control device and a slave-end robot that communicate wirelessly. The master-end wearable teleoperation control device is worn by the operator and is used to send control instructions and to receive the data acquired by the slave-end robot.
The master-end wearable teleoperation control device comprises a wearable binocular camera device, a head-mounted virtual display, a teleoperation controller, and a master-end wireless communication device. The teleoperation controller is connected to the wearable binocular camera device, the head-mounted virtual display, and the master-end wireless communication device. The wearable binocular camera device acquires images of the operator's gesture. The head-mounted virtual display shows the image captured by the slave-end robot as well as the virtual model of the slave-end robot's mechanical arm and the virtual model of the operator's gesture.
The wearable binocular camera device and the head-mounted virtual display are mounted on the operator's head, enabling the acquisition of dual-view images. The head-mounted virtual display can simultaneously show the virtual models and the acquired reconnaissance image, giving the operator an immersive feeling and enabling visual remote control of the slave-end robot. The wearable design frees the operator's hands and reduces the operator's burden.
One or more embodiments provide a teleoperation control method for the robot end pose based on the above mobile robot control system, characterized in that it includes the following steps:
Step 1: set a traction hand shape and an unloading hand shape.
Step 2: build the virtual mechanical arm and the virtual gesture model and show them at the front of the view frustum of the head-mounted virtual display.
Step 3: acquire the stereoscopic image from the binocular camera.
Step 4: run the gesture detection algorithm and judge whether the operator's gesture is present in the stereoscopic image; if so, go to the next step, otherwise go to step 3.
Step 5: run the hand-shape recognition algorithm on the gesture and judge whether the traction hand shape appears; if so, go to the next step, otherwise go to step 3.
Step 6: process the captured stereoscopic image to solve the pose P_H of the traction gesture in the coordinate system of the wearable binocular camera device, convert P_H into the pose P_V described in the screen coordinate system of the head-mounted virtual display, and use the converted pose P_V to drive the virtual gesture model in the view frustum of the head-mounted virtual display.
Step 7: judge whether the difference between the pose P_V of the virtual gesture model and the end pose P_M of the virtual mechanical arm N6 is less than a preset threshold; if so, go to the next step, otherwise go to step 3.
Step 8: make the pose of the multi-degree-of-freedom mechanical arm follow the changes of the operator's traction hand-shape pose.
Step 9: judge whether the unloading hand shape appears; if so, the pose of the multi-degree-of-freedom mechanical arm stops following the changes of the operator's traction hand-shape pose, and step 3 is executed; otherwise, step 8 is executed.
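Steps 3-9 above can be sketched as a small state machine. Everything below is an illustrative assumption rather than the patent's implementation: the Pose fields, the frame dictionary keys, and the threshold value are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical pose: 3-D position plus roll/pitch/yaw attitude.
@dataclass
class Pose:
    x: float; y: float; z: float
    roll: float; pitch: float; yaw: float

    def diff(self, other):
        # Scalar pose difference d = |P_V - P_M| over all six components.
        return sum(abs(a - b) for a, b in zip(
            (self.x, self.y, self.z, self.roll, self.pitch, self.yaw),
            (other.x, other.y, other.z, other.roll, other.pitch, other.yaw)))

def teleop_loop(frames, threshold=0.05):
    """One pass over steps 3-9 for a pre-recorded list of frames.

    Each frame is a dict with hypothetical keys:
      'hand_shape' - 'traction', 'unloading', or None
      'P_V'        - gesture pose mapped into the display screen frame
      'P_M'        - current virtual arm end pose
    Returns the list of arm end poses commanded while 'worn'.
    """
    worn = False            # becomes True once |P_V - P_M| < threshold (step 7)
    commanded = []
    for f in frames:
        shape = f['hand_shape']
        if shape == 'unloading':       # step 9: stop following, back to step 3
            worn = False
            continue
        if shape != 'traction':        # steps 4-5: no traction gesture found
            continue
        if not worn:
            # step 7: wear only when the virtual gesture meets the arm end
            worn = f['P_V'].diff(f['P_M']) < threshold
        if worn:
            commanded.append(f['P_V'])  # step 8: arm end follows the gesture
    return commanded

# Demo: one too-far traction frame, one wearable frame, then unloading.
p0 = Pose(0, 0, 0, 0, 0, 0)
near = Pose(0.01, 0, 0, 0, 0, 0)
frames = [
    {'hand_shape': 'traction', 'P_V': Pose(1, 0, 0, 0, 0, 0), 'P_M': p0},
    {'hand_shape': 'traction', 'P_V': near, 'P_M': p0},
    {'hand_shape': 'unloading', 'P_V': near, 'P_M': near},
]
commanded = teleop_loop(frames)
```

In this sketch only the second frame is commanded: the first traction frame fails the step-7 threshold test, and the unloading frame resets the worn state.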
With this teleoperation control method for the robot end pose, when the traction hand shape is detected, a driving relationship between the operator's gesture pose and the end pose of the multi-degree-of-freedom mechanical arm is established, realizing continuous control of the arm's end pose. When the unloading hand shape is detected, unloading is carried out so that the arm's pose stops following the changes of the operator's traction hand-shape pose. The head-mounted virtual display simultaneously shows the following and unloading processes of the virtual mechanical arm end and the virtual gesture model, making the control process more intuitive. Since dedicated gestures start and stop the control of the slave-end robot's multi-degree-of-freedom mechanical arm, the control method is simple and reliable.
Compared with the prior art, the disclosure has the following beneficial effects:
(1) The wearable binocular camera device and the head-mounted virtual display are mounted on the operator's head, enabling the acquisition of stereoscopic images. The head-mounted virtual display can simultaneously show the virtual models and the acquired reconnaissance image, giving the operator an immersive feeling and enabling visual remote control of the slave-end robot. The wearable design frees the operator's hands and reduces the operator's burden.
(2) The control pipeline of the teleoperation control method is: operator gesture pose → virtual gesture model pose → virtual mechanical arm end pose → multi-degree-of-freedom mechanical arm end pose. By establishing the driving relationship between the operator's gesture pose and the end pose of the multi-degree-of-freedom mechanical arm, continuous control of the arm's end pose is realized. The head-mounted virtual display simultaneously shows the following process of the virtual mechanical arm end and the virtual gesture model, making the control process more intuitive. Since dedicated gestures start and stop the control of the slave-end robot's multi-degree-of-freedom mechanical arm, the control method is simple and reliable.
Brief description of the drawings
The accompanying drawings, which form a part of this application, provide a further understanding of the application; the illustrative embodiments of the application and their description explain the application and do not limit it.
Fig. 1 is a schematic diagram of virtual wearing in embodiment 2 of the disclosure;
Fig. 2 is a schematic diagram of virtual unloading in embodiment 2 of the disclosure;
Fig. 3 is a flowchart of the control method of embodiment 2 of the disclosure;
wherein: N1, mobile robot body; N2, multi-degree-of-freedom mechanical arm; N3, reconnaissance camera; N4, video glasses; N5, binocular camera; N6, virtual mechanical arm.
Specific embodiments:
The disclosure is further described below with reference to the accompanying drawings and embodiments.
It is noted that the following detailed description is exemplary and is intended to provide further explanation of the application. Unless otherwise indicated, all technical and scientific terms used herein have the same meanings as commonly understood by a person of ordinary skill in the technical field to which this application belongs.
It should be noted that the terms used herein are merely for describing specific embodiments and are not intended to limit the exemplary embodiments of the application. As used herein, unless the context clearly indicates otherwise, the singular forms are also intended to include the plural forms; additionally, it should be understood that the terms "comprising" and/or "including" indicate the presence of features, steps, operations, devices, components, and/or combinations thereof. It should be noted that, where no conflict arises, the embodiments of the disclosure and the features in the embodiments may be combined with each other. The embodiments are described in detail below with reference to the accompanying drawings.
Robots can be divided into many classes according to their end effectors. An end effector is fixed at the end of a robot arm and executes the corresponding task; examples include dexterous hands, grippers, and video cameras. The end effector of a reconnaissance robot is the reconnaissance camera. This embodiment takes a reconnaissance robot as an example, but the continuous control method for the robot end pose of this disclosure is not limited to reconnaissance robots and is applicable to the control of all robots.
Embodiment 1
In the technical solution disclosed in one or more embodiments, as shown in Figs. 1 and 2, a mobile robot control system comprises a master-end wearable teleoperation control device and a slave-end robot that communicate wirelessly. The master-end wearable teleoperation control device is worn by the operator and is used to send control instructions and to receive the data acquired by the slave-end robot.
The master-end wearable teleoperation control device comprises a wearable binocular camera device, a head-mounted virtual display, a teleoperation controller, and a master-end wireless communication device. The teleoperation controller is connected to the wearable binocular camera device, the head-mounted virtual display, and the master-end wireless communication device. The wearable binocular camera device acquires images of the operator's gesture. The head-mounted virtual display shows the image captured by the slave-end robot as well as the virtual model of the slave-end robot's mechanical arm and the virtual model of the operator's gesture. Using a binocular camera device makes the acquisition of stereoscopic images possible.
The teleoperation controller may be a wearable computer. The wearable computer can acquire, in real time, the stereoscopic images of the gesture captured by the wearable binocular camera device, calculate the pose information of the operator's gesture from the stereoscopic images, and, according to the gesture pose information, display a virtual gesture model in real time at the front of the perspective view frustum of the video glasses.
The wearable binocular camera device may be a binocular camera N5, which acquires stereoscopic images of the operator's gesture. Within the field of view of the binocular camera N5, the operator uses the gesture pose to control the end pose of the vehicle-mounted multi-degree-of-freedom reconnaissance system.
The head-mounted virtual display may be video glasses N4, used to show the reconnaissance image captured by the reconnaissance camera N3 of the slave-end robot, as well as the virtual model of the multi-degree-of-freedom mechanical arm N2 and the virtual model of the operator's gesture. The reconnaissance image may be located at the rear of the perspective view frustum of the video glasses, while the virtual model of the multi-degree-of-freedom mechanical arm N2 and the virtual model of the operator's gesture are located at the front of the frustum. This embodiment uses a perspective view frustum for display, but other view volumes may be used. A perspective view frustum is the view volume of a perspective projection; it resembles a pyramid with its apex and base cut off, i.e., a frustum, and its characteristic is that near objects appear large and far objects appear small.
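The near-big/far-small property of perspective projection can be illustrated with a minimal pinhole-projection sketch; the focal length f is an assumed parameter, not a value from this disclosure.

```python
def project(point, f=1.0):
    """Perspective projection of a camera-frame point (x, y, z), z > 0,
    onto an image plane at distance f: screen coords (f*x/z, f*y/z)."""
    x, y, z = point
    return (f * x / z, f * y / z)

# The same 1-unit-tall object appears half as tall when twice as far away.
near = project((0.0, 1.0, 2.0))   # -> (0.0, 0.5)
far = project((0.0, 1.0, 4.0))    # -> (0.0, 0.25)
```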
The slave-end robot comprises a mobile robot body N1, a multi-degree-of-freedom mechanical arm N2, a reconnaissance camera N3, a wireless communication device, and a vehicle-mounted controller. The vehicle-mounted controller is connected to the mobile robot body N1, the multi-degree-of-freedom mechanical arm N2, the reconnaissance camera N3, and the slave-end wireless communication device. The reconnaissance camera N3 is mounted at the end of the multi-degree-of-freedom mechanical arm N2 to acquire reconnaissance data. The mobile robot body N1 further comprises a body drive motor group and a motor driver, the motor driver being connected to the vehicle-mounted controller and to the drive motor group. Through the vehicle-mounted controller, the mobile robot body N1 receives control from the master-end wearable teleoperation control device and carries out motion in position: the vehicle-mounted controller sends control commands to the motor driver, and the motor driver drives the corresponding motors of the drive motor group, realizing motion of the slave-end robot's position.
The vehicle-mounted multi-degree-of-freedom mechanical arm N2 receives control from the master-end wearable teleoperation control device and executes the corresponding motions. The vehicle-mounted multi-degree-of-freedom mechanical arm N2 comprises a linkage mechanism, a mechanical-arm driver, and a mechanical-arm drive motor group. The vehicle-mounted controller sends control commands to the mechanical-arm driver, and the mechanical-arm driver drives the corresponding motors of the mechanical-arm drive motor group, realizing motion of the linkage mechanism in angle and position and thereby changing the joint-angle information of each joint of the multi-degree-of-freedom mechanical arm N2.
The virtual model of the slave-end robot's mechanical arm is the virtual model of the multi-degree-of-freedom mechanical arm N2. The virtual model of the multi-degree-of-freedom mechanical arm N2 may be the virtual mechanical arm N6 drawn according to the D-H parameters of the multi-degree-of-freedom mechanical arm N2.
Within the field of view of the binocular camera N5, the operator uses the gesture pose to control the end pose of the vehicle-mounted multi-degree-of-freedom reconnaissance system.
Embodiment 2
This embodiment provides a teleoperation control method for the robot end pose based on the mobile robot control system of embodiment 1, as shown in Figs. 1-3, in particular a teleoperation control method for the end pose of the multi-degree-of-freedom mechanical arm. It can realize continuous control of the position and attitude of the arm's end through the motion of a gesture, and includes the following steps:
Step 1: set a traction hand shape and an unloading hand shape.
The traction hand shape means that, when this hand shape of the operator is detected, the pose of the virtual gesture model is kept coincident with the end pose of the virtual mechanical arm in the video glasses; the operator can then drive the position and attitude (i.e., the pose) of the virtual gesture model in the video glasses N4 with the gesture pose, and the virtual gesture model in turn performs real-time continuous control of the end pose of the virtual mechanical arm N6.
When the gesture becomes the unloading gesture, the virtual gesture model no longer follows the operator's gesture, and the operator's gesture can no longer perform real-time continuous control of the virtual mechanical arm N6.
The traction hand shape and the unloading hand shape can be any hand shapes and can be set as needed. In this embodiment, the traction hand shape may be a hand shape that represents a Cartesian coordinate system: the ring finger and little finger are bent, while the thumb, index finger, and middle finger are straight, the three straight fingers being mutually orthogonal and forming a Cartesian coordinate system. The unloading hand shape may be a one-handed fist.
Before step 1, the method may further include steps of initialization and establishing the wireless connection:
initialize the teleoperation controller and the slave-end robot;
establish wireless communication between the teleoperation controller and the slave-end robot N1.
Step 2: build the virtual mechanical arm and the virtual gesture model and show them at the front of the view frustum of the head-mounted virtual display.
In step 2, the method of building the virtual mechanical arm and showing it at the front of the view frustum of the head-mounted virtual display is specifically:
21) read the joint-angle information of each joint of the multi-degree-of-freedom mechanical arm of the slave-end robot;
The motion of the multi-degree-of-freedom mechanical arm is controlled by the vehicle-mounted controller: the mechanical-arm driver drives the corresponding motors of the mechanical-arm drive motor group, realizing motion of the linkage mechanism in angle and position and thereby changing the joint-angle information of each joint of the multi-degree-of-freedom mechanical arm N2. The joint-angle information of each joint of the multi-degree-of-freedom mechanical arm can be read directly through the vehicle-mounted controller.
22) the teleoperation controller calculates the D-H parameters of the multi-degree-of-freedom mechanical arm from the acquired joint-angle information;
23) build the virtual mechanical arm according to the D-H parameters of the multi-degree-of-freedom mechanical arm, and show the virtual mechanical arm at the front of the view frustum of the head-mounted virtual display.
The angle of each joint of the virtual mechanical arm N6 is controlled by the received joint-angle information. The base coordinate system of the virtual mechanical arm N6 is described in the screen coordinate system of the video glasses N4; the end coordinate system of the virtual mechanical arm N6 is denoted (O_M-X_M-Y_M-Z_M), and the end pose of the virtual mechanical arm N6 is denoted P_M, including position information and attitude information.
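Steps 21)-23) can be sketched with standard Denavit-Hartenberg forward kinematics. The two-link parameter table below is purely illustrative and is not the actual D-H parameter set of the arm in this disclosure.

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard D-H homogeneous transform from link frame i-1 to link frame i."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def end_pose(joint_angles, dh_table):
    """Chain the per-joint transforms; the result is the 4x4 end pose P_M
    (rotation block = attitude information, last column = position)."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = mat_mul(T, dh_matrix(theta, d, a, alpha))
    return T

# Illustrative two-link planar arm: d = 0, alpha = 0, link lengths 1.0 and 0.5.
table = [(0.0, 1.0, 0.0), (0.0, 0.5, 0.0)]
T = end_pose([math.pi / 2, -math.pi / 2], table)
```

With joint angles +90° and -90°, the first link points straight up and the second folds back to horizontal, so the end lands at (0.5, 1.0) with identity orientation.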
The construction method of the virtual gesture model may specifically be:
(1) build a three-dimensional virtual gesture model of the traction hand shape offline using 3D modeling software;
(2) load and render the three-dimensional virtual gesture model in real time at the front of the view frustum of the head-mounted virtual display, its position and attitude in the frustum being driven by the position and attitude of the operator's traction hand shape.
To make the operator's manipulation purposeful and accurate, the video glasses N4 may also show the reconnaissance environment around the slave-end device; specifically, the reconnaissance image acquired by the reconnaissance camera N3 may be shown in the view frustum of the video glasses N4. The method may therefore further include a step of showing the image captured by the slave-end robot on the head-mounted virtual display, as follows: acquire the reconnaissance image at the slave-end robot; the teleoperation controller receives the reconnaissance image and shows it in real time at the rear of the view frustum of the head-mounted virtual display.
Step 3: acquire the stereoscopic image from the binocular camera N5. The operator's hand-shape information is acquired by the binocular camera N5. The stereoscopic image comprises images from the left and right viewpoints.
Step 4: run the gesture detection algorithm and judge whether the operator's gesture is present in the stereoscopic image; if so, go to the next step, otherwise go to step 3. As soon as the operator's gesture appears in the stereoscopic image, step 5 is executed.
The gesture detection algorithm may specifically be a gesture detection algorithm based on a skin-color threshold.
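As an illustration of such a detector, here is a minimal pure-Python sketch of skin-color thresholding in YCrCb space. The Cr/Cb bounds are commonly cited illustrative values, not values taken from this disclosure, and the helper names are hypothetical.

```python
def rgb_to_ycrcb(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCrCb conversion."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128
    cb = (b - y) * 0.564 + 128
    return y, cr, cb

def skin_mask(image, cr_range=(133, 173), cb_range=(77, 127)):
    """Per-pixel skin-color threshold on an RGB image (a list of rows of
    (r, g, b) tuples). Returns a binary mask of skin-colored pixels."""
    mask = []
    for row in image:
        mask.append([
            1 if cr_range[0] <= rgb_to_ycrcb(*px)[1] <= cr_range[1]
                 and cb_range[0] <= rgb_to_ycrcb(*px)[2] <= cb_range[1]
            else 0
            for px in row])
    return mask

def gesture_present(mask, min_pixels=1):
    """A gesture is assumed present when enough pixels fall in the skin band."""
    return sum(map(sum, mask)) >= min_pixels

# A skin-like pixel next to a pure-blue background pixel.
img = [[(200, 140, 110), (0, 0, 255)]]
m = skin_mask(img)
```

In practice the threshold bounds would be tuned to the camera and lighting; this sketch only shows the shape of the computation.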
Step 5: run the hand-shape recognition algorithm on the gesture and judge whether the traction hand shape appears; if so, go to the next step, otherwise go to step 3. The hand-shape recognition algorithm is specifically a hand-shape recognition algorithm based on deep learning.
When the traction hand shape is detected in both views of the dual-view image, the operator's hand shape can begin to pull the multi-degree-of-freedom mechanical arm N2. If the traction hand shape does not appear, step 3 is executed again and the binocular camera N5 re-acquires the operator's hand-shape information.
Step 6: process the captured stereoscopic image to solve the pose P_H of the traction gesture in the coordinate system of the wearable binocular camera device, convert P_H into the pose P_V described in the screen coordinate system of the head-mounted virtual display, and use the converted pose P_V to drive the virtual gesture model in the view frustum of the head-mounted virtual display.
The pose P_H of the traction gesture in the coordinate system of the wearable binocular camera device may be solved with the DeepPrior++ algorithm, which can estimate the gesture pose under stereoscopic vision.
The pose P_H of the traction gesture in the coordinate system of the wearable binocular camera device may also be solved with the following steps:
(1) the traction gesture pose P_H includes position information and attitude information; the solution of the position information uses the gesture detection results in the left and right views and the parallax principle directly;
(2) the attitude information of the traction gesture pose P_H is solved using a regression-learning-based method, which may specifically be:
(2.1) first acquire dual-view gesture images and the corresponding attitude data set. A hand-held three-axis attitude sensor may be rotated about its three axes in front of the dual-view camera, and for each output of the attitude sensor the corresponding dual-view gesture detection result images are acquired. The two gesture images and the one frame of attitude data obtained at the same moment serve as the input sample and the output sample, respectively; the collected dual-view gesture images and the corresponding attitude data form the input training set and the output sample set.
(2.2) fit the mapping relationship between the dual-view gesture images and the attitude data with a regression learning method.
(2.3) through the above two steps, the attitude information of the traction gesture can then be solved directly from the dual-view gesture images.
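Step (1) above, recovering the position from the left/right detections and the parallax principle, can be sketched as rectified-stereo triangulation. The focal length and baseline below are assumed calibration values, not parameters from this disclosure.

```python
def triangulate(u_left, u_right, v, f=700.0, baseline=0.06):
    """Position of the palm center from a rectified stereo pair.

    u_left / u_right: horizontal pixel coordinates of the palm-center
    detection in the left and right views (already centered on the optical
    axis); v: shared vertical coordinate. The disparity d = u_left - u_right
    gives depth Z = f * B / d, then X = Z * u_left / f and Y = Z * v / f.
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("palm must be in front of the cameras")
    z = f * baseline / disparity
    return (z * u_left / f, z * v / f, z)

# Palm detected at u = 70 px (left) and u = 0 px (right), v = 0:
# disparity 70 px gives depth 700 * 0.06 / 70 = 0.6 m.
pos = triangulate(70.0, 0.0, 0.0)
```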
In step 6, the correspondence between the operator's traction gesture and the virtual gesture model may first be established, and P_H converted into P_V through this correspondence. The correspondence may specifically be proportional: the position information of the traction gesture pose P_H in the coordinate system of the wearable binocular camera device is directly proportional to the position information of P_V, and the attitude information of P_H is likewise directly proportional to the attitude information of P_V.
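The proportional correspondence from P_H to P_V can be sketched as component-wise scaling; the pose representation and the gain values k_pos and k_att are illustrative assumptions.

```python
def camera_to_screen(p_h, k_pos=2.0, k_att=1.0):
    """Map the gesture pose P_H = (x, y, z, roll, pitch, yaw), described in
    the binocular camera frame, to the screen-frame pose P_V by direct
    proportionality: position components scaled by k_pos, attitude by k_att."""
    x, y, z, roll, pitch, yaw = p_h
    return (k_pos * x, k_pos * y, k_pos * z,
            k_att * roll, k_att * pitch, k_att * yaw)

p_v = camera_to_screen((0.1, -0.2, 0.5, 0.0, 0.3, 0.0))
```

A gain above 1 lets a small hand motion sweep the whole frustum; a gain below 1 gives fine positioning. The scale would be chosen so the reachable gesture workspace maps onto the visible screen volume.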
The pose P_H of the traction gesture is described in the coordinate system of the binocular camera N5. The palm center of the traction gesture may be specified as the origin, the coordinate frame at the palm center being (O_H-X_H-Y_H-Z_H): the direction of the index finger of the traction gesture is the X-axis direction, the direction of the thumb is the Y-axis direction, and the direction of the middle finger is the Z-axis direction. The position information of P_H is described by the offset of the palm-center origin O_H relative to the origin of the binocular camera N5 coordinate system; the attitude information of P_H is described by the rotation of the X_H, Y_H, and Z_H axes of the gesture frame relative to the axes of the binocular camera N5 coordinate system.
The pose P_V of the virtual traction gesture is described in the screen coordinate system of the video glasses N4. The palm center of the virtual traction gesture may be specified as the origin, the coordinate frame at the palm center being denoted (O_V-X_V-Y_V-Z_V): the direction of the index finger of the virtual traction gesture is the X-axis direction, the direction of the thumb is the Y-axis direction, and the direction of the middle finger is the Z-axis direction. The position information of P_V is described by the offset of the palm-center origin O_V relative to the origin of the screen coordinate system of the video glasses N4; the attitude information of P_V is described by the rotation of the X_V, Y_V, and Z_V axes relative to the axes of the screen coordinate system of the video glasses N4.
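With the three orthogonal fingers of the traction hand shape defining the axes of the palm frame, the attitude part of the gesture pose can be sketched as a rotation matrix whose columns are the unit finger directions. The helper names are hypothetical.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def gesture_rotation(x_finger, y_finger, z_finger):
    """Rotation of the palm frame relative to the camera frame: columns are
    the unit X, Y, Z axes given by the three finger directions, assumed
    mutually orthogonal as required by the traction hand shape."""
    x, y, z = normalize(x_finger), normalize(y_finger), normalize(z_finger)
    return [[x[0], y[0], z[0]],
            [x[1], y[1], z[1]],
            [x[2], y[2], z[2]]]

# Fingers aligned with the camera axes give the identity rotation.
R = gesture_rotation((2.0, 0.0, 0.0), (0.0, 5.0, 0.0), (0.0, 0.0, 1.0))
```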
Next, the converted pose P_V drives the virtual gesture model in the view frustum of the head-mounted virtual display, and the virtual gesture model begins to follow the operator's gesture.
The driving method is specifically: after the virtual gesture model is loaded into the head-mounted virtual display, the position information required for real-time rendering in the view frustum is directly assigned from the position information of P_V, and the attitude information required for real-time rendering is directly assigned from the attitude information of P_V.
Since the pose of the virtual gesture model is directly assigned in real time from the position and attitude information of P_V, the pose of the virtual gesture model in the view frustum is identical to P_V; it can therefore be understood that the pose of the virtual gesture model is driven by P_V.
Step 7: judge whether the difference between the pose P_V of the virtual gesture model and the end pose P_M of the virtual mechanical arm N6 is less than a preset threshold; if so, proceed to the next step; otherwise, execute step 3.
Step 7 realizes the "wearing" process, in which the pose P_V of the virtual gesture model rapidly approaches the end pose P_M of the virtual mechanical arm N6. As the operator moves the traction gesture in step 6, the virtual gesture model follows the movement until the pose P_V of the virtual gesture model is close to the end pose P_M of the virtual mechanical arm N6.
The concrete implementation of step 7 is as follows: the operator observes, in the perspective view volume of the video glasses N4, the relative relationship between the pose P_V of the virtual traction gesture and the end pose P_M of the virtual mechanical arm N6, and continuously moves the traction gesture pose P_H so that the difference between the pose P_V of the virtual traction gesture in the perspective view volume of the video glasses N4 and the end pose P_M of the virtual mechanical arm N6 keeps decreasing. The difference between the two poses is described by the following formula:
d = |P_V − P_M|
When the difference d between the pose P_V of the virtual traction gesture and the end pose P_M of the virtual mechanical arm N6 is less than the preset threshold, the end of the virtual mechanical arm N6 is considered to coincide with the virtual traction gesture; figuratively, the end of the virtual mechanical arm N6 can be regarded as virtually "worn" on the virtual traction gesture. During this process, the teleoperation controller repeatedly executes steps 3-7. Once the wearing process is complete, the multi-degree-of-freedom mechanical arm N2 can be dragged by the gesture.
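A minimal sketch of the step-7 "wearing" test above, assuming poses are six-component tuples and using a Euclidean norm for d = |P_V − P_M|; the patent specifies neither the metric nor the threshold value, so both are assumptions here.

```python
import math

def pose_difference(p_v, p_m):
    """d = |P_V - P_M|: a single scalar combining position and attitude error.
    A Euclidean norm over all six pose components is assumed; the patent
    does not fix the metric."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p_v, p_m)))

def is_worn(p_v, p_m, threshold=0.05):
    """Step 7: the virtual arm end is considered 'worn' on the traction
    gesture once the pose difference drops below the preset threshold."""
    return pose_difference(p_v, p_m) < threshold
```

The teleoperation controller would evaluate this test on every pass through steps 3-7 until it succeeds, after which the arm begins to follow the gesture.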
Step 8: make the pose of the multi-degree-of-freedom mechanical arm follow the changes of the operator's traction hand-type pose.
Step 8 specifically comprises:
Setting the value of the end pose P_M of the virtual mechanical arm equal to the pose P_V of the virtual gesture model, and solving for the corresponding joint angle values of the virtual mechanical arm N6. Specifically, when the end pose P_M of the virtual mechanical arm equals the pose P_V of the virtual gesture model, the corresponding joint angle values of the virtual mechanical arm N6 are solved in real time by an inverse kinematics algorithm.
Converting the solved joint angle values of the virtual mechanical arm into control instructions and transmitting them to the slave robot, so that the joint angle of each joint of the multi-degree-of-freedom mechanical arm equals the corresponding joint angle value of the virtual mechanical arm.
This may specifically be: the teleoperation controller converts each joint angle of the virtual mechanical arm N6 into control instructions and sends them over the wireless communication channel to the slave robot N1; after the vehicle controller of the slave robot N1 reads the received control instructions, it converts them into motor drive instructions and, through the mechanical-arm driver, starts each joint motor of the drive motor group of the multi-degree-of-freedom mechanical arm N2, so that the joint angle of each joint of the multi-degree-of-freedom mechanical arm N2 becomes identical to the corresponding joint angle of the virtual mechanical arm N6. In this way, the pose of the multi-degree-of-freedom mechanical arm N2 follows the changes of the operator's gesture pose.
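The step-8 chain described above (gesture pose → inverse kinematics → joint commands to the slave robot) can be sketched as follows. `solve_ik` and `send_to_robot` are hypothetical callbacks standing in for the inverse kinematics solver and the wireless channel, neither of which the patent specifies concretely.

```python
def follow_gesture(p_v, solve_ik, send_to_robot):
    """Step 8 sketch: set the virtual arm end pose P_M equal to the gesture
    pose P_V, solve inverse kinematics for the joint angles, and forward
    them to the slave robot, where the vehicle controller converts them
    into motor drive instructions."""
    p_m = p_v                      # drive the virtual arm end to the gesture pose
    joint_angles = solve_ik(p_m)   # joint angle values of virtual arm N6
    send_to_robot(joint_angles)    # control instructions over the wireless channel
    return joint_angles
```

In a real system `solve_ik` would be the real-time inverse kinematics routine for the arm's D-H parameters, and `send_to_robot` would serialize the angles into the control-instruction format of the vehicle controller.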
To let the operator adjust the position and attitude of the virtual mechanical arm more intuitively, the pose changes of the virtual mechanical arm N6 can be displayed in real time in the video glasses N4.
Step 8 further comprises: redrawing the virtual mechanical arm N6 in the view volume according to the solved joint angle values of the virtual mechanical arm N6. According to the joint angle values of the virtual mechanical arm N6 obtained by the robot inverse kinematics algorithm, the virtual mechanical arm N6 is redrawn in the perspective view volume of the video glasses N4, so that the end pose of the virtual mechanical arm N6 always remains identical to the end pose of the virtual traction gesture.
Step 9: judge whether the unloading hand-type appears; if so, the pose of the multi-degree-of-freedom mechanical arm stops following the changes of the operator's traction hand-type pose, and step 3 is executed; otherwise, step 8 is executed. During the dragging process it is judged in real time whether the operator's gesture becomes the unloading gesture. In this embodiment the unloading gesture may be set as a left-hand fist; if the operator's gesture becomes the unloading gesture, the end pose of the virtual mechanical arm N6 is no longer controlled by the operator's gesture, and figuratively the end of the virtual mechanical arm N6 can be regarded as virtually "unloaded" from the operator's gesture. The unloading gesture can be any gesture, either one-handed or two-handed. At this point the dragging process ends; the process may terminate, or other commands may be executed.
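The overall step 3-9 decision logic can be summarized as a small state function. The hand-type labels and return values below are illustrative names only, not terminology from the patent.

```python
def teleop_step(hand, worn, following):
    """One iteration of the step 3-9 flow. `hand` is the recognized hand type
    ('traction', 'unload', or None), `worn` is the step-7 threshold test
    result, and `following` records whether the arm currently tracks the
    gesture. Returns the new following state and the action to perform."""
    if following:
        if hand == "unload":
            return False, "stop"          # step 9: stop following, back to step 3
        return True, "follow"             # step 8: keep driving the arm
    if hand == "traction" and worn:
        return True, "follow"             # step 7 passed: begin following
    return False, "track_gesture"         # steps 3-6: keep moving the virtual gesture
```

Running this function inside a camera-frame loop reproduces the branching of steps 3-9: the arm only starts following after the wearing test succeeds, and releases as soon as the unloading gesture is seen.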
Embodiment 3
This embodiment further provides an electronic device comprising a memory, a processor, and computer instructions stored in the memory and run on the processor; when the computer instructions are run by the processor, the steps of the method in the above embodiment are completed.
Embodiment 4
This embodiment further provides a computer-readable storage medium for storing computer instructions; when the computer instructions are executed by a processor, the steps of the method in the above embodiment are completed.
The foregoing are merely preferred embodiments of the present application and are not intended to limit it; for those skilled in the art, various modifications and changes are possible within this application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of this application shall be included within the scope of protection of this application.
Although the specific embodiments of the present disclosure have been described above in conjunction with the accompanying drawings, they do not limit the protection scope of the disclosure. Those skilled in the art should understand that, on the basis of the technical solution of the disclosure, the various modifications or changes that can be made without creative labor still fall within the protection scope of the disclosure.
Claims (10)
1. A mobile robot control system, characterized by comprising a master-side wearable teleoperation control device and a slave robot, the master-side wearable teleoperation control device and the slave robot communicating wirelessly; the master-side wearable teleoperation control device is worn on the operator and is used to send control instructions and receive data acquired by the slave robot;
the master-side wearable teleoperation control device comprises a wearable binocular camera device, a head-mounted virtual display, a teleoperation controller, and master-side wireless communication equipment; the teleoperation controller is connected respectively with the wearable binocular camera device, the head-mounted virtual display, and the master-side wireless communication equipment; the wearable binocular camera device is used to acquire images of the operator's gesture; the head-mounted virtual display is used to display the images shot by the slave robot and to display the virtual model of the mechanical arm of the slave robot and the virtual model of the operator's gesture.
2. The mobile robot control system as claimed in claim 1, characterized in that the slave robot comprises a mobile robot body, a multi-degree-of-freedom mechanical arm, a reconnaissance camera, wireless communication equipment, and a vehicle controller; the vehicle controller is connected respectively with the mobile robot body N1, the multi-degree-of-freedom mechanical arm N2, the reconnaissance camera N3, and the wireless communication equipment; the mobile robot body receives the control of the master-side wearable teleoperation control device to move in position, and the vehicle-mounted multi-degree-of-freedom mechanical arm receives the control of the master-side wearable teleoperation control device to execute corresponding actions; the virtual model of the mechanical arm of the slave robot is the virtual model of the multi-degree-of-freedom mechanical arm.
3. The mobile robot control system as claimed in claim 1, characterized in that the mobile robot body further comprises a vehicle-body drive motor group and a motor driver, the motor driver being connected respectively with the vehicle controller and the drive motor group.
4. A teleoperation control method of robot end pose based on the mobile robot control system of any one of claims 1-3, characterized by comprising the following steps:
Step 1: setting a traction hand-type and an unloading hand-type;
Step 2: constructing a virtual mechanical arm and a virtual gesture model and displaying them at the front end of the view volume of the head-mounted virtual display;
Step 3: acquiring the stereoscopic image of the binocular camera;
Step 4: detecting with a gesture detection algorithm whether the operator's gesture is present in the stereoscopic image; if so, proceeding to the next step; otherwise, executing step 3;
Step 5: performing hand-type recognition on the gesture with a hand-type recognition algorithm and judging whether the traction hand-type appears; if so, proceeding to the next step; otherwise, executing step 3;
Step 6: processing the shot stereoscopic image, solving the pose P_H of the traction gesture in the coordinate system of the wearable binocular camera device, converting the pose P_H into the pose description P_V in the screen coordinate system of the head-mounted virtual display, and driving the virtual gesture model in the view volume of the head-mounted virtual display with the converted pose P_V;
Step 7: judging whether the difference between the pose P_V of the virtual gesture model and the end pose P_M of the virtual mechanical arm N6 is less than a preset threshold; if so, proceeding to the next step; otherwise, executing step 3;
Step 8: making the pose of the multi-degree-of-freedom mechanical arm follow the changes of the operator's traction hand-type pose;
Step 9: judging whether the unloading hand-type appears; if so, the pose of the multi-degree-of-freedom mechanical arm stops following the changes of the operator's traction hand-type pose, and step 3 is executed; otherwise, step 8 is executed.
5. The teleoperation control method as claimed in claim 4, characterized in that step 8 of making the pose of the multi-degree-of-freedom mechanical arm follow the changes of the operator's traction hand-type pose specifically comprises:
setting the value of the end pose P_M of the virtual mechanical arm equal to the pose P_V of the virtual gesture model, and solving for the corresponding joint angle values of the virtual mechanical arm;
converting the solved joint angle values of the virtual mechanical arm into control instructions and transmitting them to the slave robot, so that the joint angle of each joint of the multi-degree-of-freedom mechanical arm equals the corresponding joint angle value of the virtual mechanical arm;
or/and
step 8 further comprising: redrawing the virtual mechanical arm N6 in the view volume according to the solved joint angle values of the virtual mechanical arm N6.
6. The teleoperation control method as claimed in claim 4, characterized in that the location information of the pose P_H of the traction gesture in the coordinate system of the wearable binocular camera device is directly proportional to the location information of the pose P_V, and the attitude information of the pose P_H is also directly proportional to the attitude information of the pose P_V.
7. The teleoperation control method as claimed in claim 4, characterized in that the method of step 2 for constructing the virtual mechanical arm and displaying it at the front end of the view volume of the head-mounted virtual display specifically comprises:
reading the joint angle information of each joint of the multi-degree-of-freedom mechanical arm of the slave robot;
the teleoperation controller calculating the D-H parameters of the multi-degree-of-freedom mechanical arm according to the acquired joint angle information;
constructing the virtual mechanical arm according to the D-H parameters of the multi-degree-of-freedom mechanical arm, and displaying the virtual mechanical arm at the front end of the view volume of the head-mounted virtual display.
8. The teleoperation control method as claimed in claim 4, characterized by further comprising, before step 3, the step of displaying on the head-mounted virtual display the images shot by the slave robot:
acquiring the reconnaissance image from the slave robot end;
the teleoperation controller receiving the reconnaissance image in real time and displaying it at the rear end of the view volume of the head-mounted virtual display.
9. An electronic device, characterized by comprising a memory, a processor, and computer instructions stored in the memory and run on the processor; when the computer instructions are run by the processor, the steps of the method of any one of claims 4-8 are completed.
10. A computer-readable storage medium, characterized by being used for storing computer instructions; when the computer instructions are executed by a processor, the steps of the method of any one of claims 4-8 are completed.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910363155.4A CN109955254B (en) | 2019-04-30 | 2019-04-30 | Mobile robot control system and teleoperation control method for robot end pose |
PCT/CN2020/087846 WO2020221311A1 (en) | 2019-04-30 | 2020-04-29 | Wearable device-based mobile robot control system and control method |
KR1020207030337A KR102379245B1 (en) | 2019-04-30 | 2020-04-29 | Wearable device-based mobile robot control system and control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910363155.4A CN109955254B (en) | 2019-04-30 | 2019-04-30 | Mobile robot control system and teleoperation control method for robot end pose |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109955254A true CN109955254A (en) | 2019-07-02 |
CN109955254B CN109955254B (en) | 2020-10-09 |
Family
ID=67026942
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910363155.4A Active CN109955254B (en) | 2019-04-30 | 2019-04-30 | Mobile robot control system and teleoperation control method for robot end pose |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109955254B (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110394803A (en) * | 2019-08-14 | 2019-11-01 | 纳博特南京科技有限公司 | A kind of robot control system |
CN110413113A (en) * | 2019-07-18 | 2019-11-05 | 华勤通讯技术有限公司 | A kind of on-vehicle machines people and exchange method |
CN110815258A (en) * | 2019-10-30 | 2020-02-21 | 华南理工大学 | Robot teleoperation system and method based on electromagnetic force feedback and augmented reality |
CN111476909A (en) * | 2020-03-04 | 2020-07-31 | 哈尔滨工业大学 | Teleoperation control method and teleoperation control system for compensating time delay based on virtual reality |
WO2020221311A1 (en) * | 2019-04-30 | 2020-11-05 | 齐鲁工业大学 | Wearable device-based mobile robot control system and control method |
CN112405530A (en) * | 2020-11-06 | 2021-02-26 | 齐鲁工业大学 | Robot vision tracking control system and control method based on wearable vision |
CN112650120A (en) * | 2020-12-22 | 2021-04-13 | 华中科技大学同济医学院附属协和医院 | Robot remote control system, method and storage medium |
CN113146612A (en) * | 2021-01-05 | 2021-07-23 | 上海大学 | Virtual-real combination and man-machine interaction underwater remote control robot manipulator operation system and method |
CN113822251A (en) * | 2021-11-23 | 2021-12-21 | 齐鲁工业大学 | Ground reconnaissance robot gesture control system and control method based on binocular vision |
WO2022002155A1 (en) * | 2020-07-01 | 2022-01-06 | 北京术锐技术有限公司 | Master-slave motion control method, robot system, device, and storage medium |
CN114683288A (en) * | 2022-05-07 | 2022-07-01 | 法奥意威(苏州)机器人系统有限公司 | Robot display and control method and device and electronic equipment |
CN114713421A (en) * | 2022-05-05 | 2022-07-08 | 罗海华 | Control method and system for remote control spraying |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104057450A (en) * | 2014-06-20 | 2014-09-24 | 哈尔滨工业大学深圳研究生院 | Teleoperation method of high-dimensional motion arm aiming at service robot |
CN106444861A (en) * | 2016-11-21 | 2017-02-22 | 清华大学深圳研究生院 | Space robot teleoperation system based on three-dimensional gestures |
EP3214835A1 (en) * | 2016-02-16 | 2017-09-06 | Ricoh Company, Ltd. | Information terminal, recording medium, communication control method, and communication system |
US20170258549A1 (en) * | 2016-03-11 | 2017-09-14 | Sony Olympus Medical Solutions Inc. | Medical observation device |
CN108044625A (en) * | 2017-12-18 | 2018-05-18 | 中南大学 | A kind of robot arm control method based on the virtual gesture fusions of more Leapmotion |
WO2018152504A1 (en) * | 2017-02-20 | 2018-08-23 | Irobot Defense Holdings, Inc. | Robotic gripper camera |
CN108453742A (en) * | 2018-04-24 | 2018-08-28 | 南京理工大学 | Robot man-machine interactive system based on Kinect and method |
CN108638069A (en) * | 2018-05-18 | 2018-10-12 | 南昌大学 | A kind of mechanical arm tail end precise motion control method |
CN108828996A (en) * | 2018-05-31 | 2018-11-16 | 四川文理学院 | A kind of the mechanical arm remote control system and method for view-based access control model information |
CN109219856A (en) * | 2016-03-24 | 2019-01-15 | 宝利根 T·R 有限公司 | For the mankind and robot cooperated system and method |
CN109514521A (en) * | 2018-12-18 | 2019-03-26 | 合肥工业大学 | The servo operation and its method of manpower collaboration Dextrous Hand based on multi-information fusion |
CN109571403A (en) * | 2018-12-12 | 2019-04-05 | 杭州申昊科技股份有限公司 | A kind of track trace navigation intelligent inspection robot and its air navigation aid |
CN208713510U (en) * | 2018-08-01 | 2019-04-09 | 珠海市有兴精工机械有限公司 | CNC processing elasticity crawl gripper |
- 2019-04-30: CN CN201910363155.4A patent/CN109955254B/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104057450A (en) * | 2014-06-20 | 2014-09-24 | 哈尔滨工业大学深圳研究生院 | Teleoperation method of high-dimensional motion arm aiming at service robot |
EP3214835A1 (en) * | 2016-02-16 | 2017-09-06 | Ricoh Company, Ltd. | Information terminal, recording medium, communication control method, and communication system |
US20170258549A1 (en) * | 2016-03-11 | 2017-09-14 | Sony Olympus Medical Solutions Inc. | Medical observation device |
CN109219856A (en) * | 2016-03-24 | 2019-01-15 | 宝利根 T·R 有限公司 | For the mankind and robot cooperated system and method |
CN106444861A (en) * | 2016-11-21 | 2017-02-22 | 清华大学深圳研究生院 | Space robot teleoperation system based on three-dimensional gestures |
WO2018152504A1 (en) * | 2017-02-20 | 2018-08-23 | Irobot Defense Holdings, Inc. | Robotic gripper camera |
CN108044625A (en) * | 2017-12-18 | 2018-05-18 | 中南大学 | A kind of robot arm control method based on the virtual gesture fusions of more Leapmotion |
CN108453742A (en) * | 2018-04-24 | 2018-08-28 | 南京理工大学 | Robot man-machine interactive system based on Kinect and method |
CN108638069A (en) * | 2018-05-18 | 2018-10-12 | 南昌大学 | A kind of mechanical arm tail end precise motion control method |
CN108828996A (en) * | 2018-05-31 | 2018-11-16 | 四川文理学院 | A kind of the mechanical arm remote control system and method for view-based access control model information |
CN208713510U (en) * | 2018-08-01 | 2019-04-09 | 珠海市有兴精工机械有限公司 | CNC processing elasticity crawl gripper |
CN109571403A (en) * | 2018-12-12 | 2019-04-05 | 杭州申昊科技股份有限公司 | A kind of track trace navigation intelligent inspection robot and its air navigation aid |
CN109514521A (en) * | 2018-12-18 | 2019-03-26 | 合肥工业大学 | The servo operation and its method of manpower collaboration Dextrous Hand based on multi-information fusion |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020221311A1 (en) * | 2019-04-30 | 2020-11-05 | 齐鲁工业大学 | Wearable device-based mobile robot control system and control method |
CN110413113A (en) * | 2019-07-18 | 2019-11-05 | 华勤通讯技术有限公司 | A kind of on-vehicle machines people and exchange method |
CN110394803A (en) * | 2019-08-14 | 2019-11-01 | 纳博特南京科技有限公司 | A kind of robot control system |
CN110815258A (en) * | 2019-10-30 | 2020-02-21 | 华南理工大学 | Robot teleoperation system and method based on electromagnetic force feedback and augmented reality |
CN111476909A (en) * | 2020-03-04 | 2020-07-31 | 哈尔滨工业大学 | Teleoperation control method and teleoperation control system for compensating time delay based on virtual reality |
WO2022002155A1 (en) * | 2020-07-01 | 2022-01-06 | 北京术锐技术有限公司 | Master-slave motion control method, robot system, device, and storage medium |
CN112405530A (en) * | 2020-11-06 | 2021-02-26 | 齐鲁工业大学 | Robot vision tracking control system and control method based on wearable vision |
CN112405530B (en) * | 2020-11-06 | 2022-01-11 | 齐鲁工业大学 | Robot vision tracking control system and control method based on wearable vision |
CN112650120A (en) * | 2020-12-22 | 2021-04-13 | 华中科技大学同济医学院附属协和医院 | Robot remote control system, method and storage medium |
CN113146612A (en) * | 2021-01-05 | 2021-07-23 | 上海大学 | Virtual-real combination and man-machine interaction underwater remote control robot manipulator operation system and method |
CN113822251A (en) * | 2021-11-23 | 2021-12-21 | 齐鲁工业大学 | Ground reconnaissance robot gesture control system and control method based on binocular vision |
CN114713421A (en) * | 2022-05-05 | 2022-07-08 | 罗海华 | Control method and system for remote control spraying |
CN114713421B (en) * | 2022-05-05 | 2023-03-24 | 罗海华 | Control method and system for remote control spraying |
CN114683288A (en) * | 2022-05-07 | 2022-07-01 | 法奥意威(苏州)机器人系统有限公司 | Robot display and control method and device and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN109955254B (en) | 2020-10-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109955254A (en) | The remote operating control method of Mobile Robot Control System and robot end's pose | |
WO2020221311A1 (en) | Wearable device-based mobile robot control system and control method | |
CN110039545B (en) | Robot remote control system and control method based on wearable equipment | |
CN104589356B (en) | The Dextrous Hand remote operating control method caught based on Kinect human hand movement | |
KR101785360B1 (en) | Method and system for hand presence detection in a minimally invasive surgical system | |
KR101762638B1 (en) | Method and apparatus for hand gesture control in a minimally invasive surgical system | |
KR101789064B1 (en) | Method and system for hand control of a teleoperated minimally invasive slave surgical instrument | |
KR101762631B1 (en) | A master finger tracking device and method of use in a minimally invasive surgical system | |
CN109164829B (en) | Flying mechanical arm system based on force feedback device and VR sensing and control method | |
WO2011065035A1 (en) | Method of creating teaching data for robot, and teaching system for robot | |
CN108972494A (en) | A kind of Apery manipulator crawl control system and its data processing method | |
WO2011065034A1 (en) | Method for controlling action of robot, and robot system | |
CN106737668A (en) | A kind of hot line robot teleoperation method based on virtual reality | |
CN112634318B (en) | Teleoperation system and method for underwater maintenance robot | |
CN107030692B (en) | Manipulator teleoperation method and system based on perception enhancement | |
CN103533909A (en) | Estimation of a position and orientation of a frame used in controlling movement of a tool | |
CN111459277B (en) | Mechanical arm teleoperation system based on mixed reality and interactive interface construction method | |
CN113021357A (en) | Master-slave underwater double-arm robot convenient to move | |
CN108062102A (en) | A kind of gesture control has the function of the Mobile Robot Teleoperation System Based of obstacle avoidance aiding | |
JP2003062775A (en) | Teaching system for human hand type robot | |
KR101956900B1 (en) | Method and system for hand presence detection in a minimally invasive surgical system | |
CN113561172A (en) | Dexterous hand control method and device based on binocular vision acquisition | |
CN111002295A (en) | Teaching glove and teaching system of two-finger grabbing robot | |
Matsuzaka et al. | Assistance for master-slave system for objects of various shapes by eye gaze tracking and motion prediction | |
JP7386451B2 (en) | Teaching system, teaching method and teaching program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 20210406 Address after: 203-d, Shanke Zhongchuang space, 19 Keyuan Road, Lixia District, Jinan City, Shandong Province Patentee after: Shanke Huazhi (Shandong) robot intelligent technology Co.,Ltd. Address before: 250353 University Road, Changqing District, Ji'nan, Shandong Province, No. 3501 Patentee before: Qilu University of Technology |