CN111452046A - Virtual reality-based explosive-handling robot system, control method and storage medium - Google Patents

Virtual reality-based explosive-handling robot system, control method and storage medium

Info

Publication number
CN111452046A
Authority
CN
China
Prior art keywords
motion
camera
information
angle
mechanical arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010246143.6A
Other languages
Chinese (zh)
Inventor
谭德政
潘志庚
李勇恒
张浩洋
许胡宇
曾瑜晴
曹明亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foshan University
Original Assignee
Foshan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foshan University
Priority to CN202010246143.6A
Publication of CN111452046A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/002 Manipulators for defensive or military tasks
    • B25J11/0025 Manipulators for defensive or military tasks handling explosives, bombs or hazardous objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The invention relates to a virtual reality-based explosive-handling robot system, a control method and a storage medium. The method comprises the following steps: step 201, acquiring image information shot by a camera pan-tilt and imaging the image information in a VR headset; step 202, acquiring hand motion information of a user, generating a motion posture for the mechanical arm according to the hand motion information, and controlling the mechanical arm according to the motion posture; step 203, acquiring angle information from a gyroscope and controlling the camera pan-tilt to rotate accordingly; and step 204, repeating steps 201 to 203.

Description

Virtual reality-based explosive-handling robot system, control method and storage medium
Technical Field
The invention relates to the field of artificial intelligence, and in particular to a virtual reality-based explosive-handling robot system, a control method and a storage medium.
Background
With the increasingly severe counter-terrorism situation around the world and the occurrence of many unavoidable natural disasters, explosive ordnance disposal (EOD) robots, which keep humans from directly contacting dangerous objects and remove hazards remotely, are receiving more and more attention from countries worldwide. The explosive-handling robots on the market today usually operate a mechanical arm through a remote control device to perform various disposal operations. The robot approaches a suspicious object in place of a person and carries out operations such as explosive identification, transfer and destruction. The robot body and the mechanical arm carry several groups of camera lenses and sensors, and high-definition images of the disposal site can be transmitted to a remote monitoring and control system more than 200 meters away for remote control.
However, the mechanical arm of a traditional explosive-handling robot is difficult to control: because of the imaging method used, the returned picture carries no scene depth, so EOD personnel cannot accurately judge the distance to the actual objects.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a virtual reality-based explosive-handling robot system, a control method and a storage medium. By adopting a VR imaging method, a depth image of the disposal site can be obtained; a Leap Motion controller is combined to identify the hand motions of the user, so that the mechanical arm of the explosive-handling robot is controlled to perform the corresponding actions; and rotating the headset drives, via the gyroscope, the camera pan-tilt to rotate synchronously. Explosive handling thereby becomes more intelligent, and its precision is greatly improved.
In order to achieve the purpose, the invention adopts the following technical scheme:
the explosive-handling robot system based on virtual reality is provided, which comprises:
the camera module comprises a camera holder, the camera holder can rotate in multiple degrees of freedom, and the camera module is arranged on the mechanical arm;
the VR head display is used for being worn by a user and is in communication connection with the camera holder, when the head of the user rotates, the VR head display rotates along with the VR head display, and the camera holder rotates along with the VR head display synchronously;
a gyroscope arranged in the VR head display for acquiring attitude angle information of the VR head display, wherein the attitude angle information includes course angle
Figure BDA0002434026550000011
A pitch angle theta and a roll angle gamma;
l eap Motion controller, which is used to identify the hand Motion of the user and convert the Motion gesture to obtain the Motion gesture of the mechanical arm;
a processing module for, in use,
acquiring image information shot by a camera holder, and imaging the image information in a VR head display;
controlling the mechanical arm according to the motion posture of the mechanical arm;
and acquiring angle information of a gyroscope, and controlling the camera holder to rotate along with the gyroscope according to the angle information.
The invention also provides a control method for the virtual reality-based explosive-handling robot, which comprises the following steps:
step 201, acquiring image information shot by the camera pan-tilt and imaging the image information in the VR headset;
step 202, acquiring hand motion information of the user, generating a motion posture for the mechanical arm according to the hand motion information, and controlling the mechanical arm according to the motion posture;
step 203, acquiring angle information of the gyroscope and controlling the camera pan-tilt to follow the rotation according to the angle information;
and step 204, repeating steps 201 to 203.
Further, imaging the image information in the VR headset in step 201 specifically comprises the following steps:
step 301, setting up a Unity camera in Unity and importing the image information into the Unity camera;
step 302, importing the AVPro Motion Capture plug-in library file into Unity, and taking frame-by-frame screenshots of the image information imported into the Unity camera through AVPro Motion Capture;
step 303, switching the Unity camera to six different orientations in each frame and taking the screenshot corresponding to each orientation, obtaining data for the six screenshots;
and step 304, storing the data corresponding to the screenshots in a texture map, and synthesizing the data in the texture map to obtain the panoramic video.
Further, the six orientations in step 303 are:
(0,0,0), (0,90,0), (0,-90,0), (90,0,0), (-90,0,0), (0,0,180).
Further, generating the motion posture of the mechanical arm from the hand motion information of the user in step 202 specifically comprises the following:
adopting Leap Motion-based gesture recognition and, according to the Leap Motion gesture recognition algorithms, converting the user's hand motion information into a motion posture, thereby obtaining the motion posture of the mechanical arm.
Further, controlling the camera pan-tilt to follow the rotation according to the angle information in step 203 specifically comprises the following:
solving the attitude angles by means of quaternions to obtain the follow-up rotation angle of the camera pan-tilt, and controlling the camera pan-tilt to rotate correspondingly.
Further, the attitude angles are calculated through quaternions, wherein the method for calculating the heading angle φ, the pitch angle θ and the roll angle γ comprises the following steps:
Step 701, initialize the quaternion q0, q1, q2, q3 from the initial attitude angles as follows:
q0 = cos(φ/2)cos(θ/2)cos(γ/2) + sin(φ/2)sin(θ/2)sin(γ/2)
q1 = cos(φ/2)cos(θ/2)sin(γ/2) − sin(φ/2)sin(θ/2)cos(γ/2)
q2 = cos(φ/2)sin(θ/2)cos(γ/2) + sin(φ/2)cos(θ/2)sin(γ/2)
q3 = sin(φ/2)cos(θ/2)cos(γ/2) − cos(φ/2)sin(θ/2)sin(γ/2)
Step 702, determine the attitude matrix C from the quaternion:
    | q0²+q1²−q2²−q3²   2(q1q2−q0q3)      2(q1q3+q0q2)    |
C = | 2(q1q2+q0q3)      q0²−q1²+q2²−q3²   2(q2q3−q0q1)    |
    | 2(q1q3−q0q2)      2(q2q3+q0q1)      q0²−q1²−q2²+q3² |
Step 703, the attitude angles φ, θ, γ can be solved from the equations of steps 701 and 702:
φ = arctan[2(q1q2+q0q3) / (q0²+q1²−q2²−q3²)]
θ = −arcsin[2(q1q3−q0q2)]
γ = arctan[2(q2q3+q0q1) / (q0²−q1²−q2²+q3²)]
The invention also proposes a computer-readable storage medium in which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 2 to 7.
The system and method of the invention can obtain the following beneficial effects:
by adopting a VR imaging method, a depth image of the disposal site can be obtained; the Leap Motion controller identifies the hand motions of the user, the mechanical arm of the explosive-handling robot is controlled to perform the corresponding actions, and rotating the headset drives, via the gyroscope, the camera pan-tilt to rotate synchronously, so that explosive handling becomes more intelligent and its precision is greatly improved.
Drawings
Fig. 1 is a flowchart of the control method of the virtual reality-based explosive-handling robot according to the present invention.
Detailed Description
The conception, specific structure and technical effects of the present invention are described clearly and completely below in conjunction with the embodiments and the accompanying drawings, so that the objects, schemes and effects of the present invention can be fully understood. It should be noted that, in the absence of conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The same reference numbers are used throughout the drawings to refer to the same or similar parts.
The invention provides a virtual reality-based explosive-handling robot system, which comprises:
a camera module comprising a camera pan-tilt, the camera pan-tilt being rotatable with multiple degrees of freedom and the camera module being arranged on the mechanical arm;
a VR headset to be worn by a user, communicatively connected to the camera pan-tilt; when the user's head rotates, the VR headset rotates with it and the camera pan-tilt rotates synchronously with the VR headset;
a gyroscope arranged in the VR headset for acquiring attitude angle information of the VR headset, wherein the attitude angle information includes a heading angle φ, a pitch angle θ and a roll angle γ;
a Leap Motion controller for identifying the hand motions of the user and converting them to obtain the motion posture of the mechanical arm;
a processing module for, in use,
acquiring image information shot by the camera pan-tilt and imaging the image information in the VR headset;
controlling the mechanical arm according to the motion posture of the mechanical arm;
and acquiring angle information of the gyroscope and controlling the camera pan-tilt to follow the rotation according to the angle information.
Specifically, in implementation, a camera pan-tilt capable of rotating with multiple degrees of freedom is mounted on the mechanical arm, a gyroscope is installed in the VR headset, and the camera can rotate up, down, left and right using the information fed back from the VR headset. The video from the camera is acquired through Unity; that is, the EOD technician puts on the headset, the images shot by the camera are formed in front of the eyes, and when the head moves the camera makes the corresponding motion, so that the technician gains a strong sense of immersion and can engage with the disposal process to a greater extent.
First, Leap Motion servo control of the mechanical arm: the built-in sensor establishes a rectangular coordinate system whose origin is the center of the sensor, with the X axis parallel to the sensor and pointing to the right of the screen, the Y axis pointing upward, and the Z axis pointing away from the screen. During use, the Leap Motion sensor periodically sends motion information about the hands. For example, if two hands are detected moving in one direction, the motion is treated as a translation; if a hand rotates as if holding a ball, it is treated as a rotation; and if the hands approach or separate, it is treated as scaling. The generated data comprise the rotation axis vector, the rotation angle (clockwise positive), a matrix describing the rotation, a scale factor and a translation vector. This information is converted into the postures of the corresponding servo motors and then transmitted through a serial port to the controller of the mechanical arm, so that the mechanical arm moves following the gestures, as in the sketch below.
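A minimal sketch of this gesture-to-serial pipeline follows, assuming the Leap Motion V2 Python SDK (the Leap module) and pyserial; the serial device path, the ASCII packet format sent to the arm controller and the jitter thresholds are illustrative assumptions, not the actual protocol of this design:

# Sketch: map Leap Motion inter-frame hand motion to arm-controller commands
# sent over a serial port. Requires the Leap Motion V2 Python SDK (Leap)
# and pyserial; the packet format below is hypothetical.
import time
import Leap
import serial

TRANSLATE_MIN_MM = 5.0   # ignore hand jitter below this displacement
ROTATE_MIN_RAD = 0.05    # ignore tiny rotations

def main():
    controller = Leap.Controller()
    arm_port = serial.Serial("/dev/ttyUSB0", 115200)  # hypothetical arm controller port
    last_frame = None
    while True:
        frame = controller.frame()
        if last_frame is not None and not frame.hands.is_empty:
            # Leap reports inter-frame motion directly: a translation vector,
            # a rotation axis/angle (clockwise positive) and a scale factor.
            t = frame.translation(last_frame)         # Leap.Vector, millimetres
            axis = frame.rotation_axis(last_frame)    # unit Leap.Vector
            angle = frame.rotation_angle(last_frame)  # radians
            scale = frame.scale_factor(last_frame)    # >1 when hands separate
            if t.magnitude > TRANSLATE_MIN_MM:
                cmd = "T,%.1f,%.1f,%.1f\n" % (t.x, t.y, t.z)  # hypothetical packet
            elif abs(angle) > ROTATE_MIN_RAD:
                cmd = "R,%.2f,%.2f,%.2f,%.3f\n" % (axis.x, axis.y, axis.z, angle)
            else:
                cmd = "S,%.3f\n" % scale  # scaling, e.g. gripper open/close
            arm_port.write(cmd.encode("ascii"))
        last_frame = frame
        time.sleep(0.02)  # poll at roughly 50 Hz

if __name__ == "__main__":
    main()

In this arrangement the translation, rotation and scale commands are mapped onto the individual servo angles by the arm controller itself, as described above.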
Second, VR image feedback and the follow-up control design of the VR headset and camera:
(1) The on-site environment is recorded by the camera, transmitted back to Unity for processing and finally imaged in the VR headset, so that the on-site picture shot by the camera and the video in the VR headset stay synchronized.
(2) Then the headset's control of the camera rotation is realized: a control board with a built-in MPU6050 is carried on the headset, the head-rotation attitude information is acquired through the control board and then transmitted to the host computer through a serial port, and the host computer controls the up-down and left-right motion of the pan-tilt so that the camera follows the motion of the headset.
The design is as follows:
1. an Intel Curie Genduino101 development board was used, mounted on the VR head display, as an attitude sensor. And after the posture and direction data of the VR head display are collected by Curie, the data are transmitted to a computer through a Serial port.
2. And running Processing software on the computer, receiving VR head display direction data from Curie, Processing the VR head display direction data and then forwarding the VR head display direction data to a makeBlock MegaPi development board.
3. The MegaPi development board is used as a motor motion control board to push a two-axis cradle head steering engine to rotate synchronously along with the VR head display in the yaw and roll directions.
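As referenced in item 2, here is a minimal sketch of that host-side relay, written in Python rather than the Processing environment named above; both serial device paths, the baud rate and the "yaw,roll" line format emitted by the Curie board are assumptions for illustration:

# Sketch: relay headset attitude from the Curie board to the MegaPi
# pan-tilt controller. Requires pyserial; ports and formats are assumed.
import serial

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

curie = serial.Serial("/dev/ttyACM0", 115200)   # attitude sensor on the headset
megapi = serial.Serial("/dev/ttyUSB1", 115200)  # two-axis pan-tilt controller

while True:
    line = curie.readline().decode("ascii", errors="ignore").strip()
    try:
        yaw, roll = (float(x) for x in line.split(","))
    except ValueError:
        continue  # skip malformed lines
    # Map attitude angles onto a hypothetical 0-180 deg steering-gear range,
    # centred at 90 deg.
    pan = clamp(90 + yaw, 0, 180)
    tilt = clamp(90 + roll, 0, 180)
    megapi.write(("P%d,T%d\n" % (round(pan), round(tilt))).encode("ascii"))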
Finally, the control design of the explosive-handling machine's chassis: a 480A brushed electronic speed controller (ESC) is adopted to drive the chassis motors, providing strong current tolerance and high-voltage protection. The mobile control adopts the FLASPEED native 2.4G protocol originally created by the "space and ground flying" company, so the motors can be driven directly over a high-speed bus without intermediate latency, which greatly improves control agility and makes this a good choice for driving the platform.
Referring to Fig. 1, the invention further provides a control method for the virtual reality-based explosive-handling robot, which comprises the following steps:
step 201, acquiring image information shot by the camera pan-tilt and imaging the image information in the VR headset;
step 202, acquiring hand motion information of the user, generating a motion posture for the mechanical arm according to the hand motion information, and controlling the mechanical arm according to the motion posture;
step 203, acquiring angle information of the gyroscope and controlling the camera pan-tilt to follow the rotation according to the angle information;
and step 204, repeating steps 201 to 203.
As a preferred embodiment of the present invention, imaging the image information in the VR headset in step 201 specifically comprises the following steps:
step 301, setting up a Unity camera in Unity and importing the image information into the Unity camera;
step 302, importing the AVPro Motion Capture plug-in library file into Unity, and taking frame-by-frame screenshots of the image information imported into the Unity camera through AVPro Motion Capture;
step 303, switching the Unity camera to six different orientations in each frame and taking the screenshot corresponding to each orientation, obtaining data for the six screenshots;
and step 304, storing the data corresponding to the screenshots in a texture map, and synthesizing the data in the texture map to obtain the panoramic video, as sketched after the orientation list below.
As a preferred embodiment of the present invention, the six orientations in step 303 are:
(0,0,0), (0,90,0), (0,-90,0), (90,0,0), (-90,0,0), (0,0,180).
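The per-frame capture of steps 301 to 304 can be sketched as follows. This is an illustrative Python rendering of logic that actually runs inside Unity; rotate_camera and grab_screenshot are hypothetical stand-ins for the Unity camera transform and the AVPro capture call:

# Sketch: one frame of the panoramic capture — six screenshots, one per
# camera orientation, later packed into a texture map and synthesized
# into the panoramic video.
FACE_EULER_ANGLES = {
    "front": (0, 0, 0),
    "right": (0, 90, 0),
    "left":  (0, -90, 0),
    "down":  (90, 0, 0),
    "up":    (-90, 0, 0),
    "back":  (0, 0, 180),
}

def capture_panorama_frame(rotate_camera, grab_screenshot):
    """Capture one frame: rotate the camera to each orientation and shoot."""
    faces = {}
    for face, euler in FACE_EULER_ANGLES.items():
        rotate_camera(euler)             # point the camera at this orientation
        faces[face] = grab_screenshot()  # raw pixel data for the texture map
    return faces  # six screenshots' data, ready for panorama synthesis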
As a preferred embodiment of the present invention, generating the motion posture of the mechanical arm from the hand motion information of the user in step 202 specifically comprises the following:
adopting Leap Motion-based gesture recognition and, according to the Leap Motion gesture recognition algorithms, converting the user's hand motion information into a motion posture, thereby obtaining the motion posture of the mechanical arm.
As a preferred embodiment of the present invention, controlling the camera pan-tilt to follow the rotation according to the angle information in step 203 specifically comprises the following:
solving the attitude angles by means of quaternions to obtain the follow-up rotation angle of the camera pan-tilt, and controlling the camera pan-tilt to rotate correspondingly.
As a preferred embodiment of the present invention, the attitude angles are calculated by quaternions, wherein the method for calculating the heading angle φ, the pitch angle θ and the roll angle γ comprises the following steps:
Step 701, initialize the quaternion q0, q1, q2, q3 from the initial attitude angles as follows:
q0 = cos(φ/2)cos(θ/2)cos(γ/2) + sin(φ/2)sin(θ/2)sin(γ/2)
q1 = cos(φ/2)cos(θ/2)sin(γ/2) − sin(φ/2)sin(θ/2)cos(γ/2)
q2 = cos(φ/2)sin(θ/2)cos(γ/2) + sin(φ/2)cos(θ/2)sin(γ/2)
q3 = sin(φ/2)cos(θ/2)cos(γ/2) − cos(φ/2)sin(θ/2)sin(γ/2)
Step 702, determine the attitude matrix C from the quaternion:
    | q0²+q1²−q2²−q3²   2(q1q2−q0q3)      2(q1q3+q0q2)    |
C = | 2(q1q2+q0q3)      q0²−q1²+q2²−q3²   2(q2q3−q0q1)    |
    | 2(q1q3−q0q2)      2(q2q3+q0q1)      q0²−q1²−q2²+q3² |
Step 703, the attitude angles φ, θ, γ can be solved from the equations of steps 701 and 702:
φ = arctan[2(q1q2+q0q3) / (q0²+q1²−q2²−q3²)]
θ = −arcsin[2(q1q3−q0q2)]
γ = arctan[2(q2q3+q0q1) / (q0²−q1²−q2²+q3²)]
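A short numerical check of steps 701 to 703 follows, assuming the standard Euler-to-quaternion initialization consistent with the θ formula above; numpy's arctan2 is used in place of plain arctan so that the quadrant is recovered correctly:

# Sketch: round-trip the attitude angles through the quaternion and the
# attitude matrix of steps 701-703. Requires numpy; angle symbols follow
# the text (heading phi, pitch theta, roll gamma), all in radians.
import numpy as np

def euler_to_quaternion(phi, theta, gamma):
    """Step 701: initialize q0..q3 from heading/pitch/roll."""
    cphi, sphi = np.cos(phi / 2), np.sin(phi / 2)
    cth, sth = np.cos(theta / 2), np.sin(theta / 2)
    cg, sg = np.cos(gamma / 2), np.sin(gamma / 2)
    return np.array([
        cphi * cth * cg + sphi * sth * sg,  # q0
        cphi * cth * sg - sphi * sth * cg,  # q1
        cphi * sth * cg + sphi * cth * sg,  # q2
        sphi * cth * cg - cphi * sth * sg,  # q3
    ])

def quaternion_to_euler(q):
    """Steps 702-703: attitude matrix elements, then phi, theta, gamma."""
    q0, q1, q2, q3 = q
    phi = np.arctan2(2 * (q1 * q2 + q0 * q3), q0**2 + q1**2 - q2**2 - q3**2)
    theta = -np.arcsin(2 * (q1 * q3 - q0 * q2))
    gamma = np.arctan2(2 * (q2 * q3 + q0 * q1), q0**2 - q1**2 - q2**2 + q3**2)
    return phi, theta, gamma

angles = np.radians([30.0, -10.0, 5.0])  # sample heading, pitch, roll
print(np.degrees(quaternion_to_euler(euler_to_quaternion(*angles))))
# -> [ 30. -10.   5.]

Running the script prints back the original 30, -10 and 5 degrees, confirming that steps 701 and 703 invert each other for pitch angles inside (-90°, 90°).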
The invention also proposes a computer-readable storage medium in which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 2 to 7.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, computer memory, read-only memory (ROM), random access memory (RAM), electrical carrier signals, telecommunication signals, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be suitably increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals.
While the present invention has been described in considerable detail with reference to certain illustrative embodiments, it is not intended to be limited to such details or embodiments, but is to be construed, with reference to the appended claims, as effectively covering the intended scope of the invention in view of the prior art. Furthermore, the foregoing describes the invention in terms of embodiments foreseen by the inventor for which an enabling description was available, notwithstanding that insubstantial modifications of the invention not presently foreseen may nonetheless represent equivalents thereto.
The above description is only a preferred embodiment of the present invention, and the present invention is not limited to the above embodiment; as long as the technical effects of the present invention are achieved by the same means, an implementation shall fall within the protection scope of the present invention. The technical solution and/or the manner of implementing it may be modified and varied in various ways within the protection scope of the present invention.

Claims (8)

1. A virtual reality-based explosive-handling robot system, characterized by comprising:
a camera module comprising a camera pan-tilt, the camera pan-tilt being rotatable with multiple degrees of freedom and the camera module being arranged on the mechanical arm;
a VR headset to be worn by a user, communicatively connected to the camera pan-tilt; when the user's head rotates, the VR headset rotates with it and the camera pan-tilt rotates synchronously with the VR headset;
a gyroscope arranged in the VR headset for acquiring attitude angle information of the VR headset, wherein the attitude angle information includes a heading angle φ, a pitch angle θ and a roll angle γ;
a Leap Motion controller for identifying the hand motions of the user and converting them to obtain the motion posture of the mechanical arm;
a processing module for, in use,
acquiring image information shot by the camera pan-tilt and imaging the image information in the VR headset;
controlling the mechanical arm according to the motion posture of the mechanical arm;
and acquiring angle information of the gyroscope and controlling the camera pan-tilt to follow the rotation according to the angle information.
2. A control method for a virtual reality-based explosive-handling robot, characterized by comprising the following steps:
step 201, acquiring image information shot by the camera pan-tilt and imaging the image information in the VR headset;
step 202, acquiring hand motion information of the user, generating a motion posture for the mechanical arm according to the hand motion information, and controlling the mechanical arm according to the motion posture;
step 203, acquiring angle information of the gyroscope and controlling the camera pan-tilt to follow the rotation according to the angle information;
and step 204, repeating steps 201 to 203.
3. The virtual reality-based explosive-handling robot control method according to claim 2, wherein imaging the image information in the VR headset in step 201 specifically comprises the following steps:
step 301, setting up a Unity camera in Unity and importing the image information into the Unity camera;
step 302, importing the AVPro Motion Capture plug-in library file into Unity, and taking frame-by-frame screenshots of the image information imported into the Unity camera through AVPro Motion Capture;
step 303, switching the Unity camera to six different orientations in each frame and taking the screenshot corresponding to each orientation, obtaining data for the six screenshots;
and step 304, storing the data corresponding to the screenshots in a texture map, and synthesizing the data in the texture map to obtain the panoramic video.
4. The virtual reality-based explosive-handling robot control method according to claim 3, wherein the six orientations in step 303 are respectively:
(0,0,0), (0,90,0), (0,-90,0), (90,0,0), (-90,0,0), (0,0,180).
5. The virtual reality-based explosive-handling robot control method according to claim 2, wherein generating the motion posture of the mechanical arm according to the hand motion information of the user in step 202 specifically comprises the following:
adopting Leap Motion-based gesture recognition and, according to the Leap Motion gesture recognition algorithms, converting the user's hand motion information into a motion posture, thereby obtaining the motion posture of the mechanical arm.
6. The virtual reality-based explosive-handling robot control method according to claim 2, wherein controlling the camera pan-tilt to follow the rotation according to the angle information in step 203 specifically comprises the following:
solving the attitude angles by means of quaternions to obtain the follow-up rotation angle of the camera pan-tilt, and controlling the camera pan-tilt to rotate correspondingly.
7. The virtual reality-based explosive-handling robot control method according to claim 6, wherein the attitude angles are calculated by quaternions, and the method for calculating the heading angle φ, the pitch angle θ and the roll angle γ comprises the following steps:
Step 701, initialize the quaternion q0, q1, q2, q3 from the initial attitude angles as follows:
q0 = cos(φ/2)cos(θ/2)cos(γ/2) + sin(φ/2)sin(θ/2)sin(γ/2)
q1 = cos(φ/2)cos(θ/2)sin(γ/2) − sin(φ/2)sin(θ/2)cos(γ/2)
q2 = cos(φ/2)sin(θ/2)cos(γ/2) + sin(φ/2)cos(θ/2)sin(γ/2)
q3 = sin(φ/2)cos(θ/2)cos(γ/2) − cos(φ/2)sin(θ/2)sin(γ/2)
Step 702, determine the attitude matrix C from the quaternion:
    | q0²+q1²−q2²−q3²   2(q1q2−q0q3)      2(q1q3+q0q2)    |
C = | 2(q1q2+q0q3)      q0²−q1²+q2²−q3²   2(q2q3−q0q1)    |
    | 2(q1q3−q0q2)      2(q2q3+q0q1)      q0²−q1²−q2²+q3² |
Step 703, the attitude angles φ, θ, γ can be solved from the equations of steps 701 and 702:
φ = arctan[2(q1q2+q0q3) / (q0²+q1²−q2²−q3²)]
θ = −arcsin[2(q1q3−q0q2)]
γ = arctan[2(q2q3+q0q1) / (q0²−q1²−q2²+q3²)]
8. A computer-readable storage medium in which a computer program is stored, wherein the computer program, when executed by a processor, carries out the steps of the method according to any one of claims 2 to 7.
CN202010246143.6A 2020-03-31 2020-03-31 Virtual reality-based explosive-handling robot system, control method and storage medium Pending CN111452046A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010246143.6A CN111452046A (en) 2020-03-31 2020-03-31 Virtual reality-based explosive-handling robot system, control method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010246143.6A CN111452046A (en) 2020-03-31 2020-03-31 Virtual reality-based explosive-handling robot system, control method and storage medium

Publications (1)

Publication Number Publication Date
CN111452046A true CN111452046A (en) 2020-07-28

Family

ID=71673531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010246143.6A Pending CN111452046A (en) 2020-03-31 2020-03-31 Virtual reality-based explosive-handling robot system, control method and storage medium

Country Status (1)

Country Link
CN (1) CN111452046A (en)


Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014179449A1 (en) * 2013-05-01 2014-11-06 Hillcrest Laboratories, Inc. Mapped variable smoothing evolution method and device
CN104656663A (en) * 2015-02-15 2015-05-27 西北工业大学 Vision-based UAV (unmanned aerial vehicle) formation sensing and avoidance method
CN105116926A (en) * 2015-08-20 2015-12-02 深圳一电科技有限公司 Holder control method and device
CN107145167A (en) * 2017-04-07 2017-09-08 南京邮电大学 A kind of video target tracking method based on digital image processing techniques
CN107168182A (en) * 2017-06-28 2017-09-15 范崇山 A kind of system and method for Indoor Robot VR applications
CN107199550A (en) * 2017-03-15 2017-09-26 南昌大学 The mechanical exoskeleton formula explosive-removal robot of display is worn based on FPV
CN108009124A (en) * 2017-11-29 2018-05-08 天津聚飞创新科技有限公司 Spin matrix computational methods and device
CN108325077A (en) * 2018-01-23 2018-07-27 佛山科学技术学院 A kind of healing hand function system based on virtual reality technology
CN108363415A (en) * 2018-03-29 2018-08-03 燕山大学 A kind of vision remote control servomechanism and method applied to underwater robot
CN108769531A (en) * 2018-06-21 2018-11-06 深圳市道通智能航空技术有限公司 Control method, control device and the wearable device of the shooting angle of filming apparatus
CN109470266A (en) * 2018-11-02 2019-03-15 佛山科学技术学院 A kind of star sensor Gyro method for determining posture handling multiplicative noise
CN110096057A (en) * 2019-04-10 2019-08-06 广东工业大学 A kind of Intelligent carrier control system
CN110622091A (en) * 2018-03-28 2019-12-27 深圳市大疆创新科技有限公司 Cloud deck control method, device and system, computer storage medium and unmanned aerial vehicle
CN110728739A (en) * 2019-09-30 2020-01-24 杭州师范大学 Virtual human control and interaction method based on video stream


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114227689A (en) * 2021-12-30 2022-03-25 深圳市优必选科技股份有限公司 Robot motion control system and motion control method thereof
CN114227689B (en) * 2021-12-30 2023-11-17 深圳市优必选科技股份有限公司 Robot motion control system and motion control method thereof
CN116372954A (en) * 2023-05-26 2023-07-04 苏州融萃特种机器人有限公司 AR immersed teleoperation explosive-handling robot system, control method and storage medium

Similar Documents

Publication Publication Date Title
JP6768156B2 (en) Virtually enhanced visual simultaneous positioning and mapping systems and methods
US9789403B1 (en) System for interactive image based game
CN108769531B (en) Method for controlling shooting angle of shooting device, control device and remote controller
CN112634318B (en) Teleoperation system and method for underwater maintenance robot
CN110456967B (en) Information processing method, information processing apparatus, and program
CN108805979B (en) Three-dimensional reconstruction method, device, equipment and storage medium for dynamic model
US20170180721A1 (en) System and method for performing electronic display stabilization via retained lightfield rendering
CN111800589B (en) Image processing method, device and system and robot
EP3692506A1 (en) Shadow generation for inserted image content into an image
CN111452046A (en) Virtual reality-based explosive-handling robot system, control method and storage medium
CN103977539A (en) Cervical vertebra rehabilitation and health care training aiding system
CN106357966A (en) Panoramic image photographing device and panoramic image acquiring method
CN111966217A (en) Unmanned aerial vehicle control method and system based on gestures and eye movements
WO2019183789A1 (en) Method and apparatus for controlling unmanned aerial vehicle, and unmanned aerial vehicle
WO2023280082A1 (en) Handle inside-out visual six-degree-of-freedom positioning method and system
CN110060295B (en) Target positioning method and device, control device, following equipment and storage medium
KR20200020295A (en) AUGMENTED REALITY SERVICE PROVIDING APPARATUS INTERACTING WITH ROBOT and METHOD OF THEREOF
JP2024503275A (en) Mobile robot control method, computer-implemented storage medium, and mobile robot
WO2018140397A1 (en) System for interactive image based game
CN109531578B (en) Humanoid mechanical arm somatosensory control method and device
WO2023142555A1 (en) Data processing method and apparatus, computer device, storage medium, and computer program product
CN113496503A (en) Point cloud data generation and real-time display method, device, equipment and medium
JP2023100258A (en) Pose estimation refinement for aerial refueling
CN116012913A (en) Model training method, face key point detection method, medium and device
CN116012459A (en) Mouse positioning method based on three-dimensional sight estimation and screen plane estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200728)