CN112383345A - Distributed remote control device - Google Patents

Distributed remote control device

Info

Publication number
CN112383345A
CN112383345A
Authority
CN
China
Prior art keywords
signal
control
control signal
transceiver
main control
Prior art date
Legal status
Pending
Application number
CN202011269866.4A
Other languages
Chinese (zh)
Inventor
刘杨
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from CN202010679784.0A (published as CN111917454A)
Application filed by Individual
Publication of CN112383345A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 7/00 Radio transmission systems, i.e. using radiation field
    • H04B 7/14 Relay systems
    • H04B 7/15 Active relay systems
    • H04B 7/185 Space-based or airborne stations; Stations for satellite systems
    • H04B 7/1851 Systems using a satellite or space-based relay
    • H04B 7/18517 Transmission equipment in earth stations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C 17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 7/00 Radio transmission systems, i.e. using radiation field
    • H04B 7/14 Relay systems
    • H04B 7/15 Active relay systems
    • H04B 7/185 Space-based or airborne stations; Stations for satellite systems
    • H04B 7/18502 Airborne stations
    • H04B 7/18506 Communications with or from aircraft, i.e. aeronautical mobile service
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Abstract

The application discloses a distributed remote control apparatus comprising a main control device and distributed control devices, the distributed control devices including at least a first control device and a second control device. The first control device is in signal connection with the main control device and is configured to generate a first control signal based on hand motions and/or hand gestures of a user and to send the first control signal to the main control device. The second control device is in signal connection with the main control device and is configured to generate a second control signal based on head motions, head posture and/or external audio and video of the user and to send the second control signal to the main control device. The main control device is further in signal connection with a controlled unmanned motion platform and is configured to send a main control signal, comprising the first control signal and/or the second control signal, to the controlled unmanned motion platform to control the platform and its load component. Compared with cooperation among multiple operators, cooperation between different parts of the same operator is clearly better coordinated, which solves the problem of poor control performance during multi-person operation.

Description

Distributed remote control device
The present application claims priority from Chinese patent application No. 202010679784.0, entitled "A distributed remote control device", filed with the Chinese Patent Office on 15 July 2020, and from Chinese patent application No. 202021389525.6, entitled "A distributed remote control device", filed on the same day, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of control technologies, and more particularly, to a distributed remote control apparatus.
Background
Unmanned motion platforms such as unmanned vehicles, unmanned airplanes and unmanned helicopters are finding more and more applications. Apart from the few platforms that can operate autonomously based on artificial intelligence, most unmanned motion platforms still need to be operated by manual remote control. Compared with autonomous operation, manual remote control can carry out the operator's current intent in real time and therefore offers many advantages that autonomous operation does not have.
At present, when an unmanned motion platform is remotely controlled, the platform and the platform load often need to be controlled at the same time. Because many inputs and operations must be executed, several operators are sometimes required to control the platform with their respective remote controllers; since coordination among multiple people is difficult and poorly synchronized, the control performance of the platform suffers.
Disclosure of Invention
In view of this, the present application provides a distributed remote control apparatus for remotely controlling an unmanned motion platform, so as to solve the problem of poor control performance under multi-person operation.
In order to achieve the above object, the following solutions are proposed:
a distributed remote control apparatus, comprising a main control device and a distributed control device, wherein the control device at least comprises a first control device and a second control device, wherein:
the first control equipment is in signal connection with the main control equipment and is used for generating a first control signal based on hand actions and/or hand gestures of a user and sending the first control signal to the main control equipment;
the second control equipment is in signal connection with the main control equipment and is used for generating a second control signal based on the head action, the head posture and/or the external audio and video of the user and sending the second control signal to the main control equipment;
the main control equipment is further in signal connection with the controlled unmanned motion platform and used for sending a main control signal to the controlled unmanned motion platform, the main control signal is used for controlling a platform motion part of the unmanned motion platform to execute preset actions and controlling a load part of the unmanned motion platform to execute preset actions, and the main control signal comprises the first control signal and/or the second control signal.
Optionally, the first control device comprises at least one first sensor and a first transceiver, wherein:
the first sensor is arranged on a hand of the user and is configured to generate the first control signal based on the hand motion and/or the hand gesture;
the first transceiver is in signal connection with the first sensor and is configured to send the first control signal to the main control device and to receive a feedback signal sent by the main control device.
Optionally, the first transceiver is a wireless signal transceiver or a wired signal transceiver.
Optionally, the second control device comprises at least one second sensor, an audio and video acquisition device, and a second transceiver, wherein:
the second sensor is arranged on the head of the user and is configured to generate a first sub-signal based on the head motion and/or the head posture;
the audio and video acquisition device is configured to acquire the external audio and video and to obtain a second sub-signal based on the external audio and video;
the second transceiver is in signal connection with the second sensor and the audio and video acquisition device respectively and is configured to send the second control signal to the main control device, the second control signal comprising the first sub-signal and/or the second sub-signal.
Optionally, the main control device comprises a third transceiver, a fourth transceiver, and a main signal transceiver, wherein:
the third transceiver is configured to receive the first control signal;
the fourth transceiver is configured to receive the second control signal;
the main signal transceiver is configured to send the main control signal to the controlled unmanned motion platform.
Optionally, the main signal transceiver is further configured to receive a feedback signal returned by the unmanned motion platform.
Optionally, the feedback signal includes part or all of an audio signal, a video signal, an attitude signal, a velocity signal, a position signal, a temperature signal, a modal signal, and a barometric pressure signal.
Optionally, the second control device comprises a display device, wherein:
the second control device is further configured to receive the feedback signal sent by the fourth transceiver;
the display device is configured to display the feedback signal.
Optionally, the display device is arranged on the head of the user and faces the eyes of the user.
Optionally, the second control device is a head-mounted VR device or a head-mounted AR device.
It can be seen from the foregoing technical solutions that the present application discloses a distributed remote control apparatus comprising a main control device and distributed control devices, the distributed control devices including at least a first control device and a second control device. The first control device is in signal connection with the main control device and is configured to generate a first control signal based on hand motions and/or hand gestures of a user and to send the first control signal to the main control device. The second control device is in signal connection with the main control device and is configured to generate a second control signal based on head motions, head posture and/or external audio and video of the user and to send the second control signal to the main control device. The main control device is further in signal connection with the controlled unmanned motion platform and is configured to send a main control signal, comprising the first control signal and/or the second control signal, to the controlled unmanned motion platform to control the platform and the load component. Compared with the operation of different operators, the cooperation of different parts of the same operator is clearly better coordinated, which solves the problem of poor control performance during multi-person operation.
The distributed remote control apparatus decouples the communication distance from operability and portability: the communication distance depends only on the main control device of the present application, while operability and portability depend only on the first control device and the second control device, so the portability of the apparatus can be greatly improved without reducing its functions.
Meanwhile, because the main control device in the present application serves as the wireless communication relay and the core computing platform, the design of the first control device and the second control device, which monitor the user's motions, is greatly simplified; and since the two control devices do not need long-range communication, design and manufacturing cost is further reduced and portability is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a distributed remote control apparatus according to an embodiment of the present application;
Fig. 2 is a block diagram of a first control device according to an embodiment of the present application;
Fig. 3 is a block diagram of a second control device according to an embodiment of the present application;
Fig. 4 is a block diagram of a main control device according to an embodiment of the present application;
Fig. 5 is a schematic diagram of another distributed remote control apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings of those embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The controlled unmanned motion platform in the present application may be an unmanned vehicle, an unmanned airplane or an unmanned helicopter. The unmanned vehicle includes not only artificial-intelligence-based driverless vehicles but also remote-controlled vehicles that can travel and perform work under the remote control of a user.
Similarly, unmanned airplanes and unmanned helicopters include artificial-intelligence-based unmanned aircraft, remotely piloted aircraft, remote-controlled model airplanes and the like; the unmanned airplane and unmanned helicopter referred to in the present application are the remotely piloted aircraft and the remote-controlled model airplane.
The wireless system and wireless connection proposed in the present application refer to a connection through a mobile base station, a Bluetooth connection, a satellite communication connection, or another connection based on wireless signals. These connection modes have different advantages and disadvantages. A connection based on mobile base stations can provide communication over longer distances, such as kilometres or tens of kilometres, whereas a wireless connection based on satellite communication can provide signal interaction over hundreds or even thousands of kilometres. However, both of these approaches have the problem of high cost.
Bluetooth is a radio technology that supports short-range communication between devices (typically within 10 m) and enables wireless information exchange among many kinds of devices, including mobile phones, PDAs, wireless headsets, laptop computers and related peripherals. Bluetooth technology effectively simplifies communication between mobile terminal devices and also simplifies communication between such devices and the Internet, making data transmission faster and more efficient and broadening the path for wireless communication.
As a short-range wireless connection technology, Bluetooth can implement data and voice communication between devices conveniently, quickly, flexibly and securely, at low cost and with low power consumption, and is therefore one of the mainstream technologies for wireless personal area network communication today. Connecting to other networks opens up an even wider range of applications. It is a mature, open wireless communication standard that lets many kinds of digital devices communicate wirelessly; it is one of the wireless network transmission technologies and was originally intended as a replacement for infrared.
Bluetooth is an open global specification for wireless data and voice communication that establishes ad hoc connections for fixed and mobile device communication environments on the basis of low-cost, short-range wireless links. Its essence is to establish a universal Radio Air Interface for communication between fixed or mobile devices and to combine communication technology with computer technology, so that various 3C devices can communicate or interoperate over short distances without wires or cables. In brief, Bluetooth technology uses low-power radio to transmit data between 3C devices. Bluetooth operates in the globally available 2.4 GHz ISM (industrial, scientific, medical) band using the IEEE 802.15 protocol family.
As described above, the Bluetooth connection has the advantage of low cost, but its short communication distance limits its application scenarios. Based on the remote control requirements of unmanned vehicles, unmanned airplanes and unmanned helicopters described above, and on the above wireless connection modes, the present application provides the following specific embodiments.
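Purely as an illustration of the range/cost trade-off discussed above, the short sketch below selects a link type for the main signal transceiver from a required control distance. The distance thresholds and link names are assumptions chosen for this example, not values taken from the application.

```python
# Illustrative sketch only: choosing a wireless link type for the main signal
# transceiver from the required control distance, reflecting the range/cost
# trade-off described above.  Thresholds and names are assumptions.
from enum import Enum


class LinkType(Enum):
    BLUETOOTH = "bluetooth"            # roughly 10 m, low cost, low power
    MOBILE_BASE_STATION = "cellular"   # kilometres to tens of kilometres
    SATELLITE = "satellite"            # hundreds to thousands of kilometres


def select_link(required_range_m: float) -> LinkType:
    """Pick the shortest-range (and typically cheapest) link that still covers the distance."""
    if required_range_m <= 10:
        return LinkType.BLUETOOTH
    if required_range_m <= 50_000:
        return LinkType.MOBILE_BASE_STATION
    return LinkType.SATELLITE


if __name__ == "__main__":
    for distance_m in (8, 5_000, 300_000):
        print(distance_m, "m ->", select_link(distance_m).value)
```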
Fig. 1 is a schematic diagram of a distributed remote control apparatus according to an embodiment of the present application.
As shown in Fig. 1, the distributed remote control apparatus provided in this embodiment is used to remotely control a controlled unmanned motion platform 100, such as an unmanned vehicle, an unmanned airplane or an unmanned helicopter. The apparatus specifically comprises a main control device 30 and distributed control devices. The distributed control devices include, but are not limited to, a first control device 10 and a second control device 20, and may further include other sub-control devices, such as a control glove, a control bracelet or another wearable device in signal connection with the main control device.
The control glove, control bracelet or other wearable device can each generate a corresponding control instruction based on the user's motion or operation, and the control instruction can then be sent through the main control device to the controlled unmanned motion platform to carry out the corresponding action.
The main control device is in signal connection with the first control device, the second control device, the control glove, the control bracelet and any other wearable device, respectively. The connection may be wireless or wired; a wireless connection is preferred in the present application, since it avoids the inconvenience of signal cables when the devices are worn by the user and makes the apparatus easier to use.
The first control device is arranged on a hand of the user when in use and is configured to generate a first control signal based on the hand motion and/or the hand gesture of the user. The hand gesture is understood to mean the spatial vector position of the whole or part of the hand, and the hand motion is understood to mean a movement performed by a part of the hand, such as stretching or bending the palm.
The first control device may be a single component, or a combination of several components connected together, so that it can be arranged on the arm, wrist, palm, back of the hand and fingers of the user, or on some of these parts, and can generate the corresponding first control signal according to the posture or motion of a single part or the relative posture or coordinated motion of two or more parts.
The first control signal is used to control the motion of the unmanned motion platform: the platform motion components of the unmanned motion platform move forward, move backward, ascend, descend, turn and so on based on the first control signal. The platform motion components include, but are not limited to, road wheels, lifting platforms, propellers and elevators.
As shown in Fig. 2, the first control device comprises at least a first sensor 11 and a first transceiver 12, which are connected by a signal line. The first sensor is used to detect the hand motion or hand gesture, to generate the first control signal based on the hand motion or hand gesture, and to output the first control signal to the first transceiver. The first sensor may be a pressure sensor, a gyroscope, an acceleration sensor, a magnetic sensor, a barometer or the like.
The first transceiver is used to receive the first control signal and output it, in a wired or wireless manner, to the main control device to which it is connected. Accordingly, the first transceiver is a wired transceiver or a wireless transceiver, or a transceiver having both wired and wireless capabilities.
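For illustration only (this sketch is not part of the claimed apparatus), the following example shows one way a hand-worn first sensor reading might be mapped to a first control signal and handed to the first transceiver for delivery to the main control device. The field names, the angle thresholds and the queue-based stand-in for the transceiver link are assumptions made for this sketch, not features disclosed by the application.

```python
# Illustrative sketch only: turning a hand-worn sensor reading into a
# "first control signal" and passing it to the first transceiver.
# Field names, thresholds and the queue-based link are assumptions.
from dataclasses import dataclass
from queue import Queue


@dataclass
class HandReading:
    pitch_deg: float   # hand tilted forward/backward
    roll_deg: float    # hand tilted left/right
    palm_open: bool    # open palm vs. clenched fist


@dataclass
class FirstControlSignal:
    forward: float   # -1.0 (move backward) .. 1.0 (move forward)
    turn: float      # -1.0 (turn left) .. 1.0 (turn right)
    ascend: bool     # open palm commands the platform to ascend / hold altitude


def _clamp(value: float) -> float:
    return max(-1.0, min(1.0, value))


def hand_to_first_signal(reading: HandReading) -> FirstControlSignal:
    """Map hand posture to platform motion commands."""
    return FirstControlSignal(
        forward=_clamp(reading.pitch_deg / 45.0),
        turn=_clamp(reading.roll_deg / 45.0),
        ascend=reading.palm_open,
    )


class FirstTransceiver:
    """Stand-in for the first transceiver: forwards signals toward the
    main control device, modelled here as a shared in-process queue."""

    def __init__(self, link_to_main: Queue):
        self.link = link_to_main

    def send(self, signal: FirstControlSignal) -> None:
        self.link.put(("first_control_signal", signal))


if __name__ == "__main__":
    link = Queue()
    transceiver = FirstTransceiver(link)
    transceiver.send(hand_to_first_signal(HandReading(pitch_deg=30.0, roll_deg=-10.0, palm_open=True)))
    print(link.get())
```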
The second control device is arranged on the head of the user when in use and outputs a second control signal to the main control device based on the head motion and head posture of the user. The second control signal is used to control a load component mounted on the controlled unmanned motion platform to execute a preset action; for example, if the load component is a camera, the second control signal can control the camera to tilt up and down, pan left and right, adjust the focal length, and so on.
As shown in Fig. 3, the second control device includes, but is not limited to, at least one second sensor 21, an audio and video acquisition device 23, and a second transceiver 22; the second transceiver is connected to the second sensor and to the audio and video acquisition device through signal lines. The second sensor is used to detect the head posture and head motion of the user and to generate a first sub-signal based on the head posture or head motion. The head posture here refers to the vector position of the user's head in space, for example facing upward, leftward or rightward. The head motion refers to a movement made by the user's head, such as nodding, shaking or swinging. The second sensor includes, but is not limited to, a gyroscope, an acceleration sensor and a magnetic sensor.
The audio and video acquisition device is used to acquire external audio and video, such as an image of a target object or of the user's hand, and to process that external audio and video to obtain a second sub-signal whose control purpose matches it. The second transceiver combines the first sub-signal and the second sub-signal into the second control signal.
The second transceiver is used to send the second control signal to the main control device in a wired or wireless manner. Accordingly, the second transceiver is a wired transceiver or a wireless transceiver, or a transceiver having both wired and wireless capabilities.
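As a purely illustrative sketch of the combination step described above, the example below merges a head-pose sub-signal and an audio/video sub-signal into a single second control signal. The data fields, the recognized gesture labels and the mapping of head yaw/pitch to camera pan/tilt are assumptions of this sketch rather than details given in the application.

```python
# Illustrative sketch only: combining a head-pose sub-signal and an
# audio/video sub-signal into the "second control signal".  Field names,
# the gesture labels and the pan/tilt/zoom mapping are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class HeadSubSignal:
    yaw_deg: float     # head turned left/right  -> camera pan
    pitch_deg: float   # head tilted up/down     -> camera tilt


@dataclass
class AvSubSignal:
    recognized_gesture: Optional[str]  # e.g. a hand gesture seen in the video


@dataclass
class SecondControlSignal:
    pan_deg: float
    tilt_deg: float
    zoom_step: int     # +1 zoom in, -1 zoom out, 0 hold


def combine_sub_signals(head: Optional[HeadSubSignal],
                        av: Optional[AvSubSignal]) -> SecondControlSignal:
    """Merge whichever sub-signals are available into one second control signal."""
    pan = head.yaw_deg if head else 0.0
    tilt = head.pitch_deg if head else 0.0
    zoom = 0
    if av and av.recognized_gesture == "pinch":     # assumed gesture label
        zoom = 1
    elif av and av.recognized_gesture == "spread":  # assumed gesture label
        zoom = -1
    return SecondControlSignal(pan_deg=pan, tilt_deg=tilt, zoom_step=zoom)


if __name__ == "__main__":
    signal = combine_sub_signals(HeadSubSignal(yaw_deg=15.0, pitch_deg=-5.0),
                                 AvSubSignal(recognized_gesture="pinch"))
    print(signal)
```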
The main control device may be worn on the user's body. After receiving the first control signal and the second control signal, it combines the two signals, or converts one of them, into a main control signal and sends the main control signal to the controlled unmanned motion platform to control the motion of the platform, the motion of the load component, or both.
As shown in Fig. 4, the main control device comprises a processor 31, a third transceiver 32, a fourth transceiver 33 and a main signal transceiver 34; the processor is connected to the third transceiver, the fourth transceiver and the main signal transceiver respectively. The third transceiver is used to receive the first control signal sent by the first control device and output it to the processor, and the fourth transceiver is used to receive the second control signal sent by the second control device and output it to the processor.
The processor is used to combine the first control signal and the second control signal, or to take either one of them alone, to generate the main control signal. After obtaining the main control signal, the processor outputs it to the main signal transceiver, which transmits the main control signal wirelessly to the controlled unmanned motion platform. The processor on the platform can then parse the main control signal into the first control signal and the second control signal, or into one of them, and these signals control the platform and the load component, thereby achieving remote control of the platform.
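The relay-and-merge behaviour of the main control device can be illustrated with the minimal sketch below, which packs whichever control signals arrived into one main control signal and hands it to the main signal transceiver. The packet layout and the print-based stand-in for the long-range radio link are assumptions for illustration only.

```python
# Illustrative sketch only: the processor of the main control device merges
# the first and/or second control signal into one main control signal and
# hands it to the main signal transceiver.  The dictionary packet layout and
# the print-based "radio link" are assumptions for illustration.
import json
from typing import Optional


def build_main_control_signal(first: Optional[dict],
                              second: Optional[dict]) -> dict:
    """Pack whichever control signals arrived into a single main control signal."""
    packet = {"type": "main_control"}
    if first is not None:
        packet["platform_motion"] = first    # drives wheels / propellers / control surfaces
    if second is not None:
        packet["payload_control"] = second   # drives the camera or other load component
    return packet


class MainSignalTransceiver:
    """Stand-in for the long-range link to the controlled unmanned motion platform."""

    def send(self, packet: dict) -> None:
        # A real device would transmit this over a radio or satellite link;
        # here the packet is simply serialized and printed.
        print("-> platform:", json.dumps(packet))


if __name__ == "__main__":
    transceiver = MainSignalTransceiver()
    first = {"forward": 0.6, "turn": -0.2, "ascend": True}
    second = {"pan_deg": 15.0, "tilt_deg": -5.0, "zoom_step": 1}
    transceiver.send(build_main_control_signal(first, second))
    transceiver.send(build_main_control_signal(first, None))  # only the platform is driven
```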
When the load component of the controlled unmanned motion platform comprises a camera, the second control signal is also used for controlling the camera to move up and down, move left and right, adjust the focal length and the like.
The main signal transceiver is also used to receive a feedback signal returned by the platform and to output the feedback signal to the processor. The feedback signal may include one or more of an audio signal, a video signal, an attitude signal, a velocity signal, a position signal, a temperature signal and a barometric pressure signal.
As shown in Fig. 5, the second control device further comprises a display device 24. The fourth transceiver is used to send the feedback signal to the second transceiver of the second control device, the second transceiver sends the feedback signal to the display device, and the display device presents the feedback signal to the user, so that while controlling the platform the user can view in real time part or all of the information obtained by the load component, such as sound and images.
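For illustration, the sketch below models a feedback packet of the kind listed above and its routing from the main control device to the display device of the second control device. The field names and the console-based "display" are assumptions of this sketch; a real head-mounted display would render the video stream and overlay the telemetry.

```python
# Illustrative sketch only: a feedback packet returned by the platform and
# routed to the display device of the second control device.  Field names
# and the console "display" are assumptions; omitted fields are simply None.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class FeedbackSignal:
    attitude_deg: Optional[Tuple[float, float, float]] = None  # (roll, pitch, yaw)
    velocity_mps: Optional[float] = None
    position: Optional[Tuple[float, float, float]] = None      # (lat, lon, alt)
    temperature_c: Optional[float] = None
    pressure_hpa: Optional[float] = None
    video_frame: Optional[bytes] = None   # compressed frame from the camera load
    audio_chunk: Optional[bytes] = None


def display(feedback: FeedbackSignal) -> None:
    """Stand-in for the head-mounted display: print the available telemetry."""
    for name, value in vars(feedback).items():
        if value is not None and not isinstance(value, bytes):
            print(f"{name}: {value}")


def route_feedback(feedback: FeedbackSignal) -> None:
    """Main control device -> fourth transceiver -> second transceiver -> display."""
    display(feedback)


if __name__ == "__main__":
    route_feedback(FeedbackSignal(attitude_deg=(1.0, -2.5, 90.0),
                                  velocity_mps=4.2,
                                  position=(39.9, 116.4, 120.0)))
```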
It can be seen from the foregoing technical solutions that this embodiment provides a distributed remote control apparatus comprising a main control device and distributed control devices, the distributed control devices including at least a first control device and a second control device. The first control device is in signal connection with the main control device and is configured to generate a first control signal based on hand motions and/or hand gestures of a user and to send the first control signal to the main control device. The second control device is in signal connection with the main control device and is configured to generate a second control signal based on head motions, head posture and/or external audio and video of the user and to send the second control signal to the main control device. The main control device is further in signal connection with the controlled unmanned motion platform and is configured to send a main control signal, comprising the first control signal and/or the second control signal, to the controlled unmanned motion platform to control the platform and the load component. Compared with the operation of different operators, the cooperation of different parts of the same operator is clearly better coordinated, which solves the problem of poor control performance during multi-person operation.
Because the main control device in the present application serves as the wireless communication relay and core computing platform, the design of the first control device and the second control device, which monitor the user's motions, is greatly simplified; and since the two control devices do not need long-range communication, design and manufacturing cost is further reduced.
The display device in the present application may be a liquid crystal display facing the user's eyes; in a specific implementation, a head-mounted AR device or a head-mounted VR device may be used.
An AR device is a head-mounted device implemented based on AR technology. AR refers to augmented reality, a technology that seamlessly integrates real-world information and virtual-world information: physical information (visual information, sound, taste, touch and the like) that would otherwise be difficult to experience within a certain time and space in the real world is simulated by computers and other scientific technologies and then superimposed, so that virtual information is applied to the real world and perceived by human senses, achieving a sensory experience beyond reality. The real environment and virtual objects are superimposed in real time onto the same picture or into the same space and exist simultaneously.
Augmented reality technology presents not only real-world information but also virtual information at the same time, and the two kinds of information complement and overlay each other. In visual augmented reality, a user wearing a head-mounted display sees the surrounding real world merged and combined with computer graphics.
The augmented reality technology comprises new technologies and new means such as multimedia, three-dimensional modeling, real-time video display and control, multi-sensor fusion, real-time tracking and registration, scene fusion and the like. Augmented reality provides information that is generally different from what human beings can perceive.
A VR device is a head-mounted device based on VR technology. VR (virtual reality) technology is a new means of human-computer interaction created with computers and the latest sensor technologies. Virtual reality uses computer simulation to generate a three-dimensional virtual world and provides the user with simulated visual, auditory, tactile and other sensations, so that the user can observe objects in the three-dimensional space in a timely and unrestricted manner, as if present in person.
VR was systematically discussed in a 1992 report of the Interactive Systems Project Working Group funded by the National Science Foundation, which established and proposed research directions for future virtual reality environments. Virtual reality technology integrates computer graphics, computer simulation, sensor technology, display technology and other scientific technologies; it creates a virtual information environment in a multidimensional information space, gives the user an immersive sense of presence and full interaction capability with the environment, and helps to inspire ideas. Immersion, interaction and imagination can be said to be the three basic characteristics of a VR environment system. The core of virtual reality technology is modeling and simulation.
Virtual reality is widely used in medicine, entertainment, military and aerospace, interior design, real estate development, industrial simulation, emergency drills, gaming, geography, education, hydrogeology, maintenance, training and drilling, shipbuilding, automotive simulation, rail transit, energy, biomechanics, rehabilitation training, digital earth, and other fields.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The technical solutions provided by the present invention have been described in detail above. The principles and implementations of the present invention are explained herein with specific examples, and the description of the above embodiments is only intended to help in understanding the method and core idea of the present invention. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A distributed remote control apparatus, comprising a main control device and distributed control devices, wherein the distributed control devices comprise at least a first control device and a second control device, wherein:
the first control device is in signal connection with the main control device and is configured to generate a first control signal based on hand motions and/or hand gestures of a user and to send the first control signal to the main control device;
the second control device is in signal connection with the main control device and is configured to generate a second control signal based on head motions, head posture and/or external audio and video of the user and to send the second control signal to the main control device;
the main control device is further in signal connection with a controlled unmanned motion platform and is configured to send a main control signal to the controlled unmanned motion platform, the main control signal being used to control a platform motion component of the unmanned motion platform to execute a preset action and to control a load component of the unmanned motion platform to execute a preset action, wherein the main control signal comprises the first control signal and/or the second control signal.
2. The distributed remote control apparatus of claim 1, wherein the first control device comprises at least a first sensor and a first transceiver, wherein:
the first sensor is arranged on a hand of the user and is configured to generate the first control signal based on the hand motion and/or the hand gesture;
the first transceiver is in signal connection with the first sensor and is configured to send the first control signal to the main control device and to receive a feedback signal sent by the main control device.
3. The distributed remote control apparatus of claim 2, wherein the first transceiver is a wireless signal transceiver or a wired signal transceiver.
4. The distributed remote control apparatus of claim 1, wherein the second control device comprises at least one second sensor, an audio and video acquisition device, and a second transceiver, wherein:
the second sensor is arranged on the head of the user and is configured to generate a first sub-signal based on the head motion and/or the head posture;
the audio and video acquisition device is configured to acquire the external audio and video and to obtain a second sub-signal based on the external audio and video;
the second transceiver is in signal connection with the second sensor and the audio and video acquisition device respectively and is configured to send the second control signal to the main control device, the second control signal comprising the first sub-signal and/or the second sub-signal.
5. The distributed remote control apparatus of claim 1, wherein the main control device comprises a third transceiver, a fourth transceiver, and a main signal transceiver, wherein:
the third transceiver is configured to receive the first control signal;
the fourth transceiver is configured to receive the second control signal;
the main signal transceiver is configured to send the main control signal to the controlled unmanned motion platform.
6. The distributed remote control apparatus of claim 5, wherein the main signal transceiver is further configured to receive a feedback signal returned by the unmanned motion platform.
7. The distributed remote control apparatus of claim 6, wherein the feedback signal comprises some or all of an audio signal, a video signal, an attitude signal, a velocity signal, a position signal, a temperature signal, a modal signal, and a barometric pressure signal.
8. The distributed remote control apparatus of claim 6, wherein the second control device comprises a display device, wherein:
the second control device is further configured to receive the feedback signal sent by the fourth transceiver;
the display device is configured to display the feedback signal.
9. The distributed remote control apparatus of claim 8, wherein the display device is disposed on a user's head and facing the user's eyes.
10. The distributed remote control apparatus of claim 1, wherein the second control device is a head-mounted VR device or a head-mounted AR device.
CN202011269866.4A 2020-07-15 2020-11-13 Distributed remote control device Pending CN112383345A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2020106797840 2020-07-15
CN2020213895256 2020-07-15
CN202021389525 2020-07-15
CN202010679784.0A CN111917454A (en) 2020-07-15 2020-07-15 Distributed remote control device

Publications (1)

Publication Number Publication Date
CN112383345A 2021-02-19

Family

ID=74582481

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011269866.4A Pending CN112383345A (en) 2020-07-15 2020-11-13 Distributed remote control device
CN202022625376.5U Active CN213186104U (en) 2020-07-15 2020-11-13 Distributed remote control device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202022625376.5U Active CN213186104U (en) 2020-07-15 2020-11-13 Distributed remote control device

Country Status (1)

Country Link
CN (2) CN112383345A (en)

Also Published As

Publication number Publication date
CN213186104U (en) 2021-05-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination