CN111796740A - Unmanned vehicle control method, device and system based on wearable intelligent equipment - Google Patents


Info

Publication number
CN111796740A
CN111796740A (application CN202010677177.0A)
Authority
CN
China
Prior art keywords
unmanned vehicle
control
information
image
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010677177.0A
Other languages
Chinese (zh)
Inventor
斯戈泰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neolix Technologies Co Ltd
Original Assignee
Jiashan Neolithic Zhiniu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiashan Neolithic Zhiniu Technology Co ltd filed Critical Jiashan Neolithic Zhiniu Technology Co ltd
Priority to CN202010677177.0A
Publication of CN111796740A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/40: Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44: Services specially adapted for particular environments, situations or purposes for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • H04W 4/80: Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an unmanned vehicle control method, device and system based on a wearable smart device, relating to the technical field of unmanned vehicle control. The method comprises the following steps: when the real-view image acquired by the wearable smart device includes the unmanned vehicle, generating an AR control menu image superimposed on the real-view image according to control menu information sent by the unmanned vehicle; then projecting the AR control menu image into the eyes of the user wearing the wearable smart device and receiving the user's operation input for the AR control menu image; and finally controlling the unmanned vehicle (such as an autonomous or driverless vehicle) to execute the corresponding action according to the operation information. Control instructions for the unmanned vehicle can thus be input quickly through the wearable smart device, with high control efficiency and strong interactivity.

Description

Unmanned vehicle control method, device and system based on wearable intelligent equipment
Technical Field
The application relates to the technical field of unmanned vehicle control, in particular to an unmanned vehicle control method, device and system based on wearable intelligent equipment.
Background
With the development of unmanned-driving technology, unmanned vehicles are increasingly used in goods distribution. Existing unmanned vehicles are generally controlled through a vehicle-mounted touch screen or a smartphone. In the first case, when the unmanned vehicle needs to execute an action, the user must walk up to the vehicle, call up an operation interface on the vehicle-mounted touch screen, and input a control instruction there, so control efficiency is low. In the second case, the user scans the unique identification code on the vehicle body with a smartphone to open a control interface and then inputs a control command there, so the input steps are cumbersome, interactivity is poor, and control efficiency is again low. In short, existing unmanned vehicle control methods suffer from low control efficiency and poor interactivity.
Disclosure of Invention
An object of the embodiments of the present application is to provide an unmanned vehicle control method, device and system based on a wearable smart device, through which control instructions can be input quickly, with high control efficiency and good interactivity.
A first aspect of the embodiments of the present application provides a wearable-smart-device-based unmanned vehicle control method, comprising the following steps:
when the real-view image acquired by the wearable intelligent device comprises the unmanned vehicle, generating an AR control menu image superposed on the real-view image according to control menu information sent by the unmanned vehicle;
projecting the AR control menu image into the eyes of a user wearing the wearable intelligent device, and receiving operation information input by the user and aiming at the AR control menu image;
and controlling the unmanned vehicle to execute corresponding actions according to the operation information.
In the implementation process, an AR control menu image superimposed on the real-view image is generated according to the control menu information sent by the unmanned vehicle; the AR control menu image is then projected into the eyes of the user wearing the wearable smart device, and the user's operation input for the AR control menu image is received; finally, the unmanned vehicle is controlled to execute the corresponding action according to the operation information. Control instructions can thus be input quickly through the wearable smart device, with high control efficiency and strong interactivity.
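The three claimed steps can be sketched as a minimal, self-contained simulation. All class and function names here (`Vehicle`, `control_step`, the menu entries) are hypothetical illustrations, not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    """Toy stand-in for the unmanned vehicle's control-menu endpoint."""
    menu: tuple = ("open cargo box", "show goods", "cancel")
    executed: list = field(default_factory=list)

    def execute(self, command: str) -> None:
        self.executed.append(command)

def control_step(vehicle_in_view: bool, vehicle: Vehicle, user_choice: int):
    """One pass of the claimed method: detect the vehicle, build the AR menu
    from the vehicle's menu info, then apply the user's operation input."""
    if not vehicle_in_view:          # step 1: vehicle must appear in the real-view image
        return None
    ar_menu = list(vehicle.menu)     # step 1 (cont.): content of the AR control menu image
    command = ar_menu[user_choice]   # step 2: operation info selects a menu entry
    vehicle.execute(command)         # step 3: vehicle executes the corresponding action
    return command
```

Note that no command is issued at all when the vehicle is absent from the real-view image, matching the claim's precondition.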
Further, the method further comprises:
acquiring a real-view image of the wearable intelligent device, and judging whether the real-view image comprises a real image of the unmanned vehicle;
if yes, acquiring a real-time distance between the unmanned vehicle and the wearable intelligent device;
judging whether the real-time distance is smaller than a preset distance or not;
and if the real-time distance is less than the preset distance, executing the step of generating the AR control menu image superimposed on the real-view image according to the control menu information sent by the unmanned vehicle.
In the implementation process, before the AR control menu image is generated, the method first judges whether the acquired real-view image includes a real image of the unmanned vehicle, that is, whether the user wearing the wearable smart device can see the vehicle; after the user is judged to have seen the vehicle, it then judges whether the real-time distance between the user and the vehicle is less than the preset distance. This avoids interaction failure or a poor interaction experience caused by an excessive distance, which helps to improve interaction efficiency.
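The visibility-and-distance gate described above reduces to a small predicate. The 5-metre threshold below is an assumed placeholder; the patent does not specify a value for the preset distance:

```python
PRESET_DISTANCE_M = 5.0  # assumed value; the patent leaves the preset distance unspecified

def should_generate_ar_menu(vehicle_in_view: bool, real_time_distance_m: float,
                            preset_m: float = PRESET_DISTANCE_M) -> bool:
    """Generate the AR control menu only when the unmanned vehicle appears in
    the real-view image AND is closer than the preset distance."""
    return vehicle_in_view and real_time_distance_m < preset_m
```

When either condition fails, the method simply continues acquiring real-view images, as in Example 2's loop.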
Further, when the real-view image acquired by the wearable smart device includes the unmanned vehicle, the method further includes:
receiving push information sent by the unmanned vehicle;
and generating an AR information push image according to the push information, and projecting the AR information push image to the eyes of a user wearing the wearable intelligent device.
In the implementation process, when the unmanned vehicle appears in the field of view, the push information sent by the vehicle is received, a corresponding AR information push image is generated, and that image is projected into the eyes of the user wearing the wearable smart device. Relevant advertising information is thereby pushed to the user in time, making it convenient for the user to learn about the unmanned vehicle.
Further, generating an AR control menu image superimposed on the real-view image according to control menu information sent by the unmanned vehicle, includes:
receiving control menu information sent by the unmanned vehicle, and generating a control menu display image according to the control menu information;
determining a real image of the unmanned vehicle in the real-view image;
and coupling the control menu display image and the reality image to obtain an AR control menu image.
In the implementation process, once the user sees the unmanned vehicle, the control menu information sent by the vehicle is received automatically, and the control menu display image is coupled with the real image through AR technology to generate the AR control menu image. The user does not need to perform any active operation to trigger the vehicle's control menu, which improves interaction convenience and further improves the user's interaction experience.
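A rough sketch of the coupling step: the control-menu overlay is placed next to the vehicle's region in the real-view image. The anchoring rule (menu drawn just to the right of the vehicle's bounding box) is an illustrative assumption, not a layout the patent prescribes:

```python
def couple_menu_with_vehicle(menu_lines, vehicle_bbox, margin=10, line_height=20):
    """Return (x, y, text) placements for each menu line, anchored beside the
    vehicle's bounding box (x, y, w, h) located in the real-view image."""
    x, y, w, _h = vehicle_bbox
    anchor_x = x + w + margin  # hypothetical layout: menu to the right of the vehicle
    return [(anchor_x, y + i * line_height, line)
            for i, line in enumerate(menu_lines)]
```

This mirrors the effect shown in fig. 6, where the menu (A) sits beside the real image of the vehicle (B).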
Further, the operation information is one or more of user eye gazing information, user gesture information and user voice information.
In the implementation process, the user can input the operation information in a gesture, voice and eye watching mode, the steps are simple, the interaction convenience is improved, and the interaction experience of the user is further improved.
Further, controlling the unmanned vehicle to execute corresponding actions according to the operation information, including:
identifying the operation information to obtain an operation identification result;
generating control information according to the operation identification result and the AR control menu image;
and sending the control information to the unmanned vehicle so that the unmanned vehicle executes corresponding actions according to the control information.
In the implementation process, the operation information input by the user is recognized, the command triggered by the user is determined from the operation information together with the AR control menu image, and corresponding control information is generated. The wearable smart device sends this control information to the unmanned vehicle, which executes the corresponding action after receiving it.
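The recognize-then-map step can be sketched as follows. The operation-info fields and the control-info layout are hypothetical examples; the patent does not define concrete message formats:

```python
def operation_to_control(operation: dict, ar_menu: list) -> dict:
    """Map recognized operation info (voice text, or a gaze-resolved item index)
    onto an AR-menu entry and build the control info sent to the vehicle."""
    kind = operation["kind"]
    if kind == "voice":
        index = ar_menu.index(operation["text"])  # spoken command matches a menu entry
    elif kind == "gaze":
        index = operation["item_index"]           # entry resolved from the fixation point
    else:
        raise ValueError(f"unsupported operation kind: {kind}")
    return {"command": ar_menu[index], "source": kind}
```

Gesture input would follow the same pattern, resolving a recognized hand motion to a menu index before the control info is built.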
The second aspect of the embodiments of the present application provides an unmanned vehicle control apparatus based on a wearable smart device, where the unmanned vehicle control apparatus based on a wearable smart device includes:
the generating unit is used for generating an AR control menu image superposed on the real-view image according to control menu information sent by the unmanned vehicle when the real-view image acquired by the wearable intelligent device comprises the unmanned vehicle;
a projection unit for projecting the AR control menu image in the eyes of a user wearing the wearable smart device;
a receiving unit for receiving operation information for the AR control menu image input by a user;
and the control unit is used for controlling the unmanned vehicle to execute corresponding actions according to the operation information.
In the implementation process, when the real-view image acquired by the wearable smart device includes the unmanned vehicle, the generating unit generates an AR control menu image superimposed on the real-view image according to the control menu information sent by the vehicle; the projection unit projects the AR control menu image into the eyes of the user wearing the device; the receiving unit receives the user's operation input for the AR control menu image; and finally the control unit controls the unmanned vehicle to execute the corresponding action according to the operation information. Control commands can thus be input quickly through the wearable smart device, with high control efficiency and strong interactivity.
A third aspect of the embodiments of the present application provides a wearable-smart-device-based unmanned vehicle control system, comprising a wearable smart device and an unmanned vehicle, wherein:
the wearable intelligent device is used for judging whether the real-view image of the wearable intelligent device comprises the unmanned vehicle or not, and if so, establishing communication connection with the unmanned vehicle;
the unmanned vehicle is used for sending control menu information of the unmanned vehicle to the wearable intelligent equipment through the communication connection;
the wearable intelligent equipment is used for receiving control menu information sent by the unmanned vehicle; generating an AR control menu image superimposed on the real-view image according to the control menu information; the AR control menu image is projected to the eyes of a user wearing the wearable intelligent device, and operation information for the AR control menu image input by the user is received; generating control information according to the operation information and sending the control information to the unmanned vehicle;
and the unmanned vehicle is used for receiving the control information and executing corresponding actions according to the control information.
In the implementation process, when the wearable smart device determines that its real-view image includes the unmanned vehicle, a communication connection is established automatically between the two. The unmanned vehicle then sends its control menu information over this connection; on receiving it, the wearable smart device generates an AR control menu image superimposed on the real-view image and projects it into the eyes of the user wearing the device. The device then receives the user's operation input for the AR control menu image and generates the corresponding control information. Finally, the unmanned vehicle executes the corresponding action according to the control information sent by the wearable smart device. Control instructions can thus be input quickly, with high control efficiency and strong interactivity.
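The device-vehicle exchange claimed for the system can be replayed as a four-message sequence. The message names below are illustrative assumptions, not a protocol defined by the patent:

```python
def simulate_exchange(vehicle_in_view: bool, chosen_command: str = "open cargo box"):
    """Replay the claimed sequence: connect -> menu info -> control info -> action.
    Returns the ordered message log of one session."""
    log = []
    if not vehicle_in_view:  # no vehicle in the real-view image: no connection is made
        return log
    log.append("device->vehicle: establish communication connection")
    log.append("vehicle->device: control menu info")
    log.append(f"device->vehicle: control info ({chosen_command})")
    log.append(f"vehicle: executes {chosen_command}")
    return log
```

The useful property to notice is that every message after the first is gated on the visibility check, so the vehicle never transmits its menu to a device that cannot see it.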
A fourth aspect of the embodiments of the present application provides a computer device, including a memory and a processor, where the memory is used to store a computer program, and the processor runs the computer program to make the computer device execute the wearable smart device-based unmanned vehicle control method according to any one of the first aspect of the embodiments of the present application.
A fifth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores computer program instructions, where the computer program instructions, when read and executed by a processor, perform the wearable smart device-based unmanned vehicle control method according to any one of the first aspect of the embodiments of the present application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic flowchart of an unmanned vehicle control method based on a wearable smart device according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a method for controlling an unmanned vehicle based on a wearable smart device according to a second embodiment of the present application;
fig. 3 is a schematic structural diagram of an unmanned vehicle control device based on a wearable smart device according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of another unmanned vehicle control device based on a wearable smart device according to a third embodiment of the present application;
fig. 5 is a schematic system architecture diagram of an unmanned vehicle control system based on a wearable smart device according to a fourth embodiment of the present application;
fig. 6 is a schematic diagram of an effect of superimposing an AR control menu image and an actual unmanned vehicle according to an embodiment of the present application.
Icon: 410-wearable smart device, 420-unmanned vehicle.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Example 1
Referring to fig. 1, fig. 1 is a schematic block diagram of a flow of an unmanned vehicle control method based on a wearable smart device according to an embodiment of the present application. As shown in fig. 1, the wearable smart device-based unmanned vehicle control method includes:
s101, when the real-view image acquired by the wearable intelligent device comprises the unmanned vehicle, generating an AR control menu image superposed on the real-view image according to control menu information sent by the unmanned vehicle.
In this embodiment of the application, the wearable smart device may be any wearable device with an AR function, such as AR glasses or an AR helmet, which is not limited herein.
In this embodiment, the wearable smart device has an independent operating system, like a smartphone, on which the user can install software and games provided by third-party service providers. Through voice or motion control, the device can add schedules, navigate maps, interact with friends, take photos and videos, and make video calls with friends, and it can access the Internet wirelessly over a mobile communication network.
In this embodiment, Augmented Reality (AR) is a technology that calculates the position and angle of a camera image in real time and adds a corresponding virtual image, seamlessly integrating real-world and virtual-world information so that the virtual world is overlaid on the real world and can interact with it.
In this embodiment, the wearable smart device has the advantages of being simple and convenient to use and small in size, and can project a virtual image into the human eye, thereby achieving the effect of superimposing the virtual image on the real world.
In this embodiment, a camera arranged on the wearable smart device acquires the device's real-view image in real time. This image is an actual real image and matches the field of vision seen by the eyes of the user wearing the device, so when the real-view image includes the unmanned vehicle, the user can also see the vehicle. Once the user sees the unmanned vehicle, the wearable smart device receives the control menu information sent by the vehicle and generates an AR control menu image superimposed on the real-view image according to it. The device thus automatically recognizes the unmanned vehicle and triggers the display information (the AR control menu image) without requiring the user to actively trigger the vehicle's control menu, which improves interaction convenience and further improves the user's interaction experience.
S102, projecting the AR control menu image to the eyes of a user wearing the wearable intelligent device, and receiving operation information input by the user and aiming at the AR control menu image.
In this embodiment, a projection device is arranged on the wearable smart device, which can project the AR control menu image onto the retina of the user's eyes, so that the user sees the AR control menu image superimposed on the actually visible unmanned vehicle.
Referring to fig. 6, fig. 6 is a schematic diagram illustrating an effect of superimposing an AR control menu image and an actual unmanned vehicle according to the present embodiment. Fig. 6 shows a schematic view of the effect seen by a user wearing the wearable smart device, where a is an AR control menu image seen by human eyes, and B is a real image of an unmanned vehicle seen by human eyes.
In the embodiment of the present application, the operation information is one or more of user eye gazing information, user gesture information, and user voice information, and the embodiment of the present application is not limited.
As an optional implementation, the operation input for the AR control menu image is given by voice control, which gives the user a more natural and relaxed interactive experience. Voice control means having the computing device understand human speech and execute the corresponding instructions according to what was said. For a wearable smart device that is small and worn on the body, voice control is an effective interaction mode.
In the above embodiment, the core of voice control is speech recognition. Bone conduction technology can be adopted to recognize and transmit speech efficiently: an indirect bone conduction sensor is arranged on the wearable smart device, with a sound transducer set in each temple arm. The device then transmits the sound generated when the transducer vibrates through the bone on the side of the user's head to the inner ear, so that the user can hear the audio output by the wearable smart device.
As an optional implementation manner, the operation information for the AR control menu image may be input in a three-dimensional gesture recognition manner, so as to complete the interactive function of the wearable smart device, which is advantageous in that a non-contact manner is adopted.
In the above embodiments, depth information is used for three-dimensional gesture recognition, which can recognize various gestures, hand shapes and motions. Acquiring the depth information requires special hardware, which, combined with a recognition algorithm, realizes three-dimensional gesture recognition. The wearable smart device may be provided with a gesture recognition sensor, such as a non-contact optical IR gesture sensor, a 3D gesture recognition chip, a gesture recognition armband, or a smart ring for gesture control, which is not limited herein. The non-contact optical IR gesture sensor integrates a four-in-one sensor module for gesture recognition, ambient light detection, proximity sensing and color sensing; the 3D gesture recognition chip senses gestures without contact under the action of an electric field and can determine coordinate positions within a distance of 15 cm at a high precision of 150 dpi; the smart ring for gesture control has a built-in inertial sensor module, a processor and a low-power Bluetooth module.
As an alternative embodiment, the operation information for the AR control menu image may be input by means of eye tracking, i.e., a process of measuring a fixation point of the eye or a motion state of the eye relative to the head. The wearable smart device is also capable of perceiving the emotion of the user through eye tracking technology to determine the user's reaction to the target of gaze.
In the above embodiments, the eye tracking measurement technique is based primarily on image and video measurements, and encompasses a variety of techniques for measuring distinguishable eye movement characteristics, such as the limbus of the sclera or iris, the light intensity of corneal reflections, and the apparent shape of the pupil. Methods based on images, combined with pupil shape changes and corneal reflections are widely used in measuring points of interest of a user's gaze.
After step S102, the method further includes the following steps:
and S103, controlling the unmanned vehicle to execute corresponding actions according to the operation information.
In the embodiment of the application, the wearable intelligent device can identify operation information input by a user, determine command information triggered by the user according to the operation information and the AR control menu image, generate corresponding control information according to the command information, send the control information to the unmanned vehicle, and execute corresponding actions according to the control information after the unmanned vehicle receives the control information.
In actual use, when the unmanned vehicle is a delivery vehicle, the user can input pickup information through the wearable smart device, and the vehicle opens its cargo box after receiving it; when the unmanned vehicle is a mobile vending vehicle, the user can select goods, place orders and pay through the wearable smart device. Combining the wearable smart device and the unmanned vehicle through augmented reality makes the vehicle more convenient to control, keeps the operation steps simple, and thus improves the user's interaction experience.
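The delivery scenario can be sketched as a pickup handler on the vehicle side. The mapping from pickup information to a specific cargo box is an assumed detail the patent does not spell out:

```python
def handle_pickup(cargo_boxes: dict, pickup_code: str) -> str:
    """Open the cargo box matching the pickup info sent from the wearable
    smart device; reject pickup codes with no matching box."""
    box = cargo_boxes.get(pickup_code)
    if box is None:
        return "pickup code rejected"
    return f"box {box} opened"
```

A vending vehicle would extend the same dispatch with order and payment commands resolved from the AR menu.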
In this embodiment of the present application, an execution subject of the method may be a wearable smart device, a wearable device with an AR function, and the like, which is not limited in this embodiment.
Therefore, by implementing the unmanned vehicle control method based on the wearable intelligent device described in fig. 1, the control instruction can be rapidly input through the wearable intelligent device, the control efficiency is high, and the interactivity is good.
Example 2
Referring to fig. 2, fig. 2 is a schematic block diagram of a flow of an unmanned vehicle control method based on a wearable smart device according to an embodiment of the present application. As shown in fig. 2, the wearable smart device-based unmanned vehicle control method includes:
s201, acquiring a real-view image of the wearable intelligent device.
In this embodiment, a camera arranged on the wearable smart device acquires the device's real-view image in real time; this image is an actual real image and matches the field of vision seen by the eyes of the user wearing the device.
S202, judging whether the real view field image comprises a real image of the unmanned vehicle, and if so, executing a step S203; if not, executing step S201 to continue to acquire the real-field image of the wearable smart device.
In this embodiment of the application, when the real-view image acquired by the wearable smart device includes the unmanned vehicle, this indicates that the user has also seen the unmanned vehicle.
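Steps S201 and S202 can be sketched as a simple polling loop. This is an illustrative sketch only: `detect_vehicle` is a placeholder for whatever object detector the device actually uses, and the frame representation is invented for the example.

```python
# Sketch of steps S201-S202: poll camera frames until an unmanned vehicle
# appears in the real-view image. The detector and frame format are
# placeholders, not part of the patent's disclosure.

def detect_vehicle(frame):
    """Placeholder detector: return a bounding box (x, y, w, h) or None.
    A real system would run a vision model on the camera frame here."""
    return frame.get("vehicle_bbox")

def wait_for_vehicle(frames):
    """Iterate over camera frames; return the first frame containing a
    vehicle together with its bounding box, or (None, None) if none appears."""
    for frame in frames:
        bbox = detect_vehicle(frame)
        if bbox is not None:
            return frame, bbox
    return None, None
```

In the method of fig. 2, a frame without a detection simply loops back to acquiring the next real-view image, which the iteration above mirrors.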
As an alternative embodiment, when it is determined that the real-field image includes a real image of an unmanned vehicle, the method may further include the steps of:
receiving push information sent by the unmanned vehicle;
and generating an AR information push image according to the push information, and projecting the AR information push image to the eyes of a user wearing the wearable intelligent device.
In the above embodiment, when the unmanned vehicle appears in the field of view, the push information of the unmanned vehicle is received, a corresponding AR information push image is generated, and that image is projected to the eyes of the user wearing the wearable smart device. Relevant advertisement information is thereby pushed to the user in time, making it convenient for the user to learn about the unmanned vehicle.
S203, acquiring a real-time distance between the unmanned vehicle and the wearable intelligent device.
After step S203, the following steps are also included:
s204, judging whether the real-time distance is smaller than a preset distance, and if so, executing a step S205; if not, executing step S201 to continue to acquire the real-field image of the wearable smart device.
In the embodiment of the application, when the distance between the unmanned vehicle and the user wearing the wearable smart device is smaller than the preset distance, the distance between them is moderate, and the user can interact with the unmanned vehicle at this moment, which improves the user's interactive experience.
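The distance gate of step S204 reduces to a single comparison. The 5-metre default below is an illustrative value chosen for the sketch; the patent does not specify a threshold.

```python
def within_interaction_range(real_time_distance_m, preset_distance_m=5.0):
    """Step S204 sketch: interaction is enabled only when the real-time
    distance to the vehicle is smaller than the preset distance.
    The default threshold is illustrative, not from the disclosure."""
    return real_time_distance_m < preset_distance_m
```

When the check fails, the method returns to step S201 and keeps acquiring real-view images.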
And S205, receiving control menu information sent by the unmanned vehicle, and generating a control menu display image according to the control menu information.
After step S205, the following steps are also included:
and S206, determining a real image of the unmanned vehicle in the real view image.
And S207, coupling the control menu display image and the reality image to obtain an AR control menu image.
In the embodiment of the present application, by performing the above-described steps S205 to S207, the AR control menu image superimposed on the real image can be generated from the control menu information sent by the unmanned vehicle and the real image.
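One way to picture the coupling of steps S205 to S207 is anchoring the menu next to the vehicle's bounding box in the real-view image. The 2D coordinates and layout offsets below are invented for illustration; a real AR renderer would place the menu in 3D space.

```python
def couple_menu_to_vehicle(menu_items, vehicle_bbox):
    """Steps S205-S207 sketch: place the control menu display image beside the
    vehicle's real image so the composited frame shows the menu superimposed
    on the real view. All coordinates/offsets are illustrative."""
    x, y, w, h = vehicle_bbox
    anchor = (x + w + 10, y)  # menu origin: just to the right of the vehicle
    return {
        "anchor": anchor,
        "entries": [
            {"label": item, "position": (anchor[0], anchor[1] + 30 * i)}
            for i, item in enumerate(menu_items)
        ],
    }
```

The returned structure stands in for the AR control menu image that is subsequently projected to the user's eyes in step S208.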
After step S207, the following steps are also included:
and S208, projecting the AR control menu image to the eyes of the user wearing the wearable intelligent device, and receiving operation information for the AR control menu image input by the user.
In the embodiment of the present application, the operation information is one or more of user eye gazing information, user gesture information, and user voice information, and the embodiment of the present application is not limited.
S209, the operation information is subjected to identification processing, and an operation identification result is obtained.
In the embodiment of the application, the eye gaze information of the user can be identified through an eye tracking technology, the gesture information of the user is identified through a gesture identification technology, and the voice information of the user is identified through a voice identification technology, so that the embodiment of the application is not limited.
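The recognition of step S209 amounts to routing each input modality to its own recognizer. The sketch below uses trivial stand-ins where the patent would use eye-tracking, gesture-recognition, and speech-recognition models; the dictionary keys and field names are assumptions of the example.

```python
def recognize_operation(operation_info):
    """Step S209 sketch: dispatch the operation information to the recognizer
    matching its modality. The lambdas are placeholders for real eye-tracking,
    gesture, and voice recognition models."""
    recognizers = {
        "gaze": lambda d: {"selected_entry": d["fixated_entry"]},
        "gesture": lambda d: {"selected_entry": d["pointed_entry"]},
        "voice": lambda d: {"selected_entry": d["spoken_entry"]},
    }
    return recognizers[operation_info["kind"]](operation_info)
```

Because the operation information may combine modalities, a fuller implementation would merge several recognition results rather than dispatch on a single kind.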
And S210, generating control information according to the operation recognition result and the AR control menu image.
And S211, sending the control information to the unmanned vehicle so that the unmanned vehicle executes corresponding actions according to the control information.
In the embodiment of the present application, by performing the above-described steps S209 to S211, the unmanned vehicle can be controlled to perform a corresponding action according to the operation information.
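Steps S210 and S211 map the recognition result back onto the AR control menu and serialize a control message for the vehicle. The JSON schema below is invented for the sketch; the patent does not define a wire format.

```python
import json

def build_control_info(recognition_result, ar_menu):
    """Steps S210-S211 sketch: validate the recognized selection against the
    AR control menu image, then build the control information to send to the
    unmanned vehicle. The message schema is an assumption of this example."""
    label = recognition_result["selected_entry"]
    if label not in [entry["label"] for entry in ar_menu["entries"]]:
        raise ValueError(f"unknown menu entry: {label}")
    return json.dumps({"command": label, "source": "wearable"})
```

The resulting string would be sent over the established communication connection, and the vehicle would execute the corresponding action.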
Therefore, by implementing the unmanned vehicle control method based on the wearable intelligent device described in fig. 2, the control instruction can be rapidly input through the wearable intelligent device, the control efficiency is high, and the interactivity is good.
Example 3
Referring to fig. 3, fig. 3 is a schematic block diagram of a structure of an unmanned vehicle control device based on a wearable smart device according to an embodiment of the present application. As shown in fig. 3, the wearable smart device-based unmanned vehicle control apparatus includes:
the generating unit 310 is configured to generate, when the real-view image acquired by the wearable smart device includes an unmanned vehicle, an AR control menu image superimposed on the real-view image according to control menu information sent by the unmanned vehicle;
a projection unit 320 for projecting the AR control menu image to the eyes of the user wearing the wearable smart device.
A receiving unit 330 for receiving operation information for the AR control menu image input by the user.
And the control unit 340 is used for controlling the unmanned vehicle to execute corresponding actions according to the operation information.
Referring to fig. 4, fig. 4 is a block diagram illustrating a structure of another unmanned vehicle control device based on a wearable smart device according to an embodiment of the present application. The wearable smart device-based unmanned vehicle control device shown in fig. 4 is optimized by the wearable smart device-based unmanned vehicle control device shown in fig. 3. As shown in fig. 4, the wearable smart device-based unmanned vehicle control apparatus further includes:
an acquiring unit 350, configured to acquire a real-view image of the wearable smart device;
a judging unit 360 for identifying whether the real-view image includes a real image of the unmanned vehicle;
the acquiring unit 350 is further configured to acquire a real-time distance between the unmanned vehicle and the wearable intelligent device when it is determined that the real-view image includes a real image;
the judging unit 360 is further configured to judge whether the real-time distance is smaller than the preset distance; if so, the generating unit 310 is triggered to generate an AR control menu image superimposed on the real image according to the control menu information sent by the unmanned vehicle and the real image.
As an optional implementation manner, the receiving unit 330 is further configured to receive push information sent by the unmanned vehicle when it is determined that the real-field image includes a real image of the unmanned vehicle.
The generating unit 310 is further configured to generate an AR information push image according to the push information.
And the projection unit 320 is further configured to project the AR information push image to the eyes of the user wearing the wearable smart device.
As an optional implementation, the generating unit 310 includes:
and the first sub-unit 311 is configured to receive control menu information sent by the unmanned vehicle, and generate a control menu display image according to the control menu information.
A second subunit 312, configured to determine a real image of the unmanned vehicle in the real-view image.
And a third subunit 313, configured to perform coupling processing on the control menu display image and the real image to obtain an AR control menu image.
In the embodiment of the present application, the operation information is one or more of user eye gazing information, user gesture information, and user voice information, and the embodiment of the present application is not limited.
As an alternative embodiment, the control unit 340 includes:
and a fourth subunit 341, configured to perform identification processing on the operation information to obtain an operation identification result.
A fifth sub-unit 342 for generating control information based on the operation recognition result and the AR control menu image.
The sixth subunit 343 is configured to send the control information to the unmanned vehicle, so that the unmanned vehicle executes a corresponding action according to the control information.
It can be seen that the wearable smart device-based unmanned vehicle control apparatus described in this embodiment allows control instructions to be input rapidly through the wearable smart device, with high control efficiency and good interactivity.
Example 4
Referring to fig. 5, fig. 5 is a schematic diagram of a system architecture of an unmanned vehicle control system based on a wearable smart device according to an embodiment of the present application. As shown in fig. 5, the wearable smart device-based unmanned vehicle control system includes a wearable smart device 410 and an unmanned vehicle 420, wherein,
and the wearable intelligent device 410 is used for judging whether the real-view image of the wearable intelligent device 410 comprises the unmanned vehicle 420 or not, and if so, establishing communication connection with the unmanned vehicle 420.
In this embodiment of the application, the wearable smart device 410 may establish a communication connection with the unmanned vehicle 420 in a short-range communication manner, specifically via Bluetooth, a local area network, or the like, which is not limited in this embodiment of the application.
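A minimal sketch of that connection setup, under the assumption (not stated in the patent) that Bluetooth is preferred and the local area network is a fallback; the link names are illustrative labels, not a real radio API:

```python
def choose_link(available_links):
    """Sketch of the short-range connection setup: prefer Bluetooth,
    fall back to a local-area network. The preference order and the
    string labels are assumptions of this example."""
    for preferred in ("bluetooth", "lan"):
        if preferred in available_links:
            return preferred
    return None
```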
And the unmanned vehicle 420 is used for sending the control menu information of the unmanned vehicle 420 to the wearable intelligent device 410 through the communication connection.
The wearable intelligent device 410 is used for receiving control menu information sent by the unmanned vehicle 420; generating an AR control menu image superimposed on the real-view image according to the control menu information; projecting the AR control menu image into the eyes of the user wearing the wearable smart device 410, and receiving operation information input by the user for the AR control menu image; and generating control information according to the operation information and transmitting the control information to the unmanned vehicle 420.
And the unmanned vehicle 420 is used for receiving the control information and executing corresponding actions according to the control information.
In practical use, when the unmanned vehicle 420 is a delivery vehicle, a user may input pickup information through the wearable smart device 410, and the unmanned vehicle 420 performs a box-opening operation after receiving the pickup information; when the unmanned vehicle 420 is a mobile vending vehicle, the user can select goods, place orders, make payments, and perform similar operations through the wearable smart device 410. Combining the wearable smart device 410 and the unmanned vehicle 420 for interaction through augmented reality technology improves the convenience of controlling the unmanned vehicle 420 and keeps the operation steps simple, thereby enhancing the user's interactive experience.
It can be seen that the wearable smart device-based unmanned vehicle control system described in this embodiment allows control instructions to be input rapidly through the wearable smart device, with high control efficiency and good interactivity.
In addition, the invention also provides computer equipment. The computer device comprises a memory and a processor, wherein the memory can be used for storing a computer program, and the processor can be used for causing the computer device to execute the method or the functions of each module in the unmanned vehicle control device based on the wearable intelligent device through running the computer program.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to the use of the mobile terminal (such as audio data, a phonebook, etc.), and the like. Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The embodiment also provides a computer storage medium for storing a computer program used in the computer device.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. An unmanned vehicle control method based on wearable intelligent equipment is characterized by comprising the following steps:
when the real-view image acquired by the wearable intelligent device comprises the unmanned vehicle, generating an AR control menu image superposed on the real-view image according to control menu information sent by the unmanned vehicle;
projecting the AR control menu image into the eyes of a user wearing the wearable intelligent device, and receiving operation information of the user aiming at the AR control menu image;
and controlling the unmanned vehicle to execute corresponding actions according to the operation information.
2. The wearable smart device-based unmanned vehicle control method of claim 1, further comprising:
acquiring a real-view image of the wearable intelligent device, and judging whether the real-view image comprises a real image of the unmanned vehicle;
if yes, acquiring a real-time distance between the unmanned vehicle and the wearable intelligent device;
judging whether the real-time distance is smaller than a preset distance or not;
and if the real-time distance is smaller than the preset distance, performing the step of generating an AR control menu image superimposed on the real-view image according to the control menu information sent by the unmanned vehicle.
3. The wearable smart device-based unmanned vehicle control method of claim 1, wherein when the real-view image acquired by the wearable smart device includes the unmanned vehicle, the method further comprises:
receiving push information sent by the unmanned vehicle;
and generating an AR information push image according to the push information, and projecting the AR information push image to the eyes of a user wearing the wearable intelligent device.
4. The unmanned vehicle control method based on wearable intelligent equipment of claim 1, wherein generating an AR control menu image superimposed on the real-view image according to control menu information sent by the unmanned vehicle comprises:
receiving control menu information sent by the unmanned vehicle, and generating a control menu display image according to the control menu information;
determining a real image of the unmanned vehicle in the real-view image;
and coupling the control menu display image and the reality image to obtain an AR control menu image.
5. The unmanned vehicle control method based on wearable intelligent device of claim 1, wherein the operation information is one or more of user eye gaze information, user gesture information, and user voice information.
6. The unmanned vehicle control method based on wearable intelligent equipment according to any one of claims 1 to 5, wherein controlling the unmanned vehicle to execute corresponding actions according to the operation information comprises:
identifying the operation information to obtain an operation identification result;
generating control information according to the operation identification result and the AR control menu image;
and sending the control information to the unmanned vehicle so that the unmanned vehicle executes corresponding actions according to the control information.
7. An unmanned vehicle control device based on a wearable intelligent device is characterized by comprising:
the generating unit is used for generating an AR control menu image superposed on the real-view image according to control menu information sent by the unmanned vehicle when the real-view image acquired by the wearable intelligent device comprises the unmanned vehicle;
a projection unit for projecting the AR control menu image in the eyes of a user wearing the wearable smart device;
a receiving unit for receiving operation information for the AR control menu image input by a user;
and the control unit is used for controlling the unmanned vehicle to execute corresponding actions according to the operation information.
8. An unmanned vehicle control system based on wearable intelligent equipment is characterized by comprising the wearable intelligent equipment and an unmanned vehicle, wherein,
the wearable intelligent device is used for judging whether the real-view image of the wearable intelligent device comprises the unmanned vehicle or not, and if so, establishing communication connection with the unmanned vehicle;
the unmanned vehicle is used for sending control menu information of the unmanned vehicle to the wearable intelligent equipment through the communication connection;
the wearable intelligent equipment is used for receiving control menu information sent by the unmanned vehicle; generating an AR control menu image superimposed on the real-view image according to the control menu information; the AR control menu image is projected to the eyes of a user wearing the wearable intelligent device, and operation information for the AR control menu image input by the user is received; generating control information according to the operation information and sending the control information to the unmanned vehicle;
and the unmanned vehicle is used for receiving the control information and executing corresponding actions according to the control information.
9. A computer device comprising a memory for storing a computer program and a processor that executes the computer program to cause the computer device to perform the wearable smart device-based unmanned vehicle control method of any one of claims 1 to 6.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed, implements the wearable smart device-based unmanned vehicle control method of any one of claims 1 to 6.
CN202010677177.0A 2020-07-14 2020-07-14 Unmanned vehicle control method, device and system based on wearable intelligent equipment Pending CN111796740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010677177.0A CN111796740A (en) 2020-07-14 2020-07-14 Unmanned vehicle control method, device and system based on wearable intelligent equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010677177.0A CN111796740A (en) 2020-07-14 2020-07-14 Unmanned vehicle control method, device and system based on wearable intelligent equipment

Publications (1)

Publication Number Publication Date
CN111796740A true CN111796740A (en) 2020-10-20

Family

ID=72807040

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010677177.0A Pending CN111796740A (en) 2020-07-14 2020-07-14 Unmanned vehicle control method, device and system based on wearable intelligent equipment

Country Status (1)

Country Link
CN (1) CN111796740A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114286006A (en) * 2021-12-29 2022-04-05 努比亚技术有限公司 Augmented reality-based equipment control method, terminal and storage medium
CN114637545A (en) * 2020-11-30 2022-06-17 华为终端有限公司 VR interaction method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103946734A (en) * 2011-09-21 2014-07-23 谷歌公司 Wearable computer with superimposed controls and instructions for external device
KR20160103286A (en) * 2015-02-24 2016-09-01 강릉원주대학교산학협력단 Wearable control device
CN108664037A (en) * 2017-03-28 2018-10-16 精工爱普生株式会社 The method of operating of head-mount type display unit and unmanned plane
CN110120149A (en) * 2019-06-04 2019-08-13 北京百度网讯科技有限公司 Guidance system by bus based on automatic driving car


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114637545A (en) * 2020-11-30 2022-06-17 华为终端有限公司 VR interaction method and device
CN114637545B (en) * 2020-11-30 2024-04-09 华为终端有限公司 VR interaction method and device
CN114286006A (en) * 2021-12-29 2022-04-05 努比亚技术有限公司 Augmented reality-based equipment control method, terminal and storage medium

Similar Documents

Publication Publication Date Title
KR102553190B1 (en) Automatic control of wearable display device based on external conditions
CN110647237B (en) Gesture-based content sharing in an artificial reality environment
CN110249368B (en) Wearable system and method for providing virtual remote control in mixed reality environment
CN108919958B (en) Image transmission method and device, terminal equipment and storage medium
US10495878B2 (en) Mobile terminal and controlling method thereof
CN110018736B (en) Object augmentation via near-eye display interface in artificial reality
JP6462059B1 (en) Information processing method, information processing program, information processing system, and information processing apparatus
KR102056221B1 (en) Method and apparatus For Connecting Devices Using Eye-tracking
US11017257B2 (en) Information processing device, information processing method, and program
JP6572600B2 (en) Information processing apparatus, information processing apparatus control method, and computer program
US11487354B2 (en) Information processing apparatus, information processing method, and program
JP2019197499A (en) Program, recording medium, augmented reality presentation device, and augmented reality presentation method
JPWO2014128787A1 (en) Tracking display system, tracking display program, tracking display method, wearable device using them, tracking display program for wearable device, and operation method of wearable device
KR102499354B1 (en) Electronic apparatus for providing second content associated with first content displayed through display according to motion of external object, and operating method thereof
CN108369451B (en) Information processing apparatus, information processing method, and computer-readable storage medium
CN111670431B (en) Information processing device, information processing method, and program
CN111796740A (en) Unmanned vehicle control method, device and system based on wearable intelligent equipment
CN112368746A (en) Information processing apparatus, information processing method, and program
CN111240471B (en) Information interaction method and wearable device
US11328187B2 (en) Information processing apparatus and information processing method
JP2019036239A (en) Information processing method, information processing program, information processing system, and information processing device
US10409464B2 (en) Providing a context related view with a wearable apparatus
JP6999538B2 (en) Information processing methods, information processing programs, information processing systems, and information processing equipment
CN109145010B (en) Information query method and device, storage medium and wearable device
JP7094759B2 (en) System, information processing method and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20220120

Address after: 100176 room 613, 6 / F, area 2, building a, 12 Hongda North Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: NEOLIX TECHNOLOGIES Co.,Ltd.

Address before: Room 74, 2 / F, building B1, No. 555, Chuangye Road, Dayun Town, Jiashan County, Jiashan City, Zhejiang Province

Applicant before: Jiashan Neolithic Zhiniu Technology Co.,Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20201020