CN114578956A - Device control method and apparatus, virtual wearable device, and storage medium


Info

Publication number
CN114578956A
Authority
CN
China
Prior art keywords
control operation
gesture control
gesture
target
display
Prior art date
Legal status
Pending
Application number
CN202011401480.4A
Other languages
Chinese (zh)
Inventor
邓方东
许彬
孙小光
姚荣杰
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011401480.4A
Publication of CN114578956A

Classifications

    • G06F 3/012: Head tracking input arrangements
    • G02B 27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B 27/017: Head-up displays, head mounted
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/451: Execution arrangements for user interfaces
    • G02B 2027/0178: Head mounted displays, eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application disclose a device control method and apparatus, a virtual wearable device, and a storage medium. The method includes: if a target control operation acting on a gesture control device is detected, controlling a display screen to display the position of a target object relative to the gesture control device together with operation guidance information, where the operation guidance information is used to instruct the target object to perform a gesture control operation according to that position; acquiring the gesture control operation content of the target object; and executing an operation function corresponding to the gesture control operation content while controlling the display screen to display an operation picture corresponding to that content. By visually presenting the gesture control operations of the user of the virtual wearable device, the method improves the accuracy of device control through gesture control operations, reduces the user's memorization and learning costs, and improves the user experience.

Description

Device control method and apparatus, virtual wearable device, and storage medium
Technical Field
The present application relates to the field of device control technologies, and in particular, to a device control method and apparatus, a virtual wearable device, and a storage medium.
Background
Virtual Reality (VR) technology encompasses computer, electronic information, and simulation technologies; its basic implementation is that a processor simulates a virtual environment to give the user a sense of immersion. Augmented Reality (AR) is a technology that increases the user's perception of the real world through information provided by a computer system: it applies virtual information to the real world and superimposes computer-generated virtual objects, scenes, or system prompts onto the real scene, thereby enhancing reality. With the continuous development of social productivity and science and technology, VR and AR technology are in ever greater demand across industries. In an interactive experience through a VR or AR device, the user can interact with the environment through gestures, touch, keys, and similar functions. However, most existing VR or AR devices are "immersive" experience devices: when a user wearing one performs a touch operation, false touches may occur because the user is unfamiliar with the touch position or cannot find it, or accurate operation may be impossible because the touch area is small, which degrades the accuracy of touch operations.
Disclosure of Invention
In view of the above problems, the present application provides a device control method, apparatus, virtual wearable device, and storage medium to improve the above problems.
In a first aspect, an embodiment of the present application provides a device control method applied to a virtual wearable device, where the virtual wearable device includes a gesture control device and a display screen. The method includes: if a target control operation acting on the gesture control device is detected, controlling the display screen to display the position of a target object relative to the gesture control device and operation guidance information, where the operation guidance information is used to instruct the target object to perform a gesture control operation according to the position; acquiring the gesture control operation content of the target object; and executing an operation function corresponding to the gesture control operation content, and controlling the display screen to display an operation picture corresponding to the gesture control operation content.
In a second aspect, an embodiment of the present application provides a device control apparatus running on a virtual wearable device, where the virtual wearable device includes a gesture control device and a display screen. The apparatus includes: a first control module, configured to control the display screen to display the position of a target object relative to the gesture control device and operation guidance information if a target control operation acting on the gesture control device is detected, where the operation guidance information is used to instruct the target object to perform a gesture control operation according to the position; an operation data acquisition module, configured to acquire the gesture control operation content of the target object; and a second control module, configured to execute an operation function corresponding to the gesture control operation content and control the display screen to display an operation picture corresponding to the gesture control operation content.
In a third aspect, the present application provides a virtual wearable device including a gesture control device, a display screen, one or more processors, and a memory, where one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the method of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having program code stored therein, where the program code, when executed, performs the method of the first aspect.
According to the method provided by the present application, if a target control operation acting on the gesture control device is detected, the display screen is controlled to display the position of a target object relative to the gesture control device together with operation guidance information; the gesture control operation content of the target object is then acquired; and finally the operation function corresponding to that content is executed while the display screen displays the corresponding operation picture. In this way, once a target control operation acting on the gesture control device is detected, the gesture control operation content of the target object can be obtained and the corresponding operation picture visually displayed on the display screen. Compared with a user operating the touch keys of a virtual wearable device from memory, visually displaying the user's gesture control operations improves the accuracy of device control through gesture control operations, reduces the user's memorization and learning costs, and improves the user experience.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 shows an application scenario diagram of the device control method provided in the embodiment of the present application.
Fig. 2 is a schematic diagram illustrating another application scenario of the device control method according to the embodiment of the present application.
Fig. 3 is a flowchart illustrating a method of controlling a device according to an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating a principle of a capacitive touch screen sensing a touch in the related art.
Fig. 5 is an exemplary diagram illustrating a control display screen displaying a position of a finger or a palm of a user relative to a gesture control device and operation guidance information according to an embodiment of the present application.
Fig. 6 is a diagram illustrating an example in which a display screen displays an operation screen corresponding to gesture control operation content according to an embodiment of the present application.
Fig. 7 is a flowchart illustrating a method of controlling a device according to another embodiment of the present application.
Fig. 8 is a diagram illustrating an example in which a display screen displays confirmation prompt information corresponding to a non-target control operation according to an embodiment of the present application.
Fig. 9 is a flowchart of a method for controlling a device according to another embodiment of the present application.
Fig. 10 shows a method flowchart of step S330 in fig. 9.
Fig. 11 shows a method flowchart of step S340 in fig. 9.
Fig. 12 is an exemplary diagram illustrating the display screen displaying an operation picture corresponding to the gesture control operation content in a display mode corresponding to the content type of that content.
Fig. 13 is a block diagram illustrating a structure of a device control apparatus according to an embodiment of the present application.
Fig. 14 shows a block diagram of a virtual wearable device of the present application for executing a device control method according to an embodiment of the present application.
Fig. 15 shows a storage unit according to an embodiment of the present application for storing or carrying program code that implements the device control method according to the embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on these embodiments without creative effort fall within the protection scope of the present application.
With the continuous development of social productivity and science and technology, VR and AR technology are in ever greater demand across industries. VR brings an immersive experience through isolated audio and video content and is widely used in mass markets such as games, video, and live social networking; AR emphasizes the seamless fusion of virtual information with the real environment and is widely used in fields such as industry and the military. A user can interact with the environment through functions of a VR or AR wearable device such as its camera, gesture control, or keys.
Most VR or AR wearable devices are head-mounted, for example VR or AR glasses, and users place high demands on the appearance of such glasses and on the wearing comfort determined by their volume and weight. For the sake of appearance, all or some of the physical keys of VR or AR glasses may be removed and replaced by a touch screen that provides the key functions and thus operates the device. However, the inventors found in their research that when a VR or AR device is controlled through a touch panel, the user cannot see his or her own touch operation and is likely to touch the wrong position, or cannot find the correct position and therefore cannot operate correctly. Although a user can learn how to use the touch screen in advance, accurately mastering the touch positions and touch actions takes a long time, and even after long use it is difficult to raise the touch accuracy and operational fault tolerance to the level of visual operation (for example, touch operation on a phone the user can see), which degrades the user experience. Moreover, following the design trend toward smaller and lighter VR or AR wearable devices, more and more of them use touch input with ever smaller touch areas, which makes the experience of controlling the device through the touch screen still worse.
To solve the above problems, the inventors propose the device control method and apparatus, and the virtual wearable device, provided by the present application, which can acquire the gesture control operation content of a target object when a target control operation acting on the gesture control device is detected, and then visually display an operation picture corresponding to that content on the display screen.
An application scenario of the device control method provided in the embodiment of the present application is described below.
Referring to fig. 1, an application scenario of the device control method provided in the embodiment of the present application is shown; the scenario includes an interactive system 100. The interactive system 100 includes the virtual wearable device 10. Optionally, the virtual wearable device 10 in this scenario may be an integrated virtual wearable device, that is, one that includes both a gesture control device and a display screen. As shown in fig. 1, the virtual wearable device 10 includes a display screen 1, temples 2, a frame 3, and a gesture control device 4. The gesture control device 4 may be a touch screen or a touch pad and can sense the user's gestures, so the user can interact with the environment by controlling the virtual wearable device 10 through gestures.
Referring to fig. 2, a schematic diagram of another application scenario of the device control method provided in the embodiment of the present application is shown; this scenario includes an interactive system 200. The interactive system 200 includes the virtual wearable device 10 and the gesture control device 4; that is, the virtual wearable device 10 here is a split-type (i.e., external or plug-in) virtual wearable device, and it includes a display screen 1. The gesture control device 4 may be a mobile device such as a mobile phone or a tablet, or a touch screen or a touch pad. As shown in fig. 2, the gesture control device 4 may include at least one gesture control function button 5; different buttons may correspond to different gesture control functions, and the buttons may be physical or virtual. In this scenario, the gesture control device 4 and the display screen 1 may establish a communication connection through a wired or wireless communication module for data interaction.
Optionally, in some embodiments, the gesture control device 4 in the present application may be configured with a virtual interface operation function; that is, the user may perform device control through gesture operations on a virtual control panel in the air.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 3, an embodiment of the present application provides a device control method applied to a virtual wearable device, where the virtual wearable device includes a gesture control device and a display screen. The method includes:
step S110: if a target control operation acting on the gesture control device is detected, controlling the display screen to display the position of the target object relative to the gesture control device and the operation guidance information.
The target control operation is a control gesture operation that the user can use to control functions of the virtual wearable device such as human-computer interaction and display, where the gesture is made with the user's fingers (one, two, or more) or palm. For example, suppose the user is watching a video through AR glasses and a single-finger tap on the gesture control device captures a frame of the video: that single-finger tap is then a target control operation. If, instead, the user merely uses a finger to adjust the wearing position of the virtual wearable device, the non-control gesture corresponding to that adjustment should not be treated as a target control operation. The gesture control device in this embodiment may be a touch screen or a touch pad.
Optionally, taking the gesture control device as a touch screen as an example, please refer to fig. 4, which shows the principle by which a capacitive touch screen senses a touch in the related art. As shown in the left diagram of fig. 4, the touch screen detects the change of the capacitance (to GND) of each sensing unit; when a finger or palm of the user touches the screen, its capacitance is superimposed on the capacitance of the touch screen (as shown in the middle diagram of fig. 4), so the capacitance of the touch screen increases. Optionally, each touch detection pass may scan a click array of M X-axis lines and N Y-axis lines, where X denotes the horizontal touch screen sensing lines and Y the vertical ones; the specific values of M and N are not limited. By tracking the change of all self-capacitance values of the touch screen before and after the touch of the user's finger or palm, the X and Y coordinates of the finger or palm can be determined and combined into a touch coordinate in the plane, i.e., the position of the finger or palm relative to the gesture control device. In this manner, once the X and Y coordinates of the finger or palm are determined, it can be concluded that a target control operation acting on the gesture control device has been detected.
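To make the principle above concrete, the following Python sketch derives a touch coordinate from per-line self-capacitance deltas. It is a minimal illustration only: the scan layout, the THRESHOLD constant, and the function names are assumptions for this sketch, not values taken from the patent.

```python
# Minimal sketch of self-capacitance touch detection (illustrative only).
# Assumes M horizontal (X) and N vertical (Y) sensing lines, each reporting a
# raw capacitance value; the baseline values and THRESHOLD are hypothetical.
from typing import Optional, Sequence, Tuple

THRESHOLD = 12.0  # minimum capacitance delta that counts as a touch (assumed)

def strongest_line(raw: Sequence[float], baseline: Sequence[float]) -> Optional[int]:
    """Index of the sensing line with the largest capacitance increase, if any."""
    deltas = [r - b for r, b in zip(raw, baseline)]
    if not deltas:
        return None
    best = max(range(len(deltas)), key=deltas.__getitem__)
    return best if deltas[best] >= THRESHOLD else None

def touch_coordinate(raw_x: Sequence[float], base_x: Sequence[float],
                     raw_y: Sequence[float], base_y: Sequence[float]) -> Optional[Tuple[int, int]]:
    """Combine the strongest X line and Y line into a plane coordinate.

    Returns None when either axis shows no delta above THRESHOLD, i.e. no
    target control operation is detected on the gesture control device.
    """
    x = strongest_line(raw_x, base_x)
    y = strongest_line(raw_y, base_y)
    if x is None or y is None:
        return None
    return (x, y)
```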
When the user adjusts the wearing position of the virtual wearable device with a finger or palm, the touch screen may also be touched and X and Y coordinates determined. It should be understood that in this case the finger or palm moves back and forth in various directions over the device; for example, when adjusting the elastic band, the finger or palm moves toward the back of the device (i.e., toward the back of the user's head when the device is worn), so the acquired X and Y coordinates keep changing. If X and Y coordinates are determined only intermittently, i.e., they are sometimes absent, it may be concluded that no target control operation acting on the gesture control device has been detected.
It should be noted that when the gesture control device is a touch pad, the principle by which it senses whether a finger or palm of the user acts on it is similar to that of the touch screen and is not repeated here. In the touch pad case, the sensing distance may be a specified distance range, for example 15-50 mm, or other values.
As one way, if a target control operation acting on the gesture control device is detected, then, to let the user see the gesture control operation content intuitively and perform subsequent operations conveniently, the display screen may be controlled to display the position of the target object relative to the gesture control device and the operation guidance information. The target object is the aforementioned finger (one, two, or more fingers) or palm of the user. In this way, the user sees on the display screen the same picture as the operation interface of the gesture control device, together with the position of his or her finger or palm relative to the device, so the operation interface is clearly visible and subsequent control operations can be performed accurately.
For example, in a specific application scenario, please refer to fig. 5, which shows an example of the display screen displaying the position of a user's finger or palm relative to the gesture control device together with operation guidance information. As shown in fig. 5, the display screen 1 may display the operation interface of the gesture control device. The display proportions of the screen and of the operation interface may differ, so the screen may display the interface after adjusting the display proportion; the scale can be adapted to the actual size of the display screen and is not limited here. The operation interface of the gesture control device may include several different types of function buttons with different function identifiers. Functions may be represented by letters (for example an English abbreviation such as the initial V of "voice"; the V in fig. 5 prompts the user to operate that button to adjust the video volume), by function names, or by characteristic icons. As shown in fig. 5, the display screen 1 displays the function buttons 5 of the operation interface of the gesture control device. The position of the user's finger or palm relative to the gesture control device may be displayed with a special symbol such as a cross cursor (shown in fig. 5), a five-pointed star, or a dot, or the relative position on the operation interface may be shown directly (not shown in fig. 5), so that the user has a visual sense of the operation interface on the display screen and can perform gesture control operations accurately.
Optionally, the display screen and the gesture control device occupy different spatial positions. For example, if the virtual wearable device is a pair of head-mounted AR glasses, the touch screen of the AR glasses is usually located on the side of the glasses rather than parallel to the lenses. In that case, even though the user can clearly see the operation interface of the gesture control device on the display, directions do not map one-to-one: if the function the user wants to operate appears on the display to the left of the current position of the finger or palm, the finger actually has to move backward, not leftward, on the gesture operation interface. To help the user perform gesture control operations more accurately, operation guidance information may therefore be displayed on the display screen. The operation guidance information instructs the user how to perform the gesture control operation from the aforementioned position, i.e., it includes guidance on the direction in which the user's finger or palm needs to move.
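As a toy illustration of this direction remapping, the sketch below translates a direction seen on the display into the finger movement needed on a temple-mounted touch pad. The mapping table and the prompt wording are assumptions; the actual correspondence depends on how the pad is physically oriented.

```python
# Hypothetical mapping from a direction seen on the display to the finger
# movement needed on a temple-mounted touch pad (orientation assumed).
DISPLAY_TO_TOUCHPAD = {
    "left": "backward",  # toward the back of the user's head
    "right": "forward",  # toward the lenses
    "up": "up",
    "down": "down",
}

def guidance_text(display_direction: str, function_name: str) -> str:
    """Build a direction-guidance prompt like the one described above."""
    move = DISPLAY_TO_TOUCHPAD[display_direction]
    return f"Drag your finger {move} to reach '{function_name}'."

print(guidance_text("left", "volume"))
# -> Drag your finger backward to reach 'volume'.
```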
For example, the operation guidance information may include an operation guidance identifier 7 as shown in fig. 5. Optionally, the identifier may correspond to a control operation recommended according to the position of the user's finger or palm relative to the operation interface of the gesture control device, for example the control operation most frequently used from that position. The number and type of operation guidance identifiers may vary with the position of the finger or palm relative to the gesture control device.
In some embodiments, the operation guidance information may further include voice prompt information, text prompt information, or combined voice-and-text prompt information matching the operation guidance identifier. For example, when the position of the user's finger or palm relative to the gesture control device is as shown in fig. 5, the operation guidance identifier 7 may be displayed together with corresponding text prompt information (not shown in the figure), or corresponding voice prompt information may be broadcast. For the leftmost operation guidance identifier shown in fig. 5, the text prompt "drag your finger backward to adjust the volume" may be displayed, or the same sentence may be broadcast as a voice prompt.
Optionally, the content of the operation guidance information may change as the user's operation changes. For example, the user's gesture control operation may be a single click, a two-finger zoom-in operation, or a multi-point control operation, and the operation guidance information corresponding to a single-click operation may differ from that corresponding to a two-finger zoom-in operation.
Optionally, the operation guidance information may also describe the functions corresponding to different gesture control operations, for example what function a single-click operation achieves, what function a double-click operation achieves, and so on. By displaying the position of the target object relative to the gesture control device and the operation guidance information on the display screen, the user can clearly see, while performing a gesture control operation, where the gesture is relative to the device and how to proceed; the user can therefore perform gesture control operations accurately without learning or memorizing them in advance, which improves the user experience.
step S120: acquiring the gesture control operation content of the target object.
Gesture control operation content can be understood as at least one gesture control instruction completed by the user through gestures. As one way, which functions the user operates can be determined from the change of the position of the user's finger or palm relative to the gesture detection device; specifically, that positional change can be obtained from the change of the electrical characteristics of the gesture control device during gesture sensing. For example, if the user performs a single-click operation, a double-click operation, and a two-finger zoom-in operation, the sensed single-click instruction, double-click instruction, and two-finger zoom-in instruction together form the gesture control operation content of the target object.
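A rough sketch of this step follows: it turns a short sequence of sensed contact samples into one of the instructions named above (single click, double click, two-finger zoom). The sample format, the time window, and the classification rules are all simplifying assumptions made for illustration.

```python
# Illustrative classifier from sensed position changes to a gesture control
# instruction; covers only the examples in the text, with assumed thresholds.
import math
from typing import List, Tuple

Sample = Tuple[float, List[Tuple[float, float]]]  # (timestamp, active contact points)

DOUBLE_TAP_WINDOW = 0.3  # max seconds between two taps (assumed)

def tap_times(samples: List[Sample]) -> List[float]:
    """Timestamps at which a finger first touches down (a contact appears)."""
    times, down = [], False
    for t, pts in samples:
        if pts and not down:
            times.append(t)
        down = bool(pts)
    return times

def classify(samples: List[Sample]) -> str:
    """Map a short sample sequence to an instruction name."""
    contacts = [pts for _, pts in samples if pts]
    if any(len(pts) == 2 for pts in contacts):
        two = [pts for pts in contacts if len(pts) == 2]
        spread_start = math.dist(two[0][0], two[0][1])
        spread_end = math.dist(two[-1][0], two[-1][1])
        return "zoom_in" if spread_end > spread_start else "zoom_out"
    taps = tap_times(samples)
    if len(taps) >= 2 and taps[1] - taps[0] <= DOUBLE_TAP_WINDOW:
        return "double_click"
    return "single_click" if taps else "none"
```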
Step S130: and executing an operation function corresponding to the gesture control operation content, and controlling the display screen to display an operation picture corresponding to the gesture control operation content.
After the gesture control operation content of the target object is obtained, the operation function corresponding to it can be executed; that is, the user's at least one gesture control instruction is responded to and the corresponding control function realized, and the display screen is controlled to display the operation picture corresponding to the gesture control operation content. The operation picture may include the changes of the operation interface of the gesture control device during the gesture operation and the changes of the position of the user's finger or palm relative to the device. Optionally, the operation picture may dynamically present the user's entire gesture control process, giving the user the feeling of watching the gesture control device directly while performing the operation.
In a specific application scenario, please refer to fig. 6, which shows an example of the display screen displaying an operation picture corresponding to gesture control operation content. As shown in fig. 6, the operation interface of the gesture control device may be displayed on the display screen 1 of the virtual wearable device worn by the user, showing the various function buttons 5 of the gesture control device; "function 1" through "function 6" in the middle diagram of fig. 6 are function buttons of the gesture control device, each of which may have a different function. If the user touches the button identified as "function 5 touch" with a finger, the display content of the display screen 1 changes to the interface corresponding to "function 5", i.e., the screen jumps to the menu selection interface shown in the right diagram of fig. 6. Menu 1, menu 2, and menu 3 on that interface are different control modes from which the user can choose as needed; they are used here only for illustration, and their concrete content can be set as required. By intuitively displaying the user's whole operation flow and the changes of the position of the finger or palm relative to the gesture control device, inaccurate touch operations caused by false touches on an unseen operation interface can be reduced, and gesture control operations can also become more engaging.
According to the device control method described above, if a target control operation acting on the gesture control device is detected, the display screen is controlled to display the position of the target object relative to the device together with the operation guidance information; the gesture control operation content of the target object is then acquired; and the corresponding operation function is executed while the screen displays the corresponding operation picture. Thus the gesture control operation content of the target object can be obtained once a target control operation acting on the gesture control device is detected, and the corresponding operation picture visually displayed on the display screen. Compared with a user operating the touch keys of a virtual wearable device from memory, visually displaying the user's gesture control operations improves the accuracy of device control through gesture control operations, reduces the user's memorization and learning costs, and improves the user experience.
Referring to fig. 7, another embodiment of the present application provides a device control method applied to a virtual wearable device, where the virtual wearable device includes a gesture control device and a display screen. The method includes:
step S210: detecting whether there is a control operation acting on the gesture control device.
Optionally, different types of gesture control devices sense gestures at different distances, so, to avoid misrecognition, it can first be detected whether there is a control operation acting on the gesture control device.
Alternatively, the gesture control device may include a touch pad, in which case it can be detected whether there is an air-spaced control operation acting on the touch pad within a specified distance. The specified distance represents the distance between the operation object of the air-spaced control operation and the touch pad; its specific value is not limited and may be, for example, 20 mm, 30 mm, or 50 mm. An air-spaced control operation can be understood as one kind of control operation acting on the gesture control device: if there is an air-spaced control operation acting on the touch pad, it can be determined that there is a control operation acting on the gesture control device.
Optionally, the gesture control device may include a touch screen, in which case it can be detected whether there is a touch operation acting on the touch screen; such a touch operation can be understood as another kind of control operation acting on the gesture control device. If there is a touch operation acting on the touch screen, it can be determined that there is a control operation acting on the gesture control device.
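The two detection branches just described can be summarized in the small sketch below. The sensor reading format and function names are hypothetical stand-ins; the 15-50 mm hover range follows the earlier touch pad example, while the concrete distance is left open by the text.

```python
# Sketch of the detection branch: an air-spaced operation over a touch pad, or
# a direct touch on a touch screen, both count as a control operation acting
# on the gesture control device. Reading format and names are hypothetical.
HOVER_RANGE_MM = (15.0, 50.0)  # example sensing range from the text

def control_operation_present(device_kind: str, reading: dict) -> bool:
    """True when a control operation acts on the gesture control device."""
    if device_kind == "touch_pad":
        distance = reading.get("hover_distance_mm")  # air-spaced operation
        return distance is not None and HOVER_RANGE_MM[0] <= distance <= HOVER_RANGE_MM[1]
    if device_kind == "touch_screen":
        return bool(reading.get("contact_points"))   # direct touch operation
    return False
```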
Step S220: and identifying whether the pattern corresponding to the control operation meets a target condition.
If a control operation acting on the gesture control device is detected, it can then be identified whether the pattern corresponding to the control operation satisfies a target condition. The reason is that misrecognition is possible: for example, if the user tries out AR glasses together with a friend and the two stand close together, the friend passing by may change the electrical characteristic (i.e., the capacitance) of the gesture control device and cause misrecognition; more generally, other living bodies near the gesture control device may cause misrecognition. Therefore, to ensure the accuracy of control operations performed through gestures, it is identified whether the pattern corresponding to the control operation satisfies the target condition. The target condition may be that the pattern corresponding to the control operation is a gesture pattern such as the user's finger or palm; optionally, the gesture patterns usable for device control may be stored in advance.
Continuing the examples above: in one implementation, if an air-spaced control operation exists, identification of whether the pattern corresponding to it satisfies the target condition can be started; in another implementation, if a touch operation exists, it can be determined whether the pattern corresponding to the touch operation satisfies the target condition.
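As a schematic of this target-condition check, the sketch below compares a sensed pattern against gesture patterns stored in advance, as the text suggests. The feature-vector representation, the similarity measure, and the 0.7 acceptance threshold are assumptions made for illustration.

```python
# Illustrative target-condition check: the sensed pattern must match one of
# the finger/palm gesture patterns stored in advance. The pattern features,
# similarity measure, and threshold below are assumed, not from the patent.
from typing import Dict, List

SIMILARITY_THRESHOLD = 0.7  # assumed acceptance threshold

def similarity(a: List[float], b: List[float]) -> float:
    """Toy similarity between two equal-length pattern feature vectors."""
    if not a or len(a) != len(b):
        return 0.0
    mean_abs_diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return max(0.0, 1.0 - mean_abs_diff)

def satisfies_target_condition(pattern: List[float],
                               stored: Dict[str, List[float]]) -> bool:
    """True when the pattern matches any stored finger/palm gesture pattern."""
    return any(similarity(pattern, ref) >= SIMILARITY_THRESHOLD
               for ref in stored.values())
```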
Step S231: determining that the control operation is a target control operation.
As one mode, if the pattern corresponding to the control operation is recognized as satisfying the target condition, it may be determined that the control operation is a target control operation, and the method then proceeds to step S232, in which the display screen is controlled to display the position of the target object relative to the gesture control device and the operation guidance information.
step S232: controlling the display screen to display the position of the target object relative to the gesture control device and the operation guidance information.
step S233: acquiring the gesture control operation content of the target object.
step S234: executing an operation function corresponding to the gesture control operation content, and controlling the display screen to display an operation picture corresponding to the gesture control operation content.
Step S241: determining that the control operation is a non-target control operation.
As one way, if the pattern corresponding to the control operation is recognized as not satisfying the target condition, it may be determined that the control operation is a non-target control operation.
step S242: controlling the display screen to display confirmation prompt information corresponding to the non-target control operation.
When the control operation is determined to be a non-target control operation, the display screen may be controlled to display confirmation prompt information corresponding to it; in other words, if a non-target control operation acting on the gesture control device is detected, the display screen is controlled to display the corresponding confirmation prompt information.
For example, in a specific application scenario, fig. 8 shows an example of the display screen displaying confirmation prompt information corresponding to a non-target control operation. Optionally, if the user adjusts the wearing position of the virtual wearable device by hand, a prompt popup like the one shown in fig. 8 can be displayed on the display screen 1, confirming that a gesture of the user was detected and asking whether the user wants to perform a gesture control operation. If the user selects "yes", the gesture control operation function is triggered; if the user selects "no", the popup is dismissed. Thus, even after a false touch, the user is made aware of the current gesture control state, and the user's gesture is prevented from being directly recognized as control of the human-computer interaction functions of the virtual wearable device, which improves the accuracy of device control through gestures.
According to the device control method provided by this embodiment, detecting both the control operation acting on the gesture control device and the pattern corresponding to that operation reduces misrecognition and improves the accuracy of control operations performed through gestures. Controlling the display screen to display confirmation prompt information when the control operation is judged to be a non-target control operation prevents the user's gesture from being directly recognized as control of the human-computer interaction functions of the virtual wearable device, further improving the accuracy of gesture-based device control. Visually displaying the user's gesture control operations also improves the accuracy of device control through gesture control operations while reducing the user's memorization and learning costs and improving the user experience.
Referring to fig. 9, another embodiment of the present application provides a device control method applied to a virtual wearable device, where the virtual wearable device includes a gesture control device and a display screen. The method includes:
step S310: if a target control operation acting on the gesture control device is detected, controlling the display screen to display the position of the target object relative to the gesture control device and the operation guidance information.
step S320: detecting whether there is a change in the position.
If the user wants to perform device control through gestures, the position of the target object relative to the gesture control device will change. For example, during a multi-point control operation the user's finger clicks on several newly popped-up pages, and the click position differs from page to page; even though the difference between pages may not be particularly obvious, the position still changes, and at least during the interval between the user lifting the hand and clicking again, the capacitance sensed by the gesture detection device changes as well.
Whether the position has changed can be judged from the variation amount or the variation frequency of the capacitance sensed by the gesture detection device. Optionally, if the sensed capacitance variation exceeds a first threshold, or the variation frequency of the capacitance exceeds a second threshold, it may be determined that the position has changed; otherwise, it may be determined that it has not. The specific values of the first threshold and the second threshold are not limited.
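The two-threshold test just described might look like the sketch below; the patent leaves both threshold values open, so the constants here are placeholders.

```python
# Sketch of the position-change test: the position is considered changed when
# the capacitance variation amount exceeds a first threshold or its variation
# frequency exceeds a second threshold. Both constants are placeholders.
from typing import List

FIRST_THRESHOLD = 5.0   # minimum capacitance delta, assumed units
SECOND_THRESHOLD = 2.0  # minimum change events per second, assumed

def position_changed(capacitance_deltas: List[float], window_s: float) -> bool:
    if max((abs(d) for d in capacitance_deltas), default=0.0) > FIRST_THRESHOLD:
        return True
    changes_per_second = sum(1 for d in capacitance_deltas if d != 0.0) / window_s
    return changes_per_second > SECOND_THRESHOLD
```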
Step S330: and acquiring the gesture control operation content of the target object.
Optionally, if a change in the position is detected, the gesture control operation content of the target object can be acquired; if no change is detected, detection can continue periodically until a change in the position is detected.
Referring to fig. 10, as an alternative, step S330 may include:
step S331: acquiring change data of the position.
The change data of the position may include the starting position coordinate of the target object, the moving direction of the target object, and the like, where the starting position coordinate can be understood as the starting coordinate of the target object relative to the operation interface of the gesture control device. Optionally, the change data of the position may be obtained from the variation amount and variation direction of the capacitance sensed by the gesture detection device.
Step S332: and determining at least one operation instruction type corresponding to the target control operation according to the change data.
Optionally, there may be several position coordinates of the target object and several moving directions. As one way, an operation instruction type corresponding to the target control operation may be determined by combining the difference between each pair of temporally adjacent position coordinates (adjacent in the time sequence of the gesture operation) with the moving direction corresponding to that difference; optionally, there may be at least one operation instruction type.
For example, the coordinate at which the target object's position relative to the gesture control device is displayed for the first time may be recorded as position coordinate 1; when the user's finger slides to the right, position coordinate 2 is obtained, and when the finger then slides down, position coordinate 3 is obtained. One operation instruction type corresponding to the target control operation can then be determined from position coordinate 1, position coordinate 2, and "slide right", and another from position coordinate 2, position coordinate 3, and "slide down".
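The example above can be reproduced with the small sketch below, which derives one operation instruction type per pair of temporally adjacent position coordinates. The screen-style coordinate convention (y increasing downward) and the instruction names are assumptions.

```python
# Sketch of step S332: each pair of temporally adjacent position coordinates,
# together with the movement direction between them, yields one operation
# instruction type. Assumes screen coordinates with y increasing downward.
from typing import List, Tuple

def direction(p: Tuple[int, int], q: Tuple[int, int]) -> str:
    dx, dy = q[0] - p[0], q[1] - p[1]
    if abs(dx) >= abs(dy):
        return "slide_right" if dx > 0 else "slide_left"
    return "slide_down" if dy > 0 else "slide_up"

def instruction_types(coords: List[Tuple[int, int]]) -> List[str]:
    """E.g. [pos1, pos2, pos3] -> ['slide_right', 'slide_down'] as in the text."""
    return [direction(p, q) for p, q in zip(coords, coords[1:])]

print(instruction_types([(10, 10), (30, 10), (30, 40)]))
# -> ['slide_right', 'slide_down']
```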
Step S333: and determining the gesture control operation content of the target object according to the at least one operation instruction type.
Optionally, after at least one operation instruction type corresponding to the target control operation is determined, an operation function corresponding to the at least one operation instruction type may be used as the gesture control operation content of the target object.
Optionally, if no operation instruction type corresponding to the target control operation can be determined, the user's current gesture is probably non-standard and cannot be sensed by the gesture control device. To help the user learn gesture control operations quickly, the display screen may be controlled to display misoperation prompt information together with gesture actions that are similar to the user's current gesture (for example, gestures with a similarity greater than 70%) and that can be used for device control. This helps the user quickly correct the wrong gesture, reduces the learning cost, and improves the user experience.
step S340: executing an operation function corresponding to the gesture control operation content, and controlling the display screen to display an operation picture corresponding to the gesture control operation content.
Referring to fig. 11, as an alternative, step S340 may include:
step S341: acquiring the content type of the gesture control operation content.
Optionally, different gesture operations may correspond to different gesture control functions. For example, a single-click operation acts on the operation interface only once, while a multi-point control operation acts on it several times in succession and therefore demands more accurate gesture control. To avoid false touches during gesture control operations such as multi-point control, the gesture control operation content may be classified according to what the operation does, for example into a first content type and a second content type: the first content type represents gesture operations that can be controlled instantly, such as a single-click operation, and the second content type represents gesture operations that require continuous control, such as a multi-point control operation. Classifying the gesture control operation content allows the corresponding operation picture to be displayed with a matching display effect, which helps the user distinguish more accurately which gesture control state he or she is in.
step S342: controlling the display screen to display an operation picture corresponding to the gesture control operation content in a display mode corresponding to the content type.
Optionally, different content types may correspond to different display modes, and the display modes may include two-dimensional display and three-dimensional display. In one mode, the display screen is controlled to display the operation picture corresponding to the gesture control operation content in the display mode corresponding to the content type of that content. For example, if the gesture control operation content is a click operation, the corresponding operation picture (which may simply be the operation result) can be displayed in the two-dimensional display mode. If the content is a multi-point control operation, then, to make the current gesture control state and the next gesture operation clearer to the user, the corresponding operation picture can be displayed in the three-dimensional display mode; that is, the position of the user's gesture relative to the gesture control device and the whole multi-point control process are both displayed visually on the display screen. This makes the process of controlling the device through gestures intuitive and convenient and improves the user experience.
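A minimal sketch of this content-type dispatch follows, covering only the two examples named in the text (a single click as the first, instantly-controlled type; a multi-point control operation as the second, continuously-controlled type); the table entries are assumptions.

```python
# Sketch of steps S341/S342: classify the gesture control operation content,
# then pick the display mode for the operation picture. Only the two example
# types from the text are covered; the table entries are assumptions.
CONTENT_TYPE = {
    "single_click": "first",   # instantly controlled gesture operation
    "multi_point": "second",   # continuously controlled gesture operation
}

def display_mode(operation: str) -> str:
    """First content type renders in 2D, second content type in 3D."""
    return "3D" if CONTENT_TYPE.get(operation) == "second" else "2D"

print(display_mode("single_click"), display_mode("multi_point"))  # 2D 3D
```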
In a specific application scenario, fig. 12 shows an example of the display screen displaying an operation picture corresponding to the gesture control operation content in the display mode corresponding to its content type. Optionally, if the user adjusts the volume of the video playing interface through a single-click operation, the change after the volume is adjusted may be shown on the display screen 1 in the two-dimensional display mode; that is, the result of the adjustment is displayed directly. Optionally, in some embodiments the user's gesture may also be displayed at the volume adjustment result in fig. 12, so that the user knows that clicking the position of the black dot in fig. 12 is enough to adjust the volume to the result shown.
According to the device control method provided by this embodiment, detecting the change of the position of the user's gesture relative to the gesture control device, and displaying the picture corresponding to the gesture control operation content in the display mode matching its content type, improve the accuracy of device control performed by the user through gestures. Visually displaying the user's gesture control operations further improves the accuracy of device control through gesture control operations, reduces the user's memorization and learning costs, and improves the user experience.
Referring to fig. 13, an embodiment of the present application provides a device control apparatus 400 that runs on a virtual wearable device, where the virtual wearable device includes a gesture control device and a display screen. The apparatus 400 includes:
the first control module 410 is configured to, if a target control operation acting on the gesture control device is detected, control the display screen to display a position of a target object relative to the gesture control device and operation guidance information, where the operation guidance information is used to instruct the target object to perform the gesture control operation according to the position.
Optionally, the apparatus 400 may further include a gesture sensing module configured to detect, before the display screen is controlled to display the position and the operation guidance information, whether there is a control operation acting on the gesture control device; if there is a control operation, to identify whether the pattern corresponding to that operation satisfies a target condition; if the target condition is satisfied, to determine that the operation is a target control operation; and if the target condition is not satisfied, to determine that it is a non-target control operation.
As one embodiment, the gesture control device includes a touch pad. In this case, it may be detected whether there is a hover (contactless) control operation acting on the touch pad within a specified distance, where the specified distance represents the distance between the operation object of the hover control operation and the touch pad; if such a hover control operation exists, whether the pattern corresponding to the control operation satisfies the target condition is identified.
As another embodiment, the gesture control device includes a touch screen. In this case, it may be detected whether there is a touch operation acting on the touch screen; if such a touch operation exists, whether the pattern corresponding to the control operation satisfies the target condition is identified.
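Both sensing embodiments feed the same classification step. The sketch below (the sensor interface, method names, and the 30 mm distance are all assumptions) illustrates how a hover or touch operation might be classified against the target condition:

```python
HOVER_DISTANCE_MM = 30.0  # assumed value for the "specified distance"

def classify_operation(sensor, target_pattern):
    """Return 'target', 'non-target', or None if nothing is sensed.
    The sensor interface below is an assumption for illustration."""
    if sensor.kind == "touch_pad":
        # Touch-pad embodiment: sense a hover operation within the
        # specified distance of the pad surface.
        op = sensor.read_hover(max_distance=HOVER_DISTANCE_MM)
    else:
        # Touch-screen embodiment: sense a touch on the screen.
        op = sensor.read_touch()
    if op is None:
        return None                       # no control operation detected
    if op.pattern == target_pattern:      # pattern satisfies target condition
        return "target"
    return "non-target"                   # triggers a confirmation prompt
```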
Optionally, the first control module 410 may be further configured to, if a non-target control operation acting on the gesture control device is detected, control the display screen to display confirmation prompt information corresponding to the non-target control operation.
The operation data obtaining module 420 is configured to obtain the gesture control operation content of the target object.
In one implementation, the operation data obtaining module 420 may be configured to obtain the gesture control operation content of the target object when a change in the position is detected. Specifically, change data of the position may be acquired; at least one operation instruction type corresponding to the target control operation is determined according to the change data; and the gesture control operation content of the target object is determined according to the at least one operation instruction type.
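A hedged sketch of how the operation data obtaining module might derive instruction types from the position-change data and resolve them to operation content (the field names, the 5.0 mm threshold, and the bindings table are assumptions, not specified by this application):

```python
def instruction_types(change):
    """Map position-change data to operation instruction types;
    the 5.0 mm threshold and field names are illustrative only."""
    types = []
    if change.contact_count > 1:
        types.append("multi_point")
    if abs(change.dx) >= abs(change.dy) and abs(change.dx) > 5.0:
        types.append("swipe_horizontal")
    elif abs(change.dy) > 5.0:
        types.append("swipe_vertical")
    return types  # an empty list would trigger the misoperation prompt

def operation_content(change, bindings):
    """Resolve instruction types to operation content through an
    assumed lookup such as {'swipe_horizontal': 'adjust_progress'}."""
    return [bindings[t] for t in instruction_types(change) if t in bindings]
```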
The second control module 430 is configured to execute the operation function corresponding to the gesture control operation content and to control the display screen to display the operation screen corresponding to the gesture control operation content.
Optionally, the second control module 430 may be configured to acquire the content type of the gesture control operation content, and to control the display screen to display the operation screen corresponding to the gesture control operation content in the display mode corresponding to that content type, where different content types correspond to different display modes.
Optionally, the second control module 430 may be further configured to control the display screen to display misoperation prompt information if no operation instruction type corresponding to the target control operation can be determined.
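Putting the modules together, an illustrative sketch of the apparatus 400 pipeline (the method names below are assumptions standing in for the behaviors described above, not an implementation disclosed by this application):

```python
class DeviceControlApparatus:
    """Illustrative wiring of apparatus 400; each attribute stands
    in for the corresponding module described in this embodiment."""

    def __init__(self, first_control, operation_data, second_control):
        self.first_control = first_control    # module 410
        self.operation_data = operation_data  # module 420
        self.second_control = second_control  # module 430

    def on_operation(self, operation):
        if not operation.is_target:
            # Non-target operation: ask the user to confirm it.
            self.first_control.show_confirmation_prompt(operation)
            return
        # Target operation: show the position and guidance info, then
        # obtain the operation content, execute it, and display it.
        self.first_control.show_position_and_guidance(operation)
        content = self.operation_data.get_content(operation)
        self.second_control.execute_and_display(content)
```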
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding processes in the foregoing method embodiments for the specific working processes of the apparatus and modules described above, which are not repeated here.
In the several embodiments provided in the present application, the coupling, direct coupling, or communication connection between the modules shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or modules may be electrical, mechanical, or in another form.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module.
Referring to fig. 14, based on the above device control method and apparatus, an embodiment of the present application further provides a virtual wearable device 100 capable of executing the device control method. The virtual wearable device 100 includes a display screen 1, a gesture control device 4, a memory 102, and one or more processors 104 (only one is shown) coupled to one another, with the memory 102 and the processor 104 connected by communication lines. The memory 102 stores a program that can execute the contents of the foregoing embodiments, and the processor 104 can execute that program.
The gesture control device 4 may be configured to sense the user's gesture control operations. The gesture control device 4 may be a touch screen or a touch pad: it may sense a gesture control operation acting on the touch screen, or a gesture control operation acting on the touch pad (in the latter case the gesture control operation is a hover operation, and the distance between the user's gesture and the touch pad may be set according to the actual situation).
The display screen 1 may be used to display the position of a target object (e.g., the user's finger or palm) relative to the gesture control device 4, together with operation guidance information that instructs the target object to perform a gesture control operation according to that position. In this embodiment of the application, the display screen 1 may further display an operation screen corresponding to the user's gesture control operation content, where the operation screen represents the flow of the user's gesture operation on the gesture control device. The flow may be displayed two-dimensionally, three-dimensionally, or in a combination of the two, so that the user's gesture control operation is clearly and visually presented, improving the accuracy of the gesture control operation.
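As an illustrative sketch of how the display screen 1 might visualize the target object's position and the operation guidance information (the coordinate frame, units, and drawing interface are assumptions for illustration only):

```python
def render_guidance(display, pad, finger_xyz):
    """Draw the finger's position relative to the gesture control
    device plus guidance text (names and units are assumptions)."""
    x, y, z = finger_xyz               # z = height above the pad in mm
    display.draw_pad_outline(pad.width, pad.height)
    display.draw_marker(x, y)          # projected position on the pad
    if z > 0:
        display.draw_text(f"{z:.0f} mm above the pad - lower your finger")
    else:
        display.draw_text("In contact - perform the gesture")
```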
The processor 104 may include one or more processing cores. The processor 104 connects the various parts of the virtual wearable device 100 using various interfaces and lines, and performs the various functions of the virtual wearable device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 102 and calling data stored in the memory 102. Optionally, the processor 104 may be implemented in hardware using at least one of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 104 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU renders and draws display content; and the modem handles wireless communication. It will be understood that the modem may instead be implemented by a separate communication chip rather than being integrated into the processor 104.
The memory 102 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM), and may be used to store instructions, programs, code sets, or instruction sets. The memory 102 may include a program storage area and a data storage area. The program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing embodiments, and the like. The data storage area may store data created by the virtual wearable device 100 in use (such as a phone book, audio and video data, and chat logs).
Referring to fig. 15, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable storage medium 500 stores program code that can be called by a processor to perform the methods described in the foregoing method embodiments.
The computer-readable storage medium 500 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 500 includes a non-transitory computer-readable storage medium. The computer-readable storage medium 500 has storage space for program code 510 that performs any of the method steps described above. The program code can be read from or written to one or more computer program products, and the program code 510 may, for example, be compressed in a suitable form.
In summary, according to the device control method and apparatus, the virtual wearable device, and the storage medium provided by this application, if a target control operation acting on the gesture control device is detected, the display screen is controlled to display the position of the target object relative to the gesture control device together with operation guidance information; the gesture control operation content of the target object is then obtained, the operation function corresponding to that content is executed, and the display screen is controlled to display the corresponding operation screen. In this way, once a target control operation acting on the gesture control device is detected, the gesture control operation content of the target object can be obtained and the corresponding operation screen visually displayed on the display screen. Compared with the user operating the virtual wearable device's touch keys from memory, the virtual wearable device here visually displays the user's gesture control operation content, which improves the accuracy of device control through gesture control operations while reducing the user's memorization and learning costs and yielding a more user-friendly experience.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications and substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application.

Claims (12)

1. A device control method, applied to a virtual wearable device, wherein the virtual wearable device comprises a gesture control device and a display screen, and the method comprises:
if a target control operation acting on the gesture control device is detected, controlling the display screen to display the position of a target object relative to the gesture control device and operation guidance information, wherein the operation guidance information is used for instructing the target object to perform a gesture control operation according to the position;
acquiring gesture control operation content of the target object;
and executing an operation function corresponding to the gesture control operation content, and controlling the display screen to display an operation screen corresponding to the gesture control operation content.
2. The method according to claim 1, wherein before the controlling the display screen to display the position of the target object relative to the gesture control device and the operation guidance information if a target control operation acting on the gesture control device is detected, the method further comprises:
detecting whether there is a control operation acting on the gesture control device;
if there is a control operation, identifying whether a pattern corresponding to the control operation satisfies a target condition;
if the target condition is satisfied, determining that the control operation is a target control operation;
and if the target condition is not satisfied, determining that the control operation is a non-target control operation.
3. The method according to claim 2, wherein the gesture control device comprises a touch pad, and the detecting whether there is a control operation acting on the gesture control device and, if there is a control operation, identifying whether the pattern corresponding to the control operation satisfies the target condition comprises:
detecting whether there is a hover control operation acting on the touch pad within a specified distance, wherein the specified distance represents the distance between an operation object of the hover control operation and the touch pad;
and if there is a hover control operation, identifying whether the pattern corresponding to the control operation satisfies the target condition.
4. The method according to claim 2, wherein the gesture control device comprises a touch screen, and the detecting whether there is a control operation acting on the gesture control device and, if there is a control operation, identifying whether the pattern corresponding to the control operation satisfies the target condition comprises:
detecting whether there is a touch operation acting on the touch screen;
and if there is a touch operation, identifying whether the pattern corresponding to the control operation satisfies the target condition.
5. The method of claim 2, further comprising:
and if a non-target control operation acting on the gesture control device is detected, controlling the display screen to display confirmation prompt information corresponding to the non-target control operation.
6. The method according to claim 1, wherein the obtaining of the gesture control operation content of the target object comprises:
and if a change in the position is detected, acquiring the gesture control operation content of the target object.
7. The method according to claim 6, wherein the acquiring the gesture control operation content of the target object if a change in the position is detected comprises:
acquiring change data of the position;
determining at least one operation instruction type corresponding to the target control operation according to the change data;
and determining the gesture control operation content of the target object according to the at least one operation instruction type.
8. The method of claim 7, further comprising:
and if no operation instruction type corresponding to the target control operation is determined, controlling the display screen to display misoperation prompt information.
9. The method according to any one of claims 1 to 8, wherein the controlling the display screen to display an operation screen corresponding to the gesture control operation content comprises:
acquiring the content type of the gesture control operation content;
and controlling the display screen to display the operation screen corresponding to the gesture control operation content in a display mode corresponding to the content type, wherein different content types correspond to different display modes.
10. A device control apparatus, running on a virtual wearable device, wherein the virtual wearable device comprises a gesture control device and a display screen, and the apparatus comprises:
a first control module, configured to, if a target control operation acting on the gesture control device is detected, control the display screen to display the position of a target object relative to the gesture control device and operation guidance information, wherein the operation guidance information is used for instructing the target object to perform a gesture control operation according to the position;
an operation data obtaining module, configured to acquire the gesture control operation content of the target object;
and a second control module, configured to execute an operation function corresponding to the gesture control operation content and control the display screen to display an operation screen corresponding to the gesture control operation content.
11. A virtual wearable device, comprising a gesture control device, a display screen, one or more processors, and a memory;
wherein one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the method of any one of claims 1-9.
12. A computer-readable storage medium having program code stored therein, wherein the program code, when executed by a processor, performs the method of any one of claims 1-9.
CN202011401480.4A 2020-12-02 2020-12-02 Equipment control method and device, virtual wearable equipment and storage medium Pending CN114578956A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011401480.4A CN114578956A (en) 2020-12-02 2020-12-02 Equipment control method and device, virtual wearable equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011401480.4A CN114578956A (en) 2020-12-02 2020-12-02 Equipment control method and device, virtual wearable equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114578956A (en)

Family

ID=81769405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011401480.4A Pending CN114578956A (en) 2020-12-02 2020-12-02 Equipment control method and device, virtual wearable equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114578956A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130002545A1 (en) * 2011-06-30 2013-01-03 Google Inc. Wearable computer with curved display and navigation tool
KR20150068819A (en) * 2013-12-12 2015-06-22 엘지전자 주식회사 Display device and control method thereof
CN106575043A (en) * 2014-09-26 2017-04-19 英特尔公司 Systems, apparatuses, and methods for gesture recognition and interaction
US10353532B1 (en) * 2014-12-18 2019-07-16 Leap Motion, Inc. User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
CN105988562A (en) * 2015-02-06 2016-10-05 刘小洋 Intelligent wearing equipment and method for realizing gesture entry based on same
CN104914999A (en) * 2015-05-27 2015-09-16 广东欧珀移动通信有限公司 Method for controlling equipment and wearable equipment
CN111383345A (en) * 2018-12-29 2020-07-07 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117130472A (en) * 2023-04-17 2023-11-28 荣耀终端有限公司 Virtual space operation guide display method, mobile device and system
CN117130472B (en) * 2023-04-17 2024-07-23 荣耀终端有限公司 Virtual space operation guide display method, mobile device and system

Similar Documents

Publication Publication Date Title
US9329714B2 (en) Input device, input assistance method, and program
EP2869174A1 (en) Method and device for text input and display of intelligent terminal
US20150199030A1 (en) Hover-Sensitive Control Of Secondary Display
WO2014176038A1 (en) Dynamically-positioned character string suggestions for gesture typing
US9623329B2 (en) Operations for selecting and changing a number of selected objects
CN111383345B (en) Virtual content display method and device, terminal equipment and storage medium
US10621766B2 (en) Character input method and device using a background image portion as a control region
CN112506340A (en) Device control method, device, electronic device and storage medium
EP4307096A1 (en) Key function execution method, apparatus and device, and storage medium
US20120179963A1 (en) Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display
CN107239222A (en) The control method and terminal device of a kind of touch-screen
CN103927114A (en) Display method and electronic equipment
CN114170407B (en) Model mapping method, device, equipment and storage medium for input equipment
CN110658976B (en) Touch track display method and electronic equipment
CN114578956A (en) Equipment control method and device, virtual wearable equipment and storage medium
JP5736005B2 (en) Input processing device, information processing device, information processing system, input processing method, information processing method, input processing program, and information processing program
CN112817447A (en) AR content display method and system
US20230236673A1 (en) Non-standard keyboard input system
EP3088991B1 (en) Wearable device and method for enabling user interaction
CN111007942A (en) Wearable device and input method thereof
CN113703577B (en) Drawing method, drawing device, computer equipment and storage medium
CN115033170A (en) Input control system and method based on virtual keyboard and related device
JP2016075976A (en) Image processing apparatus, image processing method, image communication system, and program
JP5762075B2 (en) Information processing apparatus, information processing method, and program
KR20150100332A (en) Sketch retrieval system, user equipment, service equipment, service method and computer readable medium having computer program recorded therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination