CN109947249B - Interaction method of wearable device, wearable device and computer storage medium - Google Patents


Info

Publication number
CN109947249B
Authority
CN
China
Prior art keywords
wearable device
view
gesture
touch event
event sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910201464.1A
Other languages
Chinese (zh)
Other versions
CN109947249A (en)
Inventor
崔永胜
里强
余航
王建法
何利鹏
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN201910201464.1A
Publication of CN109947249A
Application granted
Publication of CN109947249B
Legal status: Active
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an interaction method of a wearable device, comprising the following steps: when a sensor of the wearable device detects an air gesture, acquiring operation data corresponding to the air gesture; simulating, according to the operation data and a pre-stored data protocol, a touch event sequence corresponding to the air gesture; and determining and executing an operation instruction corresponding to the air gesture according to the touch event sequence and display information of the current window of the wearable device. The invention also discloses the wearable device and a computer storage medium. When the wearable device detects an air gesture, it simulates a touch event sequence from the operation data corresponding to the gesture, then determines and executes the operation instruction corresponding to that touch event sequence. Contactless operation of the wearable device is thus realized, making operation of the wearable device more intelligent and flexible.

Description

Interaction method of wearable device, wearable device and computer storage medium
Technical Field
The present invention relates to the field of wearable devices, and in particular, to an interaction method of a wearable device, a wearable device, and a computer storage medium.
Background
Mid-air gestures (air gestures) are a type of non-contact gesture that lets a user operate a device freehand; they are a natural human-machine interaction manner that imposes no inconvenience on the user. However, because the display screen of a wearable device is small, directly applying existing air-gesture recognition methods to wearable-device interaction is problematic, and how to interact with a wearable device more intelligently based on air gestures has become a technical problem to be solved.
Disclosure of Invention
The invention mainly aims to provide an interaction method of wearable equipment, the wearable equipment and a computer storage medium, and aims to solve the technical problem that the interaction mode of the wearable equipment is not intelligent enough currently.
In order to achieve the above object, the present invention provides an interaction method of a wearable device, comprising the following steps:
when a sensor of the wearable device detects an air gesture, acquiring operation data corresponding to the air gesture;
simulating, according to the operation data and a pre-stored data protocol, a touch event sequence corresponding to the air gesture;
and determining and executing an operation instruction corresponding to the air gesture according to the touch event sequence and display information of the current window of the wearable device.
Optionally, after the step of acquiring the operation data corresponding to the air gesture when the sensor of the wearable device detects the air gesture, the method includes:
the sensor acquires the sliding speed in the operation data and judges whether the sliding speed is within a preset speed interval;
when the sliding speed is not within the preset speed interval, the wearable device outputs prompt information to prompt the user to input a new air gesture;
and when the sliding speed is within the preset speed interval, the sensor packages the operation data into a binary format and reports it to a framework layer.
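The speed-validation step above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the interval bounds and the binary packing layout are assumptions invented for the example.

```python
# Sketch: validate the sliding speed of a detected air gesture against a
# preset speed interval, then package the operation data into a binary
# format for the framework layer. Bounds and layout are hypothetical.
import struct

SPEED_MIN, SPEED_MAX = 0.1, 5.0  # hypothetical preset speed interval

def validate_and_report(speed: float, direction: int):
    """Return packed binary operation data, or None to prompt a retry."""
    if not (SPEED_MIN <= speed <= SPEED_MAX):
        # Out of range: the device would output prompt information asking
        # the user to input a new air gesture.
        return None
    # In range: package the operation data (here: little-endian float speed
    # plus int direction code) and report it to the framework layer.
    return struct.pack("<fi", speed, direction)
```

A gesture at 2.0 units/s would be packed and reported; one at 9.0 would trigger the retry prompt instead.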
Optionally, the step of simulating, according to the operation data and a pre-stored data protocol, the touch event sequence corresponding to the air gesture includes:
the framework layer of the wearable device acquires the sliding speed and the sliding direction from the operation data;
when the sliding direction is upward or downward, the framework layer obtains the longitudinal screen size of the wearable device in the use state and calculates the sliding distance according to the longitudinal screen size and the sliding speed; when the sliding direction is left or right, the framework layer obtains the transverse screen size of the wearable device in the use state and calculates the sliding distance according to the transverse screen size and the sliding speed;
according to the sliding distance and an acceleration algorithm in the pre-stored data protocol, a move event sequence corresponding to the air gesture is obtained;
and changing the first move event of the sequence into a down event and the last move event into an up event to obtain the touch event sequence corresponding to the air gesture.
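The event-simulation steps above can be sketched as a small function. The speed-to-distance scaling and the step count are illustrative assumptions; the patent's actual acceleration algorithm is part of its pre-stored data protocol and is not disclosed here.

```python
# Sketch: pick the screen dimension matching the sliding direction, derive a
# sliding distance from speed, emit a sequence of move events, then convert
# the first event to down and the last to up.

def simulate_touch_events(speed, direction, screen_w, screen_h, steps=10):
    # Longitudinal size for up/down, transverse size for left/right.
    if direction in ("up", "down"):
        extent = screen_h
    else:  # "left" or "right"
        extent = screen_w
    # Hypothetical speed-to-distance mapping, clamped to the screen extent.
    distance = min(extent, extent * speed / 5.0)

    # Evenly spaced move events along the sliding distance.
    events = [("move", i * distance / (steps - 1)) for i in range(steps)]
    # First move becomes a down event, last move becomes an up event.
    events[0] = ("down", events[0][1])
    events[-1] = ("up", events[-1][1])
    return events
```

For a 2.5 units/s upward gesture on a 320x360 screen, this yields a down event at offset 0, eight intermediate move events, and an up event at offset 180.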
Optionally, the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device includes:
traversing a view tree corresponding to the current window;
when a first target view containing a preset gesture attribute value exists in the view tree, sending the touch event sequence to a first view control corresponding to the first target view;
the first view control obtains the operation instruction corresponding to the touch event sequence from a preset instruction-event mapping table, and executes the operation instruction.
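The view-tree traversal for the first target view can be sketched as a depth-first search. The class and attribute names below (`View`, `airGesture`) are invented for illustration and are not from the patent.

```python
# Sketch: walk the current window's view tree and return the first view
# carrying the preset gesture attribute value, to which the touch event
# sequence would then be dispatched.

class View:
    def __init__(self, name, attrs=None, children=()):
        self.name = name
        self.attrs = attrs or {}
        self.children = list(children)

def find_gesture_view(root, attr="airGesture", value=True):
    """Depth-first traversal for the first view with the gesture attribute."""
    if root.attrs.get(attr) == value:
        return root
    for child in root.children:
        hit = find_gesture_view(child, attr, value)
        if hit is not None:
            return hit
    return None  # no first target view: fall back to display settings
```

Returning `None` corresponds to the fallback branch described below, where the second target view is chosen from the views' display settings.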
Optionally, the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device includes:
traversing a view tree corresponding to the current window;
when a first target view containing a preset gesture attribute value does not exist in the view tree, acquiring the display settings of each view in the view tree;
determining a second target view in the view tree according to the display settings, and sending the touch event sequence to a second view control corresponding to the second target view;
and the second view control obtains the operation instruction corresponding to the touch event sequence from the preset instruction-event mapping table, and executes the operation instruction.
Optionally, after the step of sending the touch event sequence to the first view control corresponding to the first target view when the first target view including the preset gesture attribute value exists in the view tree, the method includes:
the first view control queries the preset instruction-event mapping table;
when no operation instruction corresponding to the touch event sequence exists in the preset instruction-event mapping table, the following step is executed: acquiring the display settings of each view in the view tree.
Optionally, the step of determining a second target view in the view tree according to the display settings and sending the touch event sequence to a second view control corresponding to the second target view includes:
taking as the second target view the view with the largest display size among the views in the view tree whose scroll tag records scrolling, wherein the display settings include the display size and the scroll tag;
and the framework layer sends the touch event sequence to the second view control corresponding to the second target view.
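The fallback selection above can be sketched as follows. The tuple layout and the use of on-screen area as "display size" are assumptions for illustration.

```python
# Sketch: when no view declares the gesture attribute, choose as second
# target the scrollable view with the largest display size; the touch event
# sequence would then be sent to that view's control.

def pick_second_target(views):
    """views: iterable of (name, width, height, scrollable) tuples."""
    scrollable = [v for v in views if v[3]]  # scroll tag records scrolling
    if not scrollable:
        return None
    # "Largest display size" is taken here as the largest on-screen area.
    return max(scrollable, key=lambda v: v[1] * v[2])[0]
```

Given a small button, a large scrollable list, and a narrow scrollable bar, the list wins as the second target view.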
Optionally, the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device includes:
determining an operation instruction corresponding to the air gesture according to the touch event sequence and display information of the current window of the wearable device;
and when the operation instruction is a sliding operation instruction, acquiring the sliding speed in the operation data, acquiring the display adjustment speed corresponding to the sliding speed in a preset sliding data table, and sliding according to the display adjustment speed.
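The sliding-speed lookup above can be sketched as a table scan. The breakpoints and adjustment values in the table are invented for illustration; the patent only states that a preset sliding data table maps sliding speed to a display adjustment speed.

```python
# Sketch: map the gesture's sliding speed to a display adjustment (scroll)
# speed via a preset sliding data table, clamping to the fastest entry.

# (upper bound of sliding-speed range, display adjustment speed) - hypothetical
SLIDING_TABLE = [(1.0, 4), (2.5, 10), (5.0, 24)]

def display_adjust_speed(sliding_speed):
    for upper, adjust in SLIDING_TABLE:
        if sliding_speed <= upper:
            return adjust
    return SLIDING_TABLE[-1][1]  # above all ranges: clamp to fastest
```

A slow gesture thus scrolls the display gently, while a fast one scrolls it proportionally faster.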
In addition, in order to achieve the above purpose, the present invention also provides a wearable device;
the wearable device includes: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein:
The computer program, when executed by the processor, implements the steps of the interaction method of the wearable device as described above.
In addition, in order to achieve the above object, the present invention also provides a computer storage medium;
the computer storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the interaction method of a wearable device as described above.
According to the interaction method of the wearable device, the wearable device, and the computer storage medium, when a sensor of the wearable device detects an air gesture, operation data corresponding to the air gesture are obtained; a touch event sequence corresponding to the air gesture is simulated according to the operation data and a pre-stored data protocol; and an operation instruction corresponding to the air gesture is determined and executed according to the touch event sequence and the display information of the current window of the wearable device. When the wearable device detects an air gesture, it simulates, from the operation data corresponding to the gesture, the touch event sequence to which the air gesture maps on the device; the wearable device then determines the operation instruction corresponding to that touch event sequence and executes it. Contactless operation of the wearable device is thereby realized, making its operation more intelligent and flexible.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic diagram of the hardware structure of one implementation of a wearable device according to an embodiment of the present invention;
Fig. 2 is a hardware schematic diagram of one implementation of a wearable device provided in an embodiment of the present application;
Fig. 3 is a hardware schematic diagram of one implementation of a wearable device provided in an embodiment of the present application;
Fig. 4 is a hardware schematic diagram of one implementation of a wearable device provided in an embodiment of the present application;
Fig. 5 is a schematic flow chart of a first embodiment of the interaction method of the wearable device of the present invention;
Fig. 6 is a schematic flow chart of a second embodiment of the interaction method of the wearable device of the present invention;
Fig. 7 is a hardware schematic diagram of one implementation of a wearable device provided in an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In the following description, suffixes such as "module", "component", or "unit" for representing elements are used only for facilitating the description of the present invention, and have no specific meaning per se. Thus, "module," "component," or "unit" may be used in combination.
The wearable device provided by the embodiments of the present invention includes mobile terminals such as a smart band, a smart watch, and a smartphone. With the continuous development of screen technology, mobile terminals such as smartphones can also serve as wearable devices owing to the appearance of screen forms such as flexible screens and folding screens. The wearable device provided in the embodiments of the invention may include: an RF (Radio Frequency) unit, a WiFi module, an audio output unit, an A/V (audio/video) input unit, a sensor, a display unit, a user input unit, an interface unit, a memory, a processor, and a power supply.
In the following description, a wearable device will be taken as an example, please refer to fig. 1, which is a schematic hardware structure of a wearable device implementing various embodiments of the present invention, where the wearable device 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an a/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the wearable device structure shown in fig. 1 does not constitute a limitation of the wearable device, and that the wearable device may include more or fewer components than shown, or may combine certain components, or may have a different arrangement of components.
The following describes the components of the wearable device in detail with reference to fig. 1:
the radio frequency unit 101 may be used to receive and send information, or to send and receive signals during a call. Specifically, the radio frequency unit 101 may send uplink information to the base station, and may forward downlink information sent by the base station to the processor 110 of the wearable device for processing. The downlink information sent by the base station to the radio frequency unit 101 may be generated according to the uplink information sent by the radio frequency unit 101, or may be actively pushed after the base station detects that the information of the wearable device has been updated. For example, after detecting that the geographic position of the wearable device has changed, the base station may send a notification of the change to the radio frequency unit 101 of the wearable device; after receiving the notification, the radio frequency unit 101 may send it to the processor 110 of the wearable device for processing, and the processor 110 may control the notification to be displayed on the display panel 1061 of the wearable device. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
In addition, the radio frequency unit 101 may also communicate with a network and other devices through wireless communication, specifically including wireless communication with a server in a network system. For example, the wearable device can download file resources, such as an application program, from the server over the wireless link; after the wearable device finishes downloading an application program, if the corresponding file resources on the server are updated, the server can push a resource-update message notification to the wearable device to remind the user to update the application. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex-Long Term Evolution), and TDD-LTE (Time Division Duplex-Long Term Evolution), etc.
In one embodiment, the wearable device 100 may access an existing communication network by inserting a SIM card.
In another embodiment, the wearable device 100 may access an existing communication network through an eSIM (embedded SIM) card; adopting an eSIM card saves internal space of the wearable device and allows its thickness to be reduced.
Although fig. 1 shows the radio frequency unit 101, it is to be understood that the radio frequency unit 101 is not an essential component of the wearable device and may be omitted entirely, as required, within a range that does not change the essence of the invention. The wearable device 100 may instead establish a communication connection with other devices or communication networks through the WiFi module 102 alone, which is not limited by the embodiment of the present invention.
WiFi belongs to a short-distance wireless transmission technology, and the wearable device can help a user to send and receive emails, browse webpages, access streaming media and the like through the WiFi module 102, so that wireless broadband Internet access is provided for the user. Although fig. 1 shows a WiFi module 102, it is understood that it does not belong to the necessary constitution of the wearable device, and can be omitted entirely as needed within the scope of not changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the wearable device 100 is in a call signal reception mode, a talk mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the wearable device 100. The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive an audio or video signal. The A/V input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or other storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound into audio data. In the case of a telephone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to the mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting the audio signal.
As shown in fig. 1, the memory 109, which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and an interactive program of the wearable device, and the processor 110 may be configured to call the interactive program of the wearable device stored in the memory 109, and perform the following steps:
when a sensor of the wearable device detects an air gesture, acquiring operation data corresponding to the air gesture;
simulating according to the operation data and a pre-stored data protocol to obtain a touch event sequence corresponding to the air gesture;
and determining and executing an operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device.
Further, the processor 110 may be configured to invoke an interaction program of the wearable device stored in the memory 109, and further execute the step of acquiring operation data corresponding to the air gesture when the sensor of the wearable device detects the air gesture, where the step includes:
the sensor acquires the sliding speed in the operation data and judges whether the sliding speed is within a preset speed interval;
when the sliding speed is not within the preset speed interval, the wearable device outputs prompt information to prompt the user to input a new air gesture;
and when the sliding speed is within the preset speed interval, the sensor packages the operation data into a binary format and reports it to a framework layer.
Further, the processor 110 may be configured to invoke an interaction program of the wearable device stored in the memory 109, and further execute the step of performing the simulation according to the operation data and a pre-stored data protocol to obtain the touch event sequence corresponding to the air gesture, where the step includes:
the framework layer of the wearable device acquires the sliding speed and the sliding direction from the operation data;
when the sliding direction is upward or downward, the framework layer obtains the longitudinal screen size of the wearable device in the use state and calculates the sliding distance according to the longitudinal screen size and the sliding speed; when the sliding direction is left or right, the framework layer obtains the transverse screen size of the wearable device in the use state and calculates the sliding distance according to the transverse screen size and the sliding speed;
according to the sliding distance and an acceleration algorithm in the pre-stored data protocol, a move event sequence corresponding to the air gesture is obtained;
and changing the first move event of the sequence into a down event and the last move event into an up event to obtain the touch event sequence corresponding to the air gesture.
Further, the processor 110 may be configured to invoke the interaction program of the wearable device stored in the memory 109, and further execute the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device, where the step includes:
traversing a view tree corresponding to the current window;
when a first target view containing a preset gesture attribute value exists in the view tree, the touch event sequence is sent to a first view control corresponding to the first target view;
the first view control obtains the operation instruction corresponding to the touch event sequence from the preset instruction-event mapping table, and executes the operation instruction.
Further, the processor 110 may be configured to invoke the interaction program of the wearable device stored in the memory 109, and further execute the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device, where the step includes:
traversing a view tree corresponding to the current window;
when a first target view containing a preset gesture attribute value does not exist in the view tree, acquiring the display settings of each view in the view tree;
determining a second target view in the view tree according to the display settings, and sending the touch event sequence to a second view control corresponding to the second target view;
and the second view control obtains the operation instruction corresponding to the touch event sequence from the preset instruction-event mapping table, and executes the operation instruction.
Further, the processor 110 may be configured to invoke an interaction program of the wearable device stored in the memory 109, and further execute the step of sending the touch event sequence to a first view control corresponding to a first target view when the first target view including a preset gesture attribute value exists in the view tree, where the step includes:
the first view control queries the preset instruction-event mapping table;
when no operation instruction corresponding to the touch event sequence exists in the preset instruction-event mapping table, the following step is executed: acquiring the display settings of each view in the view tree.
Further, the processor 110 may be configured to invoke an interactive program of the wearable device stored in the memory 109, and further execute the steps of determining a second target view in the view tree according to the display setting, and sending the touch event sequence to a second view control corresponding to the second target view, where the step includes:
taking as the second target view the view with the largest display size among the views in the view tree whose scroll tag records scrolling, wherein the display settings include the display size and the scroll tag;
and the framework layer sends the touch event sequence to the second view control corresponding to the second target view.
Further, the processor 110 may be configured to invoke the interaction program of the wearable device stored in the memory 109, and further execute the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device, where the step includes:
determining an operation instruction corresponding to the air gesture according to the touch event sequence and display information of the current window of the wearable device;
and when the operation instruction is a sliding operation instruction, acquiring the sliding speed in the operation data, acquiring the display adjustment speed corresponding to the sliding speed in a preset sliding data table, and sliding according to the display adjustment speed.
In one embodiment, the wearable device 100 includes one or more cameras, and by opening the cameras, capturing of images, photographing, video recording and other functions can be achieved, and the positions of the cameras can be set according to needs.
The wearable device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the wearable device 100 moves to the ear. As one type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (typically three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize device attitude (such as portrait-landscape switching, related games, and magnetometer attitude calibration), for vibration-recognition-related functions (such as a pedometer or tap detection), and the like.
In one embodiment, the wearable device 100 further includes a proximity sensor, by employing the proximity sensor, the wearable device can achieve non-contact manipulation, providing more modes of operation.
In one embodiment, the wearable device 100 further comprises a heart rate sensor, which when worn, enables detection of heart rate by being in close proximity to the user.
In one embodiment, the wearable device 100 may further include a fingerprint sensor, and by reading the fingerprint, security verification and the like can be achieved.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
In one embodiment, the display panel 1061 is a flexible display screen; a wearable device employing a flexible display screen can bend when worn and therefore fit better. Optionally, the flexible display screen may be an OLED panel or a graphene panel; in other embodiments, the flexible display screen may also use other display materials, and this embodiment is not limited thereto.
In one embodiment, the display panel 1061 of the wearable device may take a rectangular shape to facilitate wrapping when worn. In other embodiments, other approaches may be taken as well.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the wearable device. In particular, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 1071 using any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent from the processor 110. Further, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not specifically limited herein.
In one embodiment, the sides of the wearable device 100 may be provided with one or more buttons. The buttons may support multiple interaction modes, such as short press, long press, and rotation, thereby enabling a variety of operation effects. Multiple buttons may be provided, and different buttons may be used in combination to realize multiple operation functions.
Further, the touch panel 1071 may overlay the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the touch operation to the processor 110 to determine the type of the touch event sequence, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event sequence. Although in fig. 1 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the wearable device, in some embodiments the touch panel 1071 may be integrated with the display panel 1061 to implement both functions, which is not limited herein. For example, when a message notification of a certain application is received through the rf unit 101, the processor 110 may control the message notification to be displayed in a preset area of the display panel 1061, where the preset area corresponds to a certain area of the touch panel 1071; the message notification displayed in the corresponding area of the display panel 1061 may then be controlled by performing a touch operation on that area of the touch panel 1071.
The interface unit 108 serves as an interface through which at least one external device can be connected with the wearable apparatus 100. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the wearable apparatus 100 or may be used to transmit data between the wearable apparatus 100 and the external device.
In one embodiment, the interface unit 108 of the wearable device 100 adopts a contact structure and is connected with other corresponding devices through the contacts, thereby realizing functions such as charging and data connection. The contacts may also be waterproof.
Memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area; the storage program area may store an operating system and application programs required for at least one function (such as a sound playing function, an image playing function, etc.), while the storage data area may store data created according to the use of the wearable device (such as audio data, a phonebook, etc.). In addition, memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the wearable device, connects various parts of the entire wearable device with various interfaces and lines, and performs various functions and processes data of the wearable device by running or executing software programs and/or modules stored in the memory 109, and calling data stored in the memory 109, thereby performing overall monitoring of the wearable device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The wearable device 100 may further include a power source 111 (such as a battery) for supplying power to each component, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system.
Although not shown in fig. 1, the wearable device 100 may further include a bluetooth module, etc., which will not be described herein. The wearable device 100 can be connected with other terminal devices through Bluetooth to realize communication and information interaction.
Fig. 2 to fig. 4 are schematic structural diagrams of a wearable device according to an embodiment of the present invention. The wearable device comprises a flexible screen. When the wearable device is unfolded, the flexible screen is in a strip shape; when the wearable device is in a wearing state, the flexible screen is bent to be annular. Fig. 2 and 3 show schematic structural diagrams of the wearable device screen when unfolded, and fig. 4 shows schematic structural diagrams of the wearable device screen when bent.
Based on the hardware embodiment of the wearable device described above, a first embodiment of an interaction method of a wearable device is provided; the interaction method is applied to the wearable device.
Referring to fig. 5, in a first embodiment of an interaction method of a wearable device of the present invention, the interaction method of the wearable device includes:
step S10, when a sensor of the wearable device detects an air gesture, operation data corresponding to the air gesture are obtained.
The wearable device includes at least one sensor. Specifically, the sensor of the wearable device in this embodiment may include one or more of an infrared sensor, an ambient light sensor, and a proximity sensor. The infrared sensor performs data processing using infrared rays and supports contactless temperature measurement; in this embodiment, the infrared sensor determines whether an air gesture exists through contactless measurement. The ambient light sensor can detect an air gesture according to changes in the brightness of ambient light: when a user performs a sliding operation above the dial of the wearable device, the ambient light around the wearable device is partially blocked and therefore changes; in this embodiment, the ambient light sensor of the wearable device detects this change and judges whether an air gesture exists according to the regularity of the change. The proximity sensor replaces contact-based detection such as a limit switch; it can detect the movement and presence information of an object and convert them into electrical signals to realize non-contact control, as with a capacitive proximity sensor; in this embodiment, the proximity sensor judges whether an air gesture exists according to the detected movement and presence information.
In this embodiment, when the sensor of the wearable device detects an air gesture, the sensor acquires operation data corresponding to the air gesture, where the operation data include, but are not limited to, one or more of a sliding time, a sliding direction, and a sliding speed (for ease of understanding, the sliding speed in this description is an average sliding speed).
And step S20, simulating according to the operation data and a pre-stored data protocol to obtain a touch event sequence corresponding to the air gesture.
In this embodiment, the sensor of the wearable device transmits the operation data to the data processing system of the wearable device for processing; the data processing system analyzes the operation data to obtain the sliding direction and sliding speed and simulates the corresponding touch event sequence. Specifically, step S20 includes:
step S21, the frame layer of the wearable device obtains the sliding speed and the sliding direction in the operation data.
The sensor transmits the collected operation data to the data processing system of the wearable device (the data processing system depends on the operating system of the device; for example, it may be analogous to an Android system), and the frame layer in the data processing system receives the operation data and acquires the sliding speed and the sliding direction from the operation data.
Step S22, when the sliding direction is upward or downward, the frame layer obtains the longitudinal screen size of the wearable device in the use state, and calculates the sliding distance according to the longitudinal screen size and the sliding speed; and when the sliding direction is left or right, the frame layer acquires the transverse screen size of the wearable device in the use state, and calculates the sliding distance according to the transverse screen size and the sliding speed.
When the frame layer determines that the sliding direction is upward or downward, the frame layer acquires the longitudinal screen size of the wearable device in the use state and calculates the sliding distance according to the longitudinal screen size and the sliding speed. For example, in a preset speed and distance mapping table of the wearable device, a sliding speed of 2 meters per second corresponds to a sliding distance of the full screen length, a sliding speed of 1.5 meters per second corresponds to three quarters of the screen length, and a sliding speed of 1 meter per second corresponds to half the screen length. If the screen of the current wearable device is a flexible screen with a longitudinal screen size of 4 cm and the sliding speed is 1.5 meters per second, the sliding distance is three quarters of the screen size, namely 3 cm.
Similarly, in this embodiment, when the sliding direction is left or right, the frame layer acquires the transverse screen size of the wearable device in the use state; after obtaining the transverse screen size, the frame layer looks up the sliding speed in the preset speed and distance mapping table and thereby calculates the sliding distance.
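The speed-to-distance lookup described above can be sketched as follows. This is an illustrative Python sketch, not the patent's actual frame-layer implementation; the table values (2 m/s for the full screen length, 1.5 m/s for three quarters, 1 m/s for half) come from the example in the text, while the nearest-entry lookup rule is an assumption.

```python
# Preset speed and distance mapping table: sliding speed (m/s) -> fraction
# of the screen length to slide. Values taken from the example above.
SPEED_TO_FRACTION = {2.0: 1.0, 1.5: 0.75, 1.0: 0.5}

def sliding_distance_cm(speed_mps, screen_size_cm):
    """Map a detected sliding speed to a sliding distance by looking up the
    closest preset speed and scaling the relevant screen size."""
    closest = min(SPEED_TO_FRACTION, key=lambda s: abs(s - speed_mps))
    return SPEED_TO_FRACTION[closest] * screen_size_cm

# Example from the text: longitudinal screen size 4 cm at 1.5 m/s -> 3 cm.
```

The same function serves both branches of step S22: the caller passes the longitudinal screen size for up/down gestures and the transverse screen size for left/right gestures.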
Step S23, according to the sliding distance and an acceleration algorithm in a pre-stored data protocol, a move event sequence corresponding to the air gesture is obtained.
A data protocol is pre-stored in the frame layer, and the pre-stored data protocol includes a preset acceleration algorithm; with this acceleration algorithm, the frame layer can accurately simulate the user's operation and obtain the move event sequence corresponding to the air gesture. For example, suppose the acceleration algorithm presets the acceleration of a user's air gesture to 1 meter per second squared, the sliding distance is 4 cm, one move event is generated every 0.01 second, and the initial speed in the operation data is 0. The frame layer then determines, according to the sliding distance and the acceleration algorithm in the pre-stored data protocol, that the number of move events collected by the wearable device is 20, and the 20 move events are arranged in chronological order to form the move event sequence.
And step S24, changing the first move event in the move event sequence into a down event and the last move event into an up event, to obtain a touch event sequence corresponding to the air gesture.
The frame layer changes the move event ordered first in the move event sequence into a down event, i.e., the user pressing the screen of the wearable device, and changes the last move event into an up event, i.e., the user leaving the screen of the wearable device; the down event, the series of move events, and the up event together form the touch event sequence corresponding to the air gesture.
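The construction of the touch event sequence in steps S23 and S24 can be sketched as follows. This is a hedged Python illustration, not the actual framework code: the uniform-acceleration model (d = a·t²/2), the 0.01 s event interval, and the relabeling of the first and last move events as down and up follow the description above, while the `TouchEvent` structure itself is an assumed stand-in.

```python
import math
from dataclasses import dataclass

@dataclass
class TouchEvent:
    action: str   # "down", "move", or "up"
    t: float      # timestamp in seconds since gesture start
    pos_m: float  # simulated position along the sliding axis, in meters

def simulate_touch_sequence(distance_m, accel=1.0, dt=0.01):
    """Generate one move event every `dt` seconds for a swipe that starts
    from rest and accelerates at `accel` m/s^2 until `distance_m` is
    covered (d = a*t^2/2), then relabel the first event as a down event
    and the last as an up event."""
    total_t = math.sqrt(2 * distance_m / accel)  # time to cover the distance
    n = max(2, int(total_t / dt))                # number of simulated events
    events = [TouchEvent("move", i * dt, 0.5 * accel * (i * dt) ** 2)
              for i in range(n)]
    events[0].action = "down"
    events[-1].action = "up"
    return events
```

On an Android-like system the resulting sequence would correspond to injected `MotionEvent`s with `ACTION_DOWN`, `ACTION_MOVE`, and `ACTION_UP` actions.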
Step S30, determining and executing an operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device.
The frame layer of the wearable device acquires the touch event sequence. A preset instruction and event mapping table is stored in the frame layer, which records the mapping relations between operation instructions and touch event sequences; for example, the preset instruction and event mapping table may record that touch event sequence 1 corresponds to increasing the volume by 5. The frame layer also acquires the display information corresponding to the current window, and determines the operation instruction according to the touch event sequence and that display information.
When the wearable device detects an air gesture, the wearable device performs a simulation according to the operation data corresponding to the air gesture, thereby obtaining the touch event sequence to which the air gesture maps on the wearable device; the wearable device then determines the operation instruction corresponding to the touch event sequence and executes it. Contactless operation of the wearable device is thus realized, making the operation of the wearable device more intelligent and flexible.
In addition, in order to improve the user's operation experience, a sliding data table is preset in the wearable device, in which different sliding speeds correspond to different display adjustment speeds; for example, a sliding speed of 1 meter per second corresponds to a display adjustment speed of 1 centimeter per second. Specifically, when the operation instruction is a sliding operation instruction, the frame layer acquires the sliding speed in the operation data, obtains the display adjustment speed corresponding to that sliding speed from the preset sliding data table, and scrolls the display at that adjustment speed. In this embodiment, the display adjustment speed is determined according to the operation speed of the air gesture, which makes the displayed content more convenient for the user to view.
Further, referring to fig. 6, a second embodiment of the interaction method of the wearable device of the present invention is presented on the basis of the first embodiment of the present invention.
The present embodiment describes steps performed after step S10 of the first embodiment, in which the sensor determines the validity of the operation data. Specifically, the interaction method of the wearable device includes:
step S40, the sensor acquires the sliding speed in the operation data, and determines whether the sliding speed is within a preset speed interval.
The sensor acquires the sliding speed in the operation data, and judges whether the sliding speed is in a preset speed interval, wherein the preset speed interval is a preset sliding speed interval, and the preset speed interval can be flexibly set according to user requirements, for example, the preset speed interval is set to be 1 meter per second to 2 meters per second.
And step S50, when the sliding speed is not within the preset speed interval, the wearable device outputs prompt information to prompt the user of the wearable device to input a new air gesture.
When the sensor determines that the sliding speed is not within the preset speed interval, i.e., the sliding speed of the air gesture is too high or too low, the sensor determines that the operation data are invalid; the sensor of the wearable device then sends this result to the processor of the wearable device, and the processor, upon receiving the information that the operation data are invalid, outputs prompt information to prompt the user to input a new air gesture.
And step S60, when the sliding speed is in the preset speed interval, the sensor packages the operation data into a binary format and reports the binary format to a frame layer.
When the sensor determines that the sliding speed is within the preset speed interval, the sensor converts the operation data into a binary format, encapsulates them, and reports them to the frame layer. In this embodiment, the sensor analyzes the operation data and eliminates invalid operation data, making the data processing of air gestures more intelligent; meanwhile, when the sensor determines that the operation data are valid, it converts the operation data into binary data, which reduces the amount of data transmitted and makes data processing more convenient.
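Steps S40 to S60 can be sketched as follows. This is an illustrative sketch only: the preset speed interval of 1 to 2 m/s is taken from the earlier example, and the binary record layout (a 1-byte direction code plus two little-endian 32-bit floats) is an assumption chosen for illustration, not a format specified by the patent.

```python
import struct

# Assumed preset speed interval (m/s), taken from the example in the text.
SPEED_MIN, SPEED_MAX = 1.0, 2.0
DIRECTION_CODES = {"up": 0, "down": 1, "left": 2, "right": 3}

def validate_and_pack(direction, speed, duration):
    """Reject sliding speeds outside the preset interval; otherwise
    encapsulate the operation data as a compact binary record for
    reporting to the frame layer. Returns None for invalid data."""
    if not (SPEED_MIN <= speed <= SPEED_MAX):
        return None  # caller then prompts the user for a new air gesture
    return struct.pack("<Bff", DIRECTION_CODES[direction], speed, duration)
```

A speed of 2.5 m/s would be rejected, while a 1.5 m/s upward swipe packs into a 9-byte record.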
Further, a third embodiment of the interaction method of the wearable device of the present invention is provided on the basis of the above embodiment of the present invention; this embodiment is a refinement of step S30 in the first embodiment.
In this embodiment, an implementation manner of determining an operation instruction according to a touch event sequence is specifically described, and in particular, the interaction method of the wearable device includes:
step S31, traversing the view tree corresponding to the current window.
The frame layer acquires the view tree corresponding to the current window and traverses it; that is, the frame layer visits the view nodes at every level of the view tree, expanding the view tree recursively through all levels to obtain the predefined values, views, styles, and so on of each view node.
Step S32, when there is a first target view including a preset gesture attribute value in the view tree, the touch event sequence is sent to a first view control corresponding to the first target view.
A preset attribute value set is stored in the frame layer, and the preset attribute value set contains a plurality of preset gesture attribute values. The frame layer acquires the attributes corresponding to each view in the view tree and judges whether those attributes contain a preset gesture attribute value from the preset attribute value set. When the attributes corresponding to a certain view in the view tree contain a preset gesture attribute value, the frame layer takes that view as the first target view and sends the touch event sequence to the first view control corresponding to the first target view, where the first view control can be understood as the adjustment program corresponding to the first target view.
For example, a song playing page is displayed in the current window of the wearable device, and the view tree corresponding to the song playing page includes a lyrics display view, a title display view, a song adjustment view, and a volume adjustment view. The frame layer traverses the view tree corresponding to the song playing page and determines that the volume adjustment view contains a preset gesture attribute value; the frame layer then sends the received operation data (sliding direction: upward; sliding speed: 1.2 meters per second) to the view control corresponding to the volume adjustment view.
Step S33, the first view control obtains the operation instruction corresponding to the touch event sequence from the preset instruction and event mapping table, and executes the operation instruction.
The preset instruction and event mapping table in the first view control is a preset mapping between operation instructions and touch event sequences; for example, the table may record that touch event sequence 1 corresponds to increasing the volume by 5. The first view control obtains the operation instruction corresponding to the touch event sequence from the mapping table and executes it.
This embodiment supports user-defined gesture operations. An attribute value set containing preset gesture attribute values is preset in the frame layer; when the frame layer detects that a target view containing a preset gesture attribute exists in the view tree, the adjustment operation is performed on that target view. That is, in this embodiment, preset gesture attribute values can be set on special interfaces such as music playing or a gallery to support special operations, for example, sliding up and down on the gallery interface to zoom a picture in or out, and sliding left and right to switch pictures, making the control of the wearable device more intelligent.
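The view-tree traversal of steps S31 and S32 can be sketched as follows. This is an assumed, simplified illustration: the `View` structure is a stand-in for the real view hierarchy, and the attribute names in `PRESET_GESTURE_ATTRS` are hypothetical, since the patent does not name the actual attribute values.

```python
from dataclasses import dataclass, field

# Hypothetical preset gesture attribute values; real names are not specified.
PRESET_GESTURE_ATTRS = {"air_gesture_volume", "air_gesture_zoom"}

@dataclass
class View:
    name: str
    attrs: set = field(default_factory=set)
    children: list = field(default_factory=list)

def find_first_target(root):
    """Depth-first traversal of the view tree; return the first view whose
    attributes contain a preset gesture attribute value, or None."""
    if root.attrs & PRESET_GESTURE_ATTRS:
        return root
    for child in root.children:
        hit = find_first_target(child)
        if hit is not None:
            return hit
    return None
```

For the song playing page example, a traversal over a tree containing a volume adjustment view tagged with a preset gesture attribute would return that view as the first target view.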
Further, another implementation manner of determining an operation instruction according to a touch event sequence is further provided in this embodiment, and specifically, the interaction method of the wearable device includes:
step S34, when the first target view including the preset gesture attribute value does not exist in the view tree, obtaining a display setting of each view in the view tree.
When the frame layer determines that no first target view containing a preset gesture attribute value exists in the view tree, the frame layer acquires the display settings of each view in the view tree, where the display settings include a display size and a scrollable flag. For example, the current window is a time setting window that includes a date setting view, an hour setting view, and a minute setting view. The display size of the date setting view is 0.5 cm long by 0.5 cm wide, and its scrollable flag records that it is scrollable; the display size of the hour setting view is 1 cm long by 0.5 cm wide, and its scrollable flag records that it is scrollable; the display size of the minute setting view is 0.5 cm long by 0.5 cm wide, and its scrollable flag records that it is scrollable.
And step S35, determining a second target view in the view tree according to the display setting, and sending the touch event sequence to a second view control corresponding to the second target view.
Specifically, step S35 includes:
step a, taking the view in the view tree that has the largest display size and whose scrollable flag records that it is scrollable as the second target view;
and b, the framework layer sends the touch event sequence to a second view control corresponding to the second target view.
Step S37, the second view control obtains the operation instruction corresponding to the touch event sequence from the preset instruction and event mapping table, and executes the operation instruction.
The preset instruction and event mapping table in the second view control is a preset mapping between operation instructions and touch event sequences; for example, the table may record that touch event sequence 2 corresponds to sliding by 1 cm. The second view control obtains the operation instruction corresponding to the touch event sequence from the preset instruction and event mapping table and executes it.
In this embodiment, the air gesture may be directed to the largest scrollable view in the current window, so that the displayed content can be slid to the greatest extent. By traversing the view tree, the visibility and scrollability of each view layer are judged; after the optimal view is obtained, the frame layer issues the touch event sequence to the view control corresponding to the optimal view for processing, which executes the operation instruction corresponding to the air gesture, thereby realizing the up-down or left-right sliding effect on the wearable device.
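The fallback selection of steps S34 and S35 (step a above) can be sketched as follows. This is an illustrative assumption: the simplified `View` structure stands in for the real view tree, and using area as the measure of "largest display size" is one reasonable reading of the rule, not necessarily the patent's exact criterion.

```python
from dataclasses import dataclass, field

@dataclass
class View:
    name: str
    width_cm: float = 0.0
    height_cm: float = 0.0
    scrollable: bool = False
    children: list = field(default_factory=list)

def _flatten(root):
    """Collect every view in the tree into a flat list."""
    views = [root]
    for child in root.children:
        views.extend(_flatten(child))
    return views

def find_second_target(root):
    """Among all views whose scrollable flag is set, pick the one with the
    largest display size; return None if no view is scrollable."""
    candidates = [v for v in _flatten(root) if v.scrollable]
    if not candidates:
        return None
    return max(candidates, key=lambda v: v.width_cm * v.height_cm)
```

Applied to the time setting window example (date 0.5 × 0.5 cm, hour 1 × 0.5 cm, minute 0.5 × 0.5 cm, all scrollable), the hour setting view would be selected as the second target view.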
It should be noted that step S33 of this embodiment further includes, after the first view control looks up the operation instruction corresponding to the touch event sequence in the preset instruction and event mapping table: when no operation instruction corresponding to the touch event sequence exists in the preset instruction and event mapping table, executing step S34: acquiring the display settings of each view in the view tree.
For example, the first target view is a volume adjustment view with a corresponding first view control, and the preset instruction and event mapping table in the first view control includes: 1. an operation instruction of increasing the volume, corresponding to an upward-sliding touch event sequence; 2. an operation instruction of decreasing the volume, corresponding to a downward-sliding touch event sequence. If the touch event sequence obtained by simulating the operation data corresponding to the air gesture is a leftward slide, the first view control determines that no operation instruction corresponding to the touch event sequence exists in the preset instruction and event mapping table, and step S34 is then executed: acquiring the display settings of each view in the view tree. In this embodiment, the validity of the user's operation is ensured by traversing the views.
Based on the above embodiments, it can be seen that if the device is a wristwatch, a bracelet, or another wearable device, the screen of the device may or may not cover the watchband area of the device. The invention proposes an alternative embodiment in which the device may be a wristwatch, a bracelet, or another wearable device comprising a screen and a connection piece. The screen may be a flexible screen, and the connection piece may be a wristband. Optionally, the screen of the device or the display area of the screen may be partially or fully overlaid on the wristband of the device. Fig. 7 is a schematic hardware diagram of an implementation of a wearable device according to an embodiment of the present application, in which the screen of the device extends to both sides and part of the screen covers the watchband of the device. In other embodiments, the screen of the device may also be entirely overlaid on the wristband.
In addition, the embodiment of the invention also provides a computer storage medium.
The computer storage medium stores a computer program, which when executed by a processor, implements the operations in the interaction method of the wearable device provided in the foregoing embodiment.
It should be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity/operation/object from another entity/operation/object without necessarily requiring or implying any actual such relationship or order between such entities/operations/objects; the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points. The apparatus embodiments described above are merely illustrative, in which the units illustrated as separate components may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the objectives of the present invention. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a wearable device (which may be a wristwatch, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (6)

1. An interaction method of a wearable device is characterized by comprising the following steps:
When a sensor of the wearable device detects an air gesture, acquiring operation data corresponding to the air gesture;
simulating according to the operation data and a pre-stored data protocol to obtain a touch event sequence corresponding to the air gesture;
determining and executing an operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device;
wherein,
when the sensor of the wearable device detects an air gesture, after the step of acquiring operation data corresponding to the air gesture, the method comprises the following steps:
the sensor acquires the sliding speed in the operation data and judges whether the sliding speed is in a preset speed interval or not;
when the sliding speed is not in the preset speed interval, the wearable device outputs prompt information to prompt the user of the wearable device to input a new air gesture;
when the sliding speed is in the preset speed interval, the sensor packages the operation data into a binary format and reports the binary format to a frame layer;
the step of simulating according to the operation data and a pre-stored data protocol to obtain a touch event sequence corresponding to the aerial gesture comprises the following steps:
The frame layer of the wearable device acquires the sliding speed and the sliding direction in the operation data;
when the sliding direction is upward or downward, the frame layer obtains the longitudinal screen size of the wearable device in the use state, and calculates the sliding distance according to the longitudinal screen size and the sliding speed; when the sliding direction is left or right, the frame layer acquires the transverse screen size of the wearable device in the use state, and calculates the sliding distance according to the transverse screen size and the sliding speed;
according to the sliding distance and an acceleration algorithm in a pre-stored data protocol, a move event sequence corresponding to the air gesture is obtained;
changing the first move event in the move event sequence into a down event, and changing the last move event into an up event, to obtain a touch event sequence corresponding to the air gesture;
the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device comprises the following steps:
traversing a view tree corresponding to the current window;
when a first target view containing a preset gesture attribute value exists in the view tree, sending the touch event sequence to a first view control corresponding to the first target view;
the first view control acquires, from a preset instruction-and-event mapping table, the operation instruction corresponding to the touch event sequence, and executes the operation instruction;
or,
when a first target view containing a preset gesture attribute value does not exist in the view tree, acquiring display settings of each view in the view tree;
determining a second target view in a view tree according to the display setting, and sending the touch event sequence to a second view control corresponding to the second target view;
and the second view control acquires, from the preset instruction-and-event mapping table, the operation instruction corresponding to the touch event sequence, and executes the operation instruction.
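The event synthesis in claim 1 — scaling a swipe's speed against the screen dimension for its direction and emitting a down/move…/up sequence — can be sketched as follows. This is an illustrative simplification, not the patented implementation: the patent's acceleration algorithm and data protocol are not disclosed here, so a linear interpolation from the screen centre stands in, and all names are hypothetical.

```python
def synthesize_touch_events(direction, speed, screen_w, screen_h, steps=5):
    """Simulate a touch event sequence (down, move..., up) from an
    air-gesture swipe, loosely following claim 1.

    direction: 'up' | 'down' | 'left' | 'right'
    speed:     normalized swipe speed in [0, 1] (an assumption)
    """
    # Vertical swipes scale against the longitudinal screen size,
    # horizontal swipes against the transverse screen size.
    if direction in ('up', 'down'):
        distance = screen_h * speed
        dx, dy = 0, (-distance if direction == 'up' else distance)
    else:
        distance = screen_w * speed
        dx, dy = (-distance if direction == 'left' else distance), 0

    cx, cy = screen_w / 2, screen_h / 2  # start at the screen centre
    points = [(cx + dx * i / steps, cy + dy * i / steps)
              for i in range(steps + 1)]

    # First event becomes DOWN, last becomes UP, the rest stay MOVE.
    events = [('move', x, y) for x, y in points]
    events[0] = ('down',) + points[0]
    events[-1] = ('up',) + points[-1]
    return events
```

On Android, the analogous real events would be `MotionEvent` objects with `ACTION_DOWN`, `ACTION_MOVE`, and `ACTION_UP` actions.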
2. The method for interaction of a wearable device according to claim 1, wherein the step of sending the touch event sequence to a first view control corresponding to a first target view when the first target view including a preset gesture attribute value exists in the view tree includes:
the first view control queries a preset instruction-and-event mapping table;
when no operation instruction corresponding to the touch event sequence exists in the preset instruction-and-event mapping table, executing the step of acquiring the display settings of each view in the view tree.
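Claim 2's lookup-with-fallback can be sketched as a table query that returns `None` when no instruction matches, signalling the caller to fall back to the display-setting based selection. The key shape and table contents are assumptions for illustration; the patent does not specify the mapping table's format.

```python
def lookup_instruction(event_sequence, mapping_table):
    """Claim 2 sketch: the first view control queries the
    instruction/event mapping table; None means "no match, fall
    back to selecting a target view by display settings"."""
    # Use the ordered event types as the lookup key,
    # e.g. ('down', 'move', 'up') -> 'scroll_up' (hypothetical).
    key = tuple(kind for kind, *_ in event_sequence)
    return mapping_table.get(key)
```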
3. The interaction method of the wearable device of claim 1, wherein the step of determining a second target view in a view tree according to the display setting and sending the touch event sequence to a second view control corresponding to the second target view comprises:
and taking the view in the view tree that has the largest display size and whose scroll tag records that it is scrollable as the second target view, wherein the display settings comprise: a display size and a scroll tag;
and the framework layer sends the touch event sequence to a second view control corresponding to the second target view.
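Claim 3's fallback selection — pick the largest view that is marked scrollable — could look like the sketch below. The view representation (dicts with `width`, `height`, and a `scrollable` flag) is an assumption standing in for the display settings the claim describes.

```python
def pick_second_target(views):
    """Claim 3 sketch: choose the largest scrollable view in the
    view tree as the second target view.

    views: list of dicts with 'width', 'height', 'scrollable'.
    Returns the matching view, or None if nothing is scrollable.
    """
    scrollable = [v for v in views if v.get('scrollable')]
    if not scrollable:
        return None
    # Largest display size = largest width x height area.
    return max(scrollable, key=lambda v: v['width'] * v['height'])
```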
4. The interaction method of the wearable device according to claim 1, wherein the step of determining and executing the operation instruction corresponding to the air gesture according to the touch event sequence and the display information of the current window of the wearable device includes:
determining an operation instruction corresponding to the air gesture according to the touch event sequence and display information of the current window of the wearable device;
and when the operation instruction is a sliding operation instruction, acquiring the sliding speed in the operation data, acquiring the display adjustment speed corresponding to the sliding speed in a preset sliding data table, and sliding according to the display adjustment speed.
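Claim 4's "preset sliding data table" maps a physical swipe speed to a display adjustment speed. The ranges and values below are invented purely to show the lookup shape; the patent does not disclose the table's contents or units.

```python
# Hypothetical table: swipe-speed ranges to a display scroll
# speed, standing in for the claim's "preset sliding data table".
SLIDE_TABLE = [
    (0.0, 10.0, 200),            # slow swipe  -> gentle scroll
    (10.0, 25.0, 600),           # medium swipe -> medium scroll
    (25.0, float('inf'), 1200),  # fast swipe  -> fast scroll
]

def display_adjust_speed(slide_speed):
    """Claim 4 sketch: look up the display adjustment speed for a
    given swipe speed in the preset sliding data table."""
    for lo, hi, adjust in SLIDE_TABLE:
        if lo <= slide_speed < hi:
            return adjust
    raise ValueError('slide speed must be non-negative')
```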
5. A wearable device, the wearable device comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein:
the computer program, when executed by the processor, implements the steps of the interaction method of a wearable device as claimed in any of claims 1 to 4.
6. A computer storage medium, characterized in that the computer storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the interaction method of a wearable device according to any of claims 1 to 4.
CN201910201464.1A 2019-03-15 2019-03-15 Interaction method of wearable device, wearable device and computer storage medium Active CN109947249B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910201464.1A CN109947249B (en) 2019-03-15 2019-03-15 Interaction method of wearable device, wearable device and computer storage medium

Publications (2)

Publication Number Publication Date
CN109947249A CN109947249A (en) 2019-06-28
CN109947249B true CN109947249B (en) 2024-03-19

Family

ID=67008984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910201464.1A Active CN109947249B (en) 2019-03-15 2019-03-15 Interaction method of wearable device, wearable device and computer storage medium

Country Status (1)

Country Link
CN (1) CN109947249B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111142655A (en) * 2019-12-10 2020-05-12 上海博泰悦臻电子设备制造有限公司 Interaction method, terminal and computer readable storage medium
CN110989844A (en) * 2019-12-16 2020-04-10 广东小天才科技有限公司 Input method, watch, system and storage medium based on ultrasonic waves
CN111255022A (en) * 2020-01-16 2020-06-09 珠海格力电器股份有限公司 Method and device for controlling water tank, storage medium and electronic device
CN112639689A (en) * 2020-04-30 2021-04-09 华为技术有限公司 Control method, device and system based on air-separating gesture
CN112162476A (en) * 2020-10-28 2021-01-01 广东小天才科技有限公司 Flexible surrounding screen smart watch and information interaction method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017152531A1 (en) * 2016-03-07 2017-09-14 中国科学院计算技术研究所 Ultrasonic wave-based air gesture recognition method and system
CN108874121A (en) * 2018-04-28 2018-11-23 努比亚技术有限公司 Control method, wearable device and the computer readable storage medium of wearable device
CN108920052A (en) * 2018-06-26 2018-11-30 Oppo广东移动通信有限公司 page display control method and related product

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Hong Li et al. WiFinger: talk to your smart devices with finger-grained gesture. UbiComp '16, 2016, full text. *

Also Published As

Publication number Publication date
CN109947249A (en) 2019-06-28

Similar Documents

Publication Publication Date Title
CN109947249B (en) Interaction method of wearable device, wearable device and computer storage medium
CN110096195B (en) Sports icon display method, wearable device and computer readable storage medium
CN110399195B (en) Desktop icon dynamic replacement method, equipment and computer readable storage medium
CN109982179A (en) Audio frequency signal output, device, wearable device and storage medium
CN110620875B (en) Screenshot control method, equipment and computer readable storage medium in video shooting process
CN109933400B (en) Display interface layout method, wearable device and computer readable storage medium
CN110177209B (en) Video parameter regulation and control method, device and computer readable storage medium
CN109933294B (en) Data processing method and device, wearable device and storage medium
CN110175066A (en) Wearable device, interaction control method and computer readable storage medium
CN110083289A (en) A kind of button display methods, wearable device and computer readable storage medium
CN110086929A (en) Breath screen display methods, mobile phone, wearable device and computer readable storage medium
CN110177208B (en) Video recording association control method, equipment and computer readable storage medium
CN110399196B (en) Wearable device, interface switching implementation method thereof and computer readable storage medium
CN110069193B (en) Interface switching method of wearable device, wearable device and storage medium
CN110072071B (en) Video recording interaction control method, equipment and computer readable storage medium
CN110113529B (en) Shooting parameter regulation and control method and device and computer readable storage medium
CN110162369B (en) Wearable device, icon arrangement method thereof and computer readable storage medium
CN109634503B (en) Operation response method and mobile terminal
CN110191230A (en) Application interface display methods, mobile terminal and readable storage medium storing program for executing
CN110109603A (en) A kind of page operation method, wearable device and computer readable storage medium
CN110650289B (en) Shooting depth of field control method, equipment and computer readable storage medium
CN110620876B (en) Image preview interaction method, device and computer readable storage medium
CN110058918B (en) Picture processing method, wearable device and computer readable storage medium
CN110187950B (en) Method for adjusting picture display position, wearable device and storage medium
CN110096150B (en) Search interaction control method, equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant