CN106354415B - Terminal and method for recognizing user gesture - Google Patents


Info

Publication number
CN106354415B
CN106354415B (application number CN201610882626.9A)
Authority
CN
China
Prior art keywords
pressure
change
pressure data
terminal
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610882626.9A
Other languages
Chinese (zh)
Other versions
CN106354415A (en)
Inventor
程瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ruian Brilliant Network Technology Co., Ltd
Original Assignee
Ruian Brilliant Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ruian Brilliant Network Technology Co Ltd filed Critical Ruian Brilliant Network Technology Co Ltd
Priority to CN201610882626.9A priority Critical patent/CN106354415B/en
Publication of CN106354415A publication Critical patent/CN106354415A/en
Application granted granted Critical
Publication of CN106354415B publication Critical patent/CN106354415B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72484: User interfaces specially adapted for cordless or mobile telephones wherein functions are triggered by incoming communication events
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The embodiment of the invention discloses a terminal and a method for recognizing user gestures. The terminal includes: an acquisition unit configured to collect at least one piece of pressure data obtained while a first body attached to a first position of the terminal is in different motion states, the pressure data coming from pressure collection points arranged at the first position and a second position of the terminal; a data analysis unit configured to partition the at least one piece of pressure data by corresponding region and, as the collected pressure data changes, obtain the pressure change value and the pressure distribution rule for each region; a modeling unit configured to build a first model from the pressure change values and the pressure distribution rules of the regions; and a recognition unit configured to recognize the current user's gesture change from the pressure changes simulated by the first model.

Description

Terminal and method for recognizing user gesture
Technical Field
The present invention relates to recognition technology, and in particular to a terminal and a method for recognizing user gestures.
Background
Wearable electronic devices such as smartwatches and smart bands are now widely used. With a wearable device, a user can make calls, browse the internet, or navigate anytime and anywhere. However, the screen of a wearable device such as a smartwatch or band is small and cannot satisfy the user's need for quick, convenient operations such as answering a call, placing an emergency call, or viewing information. Generating operation instructions from gesture changes enables convenient operation without being limited by screen size, so accurately recognizing a user's gesture change is a technical problem to be solved; however, no effective solution to this problem exists in the related art.
Disclosure of Invention
The embodiments of the invention provide a terminal and a method for recognizing user gestures, which at least solve the above technical problem.
A terminal according to an embodiment of the present invention includes:
an acquisition unit configured to collect at least one piece of pressure data obtained while a first body attached to a first position of the terminal is in different motion states, the pressure data coming from pressure collection points arranged at the first position and a second position of the terminal;
a data analysis unit configured to partition the at least one piece of pressure data by corresponding region and, as the collected pressure data changes, obtain the pressure change value and the pressure distribution rule for each region;
a modeling unit configured to build a first model from the pressure change values and the pressure distribution rules of the regions; and
a recognition unit configured to recognize the current user's gesture change from the pressure changes simulated by the first model.
In the foregoing solution, the terminal further includes:
a coordinate system establishing unit configured to establish a coordinate system over the whole area corresponding to the first position of the terminal, take the center of that area as the coordinate origin, and divide the area into four regions by a horizontal x axis and a vertical y axis;
a first determining unit configured to record pressure data collected by pressure collection points distributed in the positive direction of the y axis as positive values; and
a second determining unit configured to record pressure data collected by pressure collection points distributed in the negative direction of the y axis as negative values.
In the foregoing solution, the data analysis unit is further configured to:
partition the at least one piece of pressure data into the corresponding positions in the four regions according to the positions of the corresponding pressure collection points.
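For illustration only, the region partitioning and sign convention described above can be sketched as follows; the quadrant labels, the tie-breaking on the axes, and the sample format (x, y, pressure) are assumptions made for this sketch rather than details taken from the patent text.

```python
from collections import defaultdict

def partition_samples(samples):
    """Assign each (x, y, pressure) sample to one of the four regions.

    Per the stated convention, pressure collected on the positive-y side
    is recorded as a positive value and pressure on the negative-y side
    as a negative value. Region labels and tie-breaking on the axes are
    illustrative choices, not specified by the patent.
    """
    regions = defaultdict(list)
    for x, y, pressure in samples:
        signed = pressure if y >= 0 else -pressure
        if x >= 0 and y >= 0:
            label = "I"    # upper-right quadrant
        elif x < 0 and y >= 0:
            label = "II"   # upper-left quadrant
        elif x < 0:
            label = "III"  # lower-left quadrant
        else:
            label = "IV"   # lower-right quadrant
        regions[label].append(signed)
    return dict(regions)
```

A sample at (-2, -1) with pressure 4.0, for example, is filed under region III as -4.0, because it lies on the negative-y side of the origin.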
In the foregoing solution, the data analysis unit is further configured to:
when the first body attached to the first position of the terminal moves in the y-axis direction, recording the angle of deviation in the vertical direction as α, where α is positive when the first body moves upward along the y axis and negative when it moves downward along the y axis;
when the first body attached to the first position of the terminal moves in the x-axis direction, recording the angle of deviation in the horizontal direction as β, where β is negative when the first body moves left along the x axis and positive when it moves right along the x axis; and
recording, as the pressure data changes, how the values of α and β change over the four regions, and obtaining the pressure distribution rule that these angle changes conform to a trigonometric function model.
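As an illustration of the trigonometric function model mentioned above, one plausible reading is that the per-region pressure change varies with the sine of the deflection angle; the sketch below inverts such a relation to recover an angle from a pressure change. The sine form and the p_max parameter are assumptions of this sketch, not details given in the patent text.

```python
import math

def angle_from_pressure(delta_p, p_max):
    """Map a pressure change to a deflection angle in degrees.

    Assumes the relation delta_p = p_max * sin(angle), one plausible
    reading of the "trigonometric function model" named in the text;
    p_max is the (assumed) pressure change at a 90-degree deflection.
    """
    ratio = max(-1.0, min(1.0, delta_p / p_max))  # clamp sensor noise
    return math.degrees(math.asin(ratio))
```

Under this assumed relation, half the maximum pressure change corresponds to a 30-degree deflection, and a negative pressure change yields a negative angle, matching the sign conventions for α and β above.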
In the foregoing solution, the identification unit is further configured to:
acquiring the current user's gesture change and decomposing the gesture into motion in the y-axis and x-axis directions;
acquiring a first pressure data change produced by the motion in the y-axis direction and a second pressure data change produced by the motion in the x-axis direction;
inputting the first and second pressure data changes into the first model, and calculating in real time, according to the first model, the change of the vertical offset angle α and the change of the horizontal offset angle β away from the coordinate origin caused by the current user's gesture change; and
recognizing the current user's gesture change from the changes of α and β.
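For illustration, once the changes of α and β have been computed from the first model, mapping them to a gesture could look like the following sketch; the labels and the 5-degree dead zone are illustrative assumptions, since the patent only requires that the gesture change be recognized from the two angle changes.

```python
def classify_gesture(d_alpha, d_beta, threshold=5.0):
    """Map the change of the vertical offset angle (d_alpha) and the
    horizontal offset angle (d_beta), both in degrees, to a coarse
    gesture label. The labels and the 5-degree dead zone are
    illustrative assumptions, not part of the patent text.
    """
    if abs(d_alpha) < threshold and abs(d_beta) < threshold:
        return "hold"                      # no significant movement
    if abs(d_alpha) >= abs(d_beta):        # vertical component dominates
        return "tilt up" if d_alpha > 0 else "tilt down"
    return "tilt right" if d_beta > 0 else "tilt left"
```

The dominant-axis comparison mirrors the decomposition of the gesture into separate y-axis and x-axis motions described above.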
The embodiment of the invention provides a method for identifying user gestures, which comprises the following steps:
collecting at least one piece of pressure data obtained while a first body attached to a first position of a terminal is in different motion states, the pressure data coming from pressure collection points arranged at the first position and a second position of the terminal;
partitioning the at least one piece of pressure data by corresponding region and, as the collected pressure data changes, obtaining the pressure change value and the pressure distribution rule for each region;
building a first model from the pressure change values and the pressure distribution rules of the regions; and
recognizing the current user's gesture change from the pressure changes simulated by the first model.
In the above scheme, the method further comprises:
establishing a coordinate system over the whole area corresponding to the first position of the terminal, taking the center of that area as the coordinate origin, and dividing the area into four regions by a horizontal x axis and a vertical y axis;
recording pressure data collected by pressure collection points distributed in the positive direction of the y axis as positive values; and
recording pressure data collected by pressure collection points distributed in the negative direction of the y axis as negative values.
In the above scheme, partitioning the at least one piece of pressure data by corresponding region includes:
partitioning the at least one piece of pressure data into the corresponding positions in the four regions according to the positions of the corresponding pressure collection points.
In the above scheme, obtaining the pressure change value and the pressure distribution rule for each region as the collected pressure data changes includes:
when the first body attached to the first position of the terminal moves in the y-axis direction, recording the angle of deviation in the vertical direction as α, where α is positive when the first body moves upward along the y axis and negative when it moves downward along the y axis;
when the first body attached to the first position of the terminal moves in the x-axis direction, recording the angle of deviation in the horizontal direction as β, where β is negative when the first body moves left along the x axis and positive when it moves right along the x axis; and
recording, as the pressure data changes, how the values of α and β change over the four regions, and obtaining the pressure distribution rule that these angle changes conform to a trigonometric function model.
In the above scheme, recognizing the current user's gesture change from the pressure changes simulated by the first model includes:
acquiring the current user's gesture change and decomposing the gesture into motion in the y-axis and x-axis directions;
acquiring a first pressure data change produced by the motion in the y-axis direction and a second pressure data change produced by the motion in the x-axis direction;
inputting the first and second pressure data changes into the first model, and calculating in real time, according to the first model, the change of the vertical offset angle α and the change of the horizontal offset angle β away from the coordinate origin caused by the current user's gesture change; and
recognizing the current user's gesture change from the changes of α and β.
The terminal of the embodiment of the invention includes: an acquisition unit configured to collect at least one piece of pressure data obtained while a first body attached to a first position of the terminal is in different motion states, the pressure data coming from pressure collection points arranged at the first position and a second position of the terminal; a data analysis unit configured to partition the at least one piece of pressure data by corresponding region and, as the collected pressure data changes, obtain the pressure change value and the pressure distribution rule for each region; a modeling unit configured to build a first model from the pressure change values and the pressure distribution rules of the regions; and a recognition unit configured to recognize the current user's gesture change from the pressure changes simulated by the first model.
With the embodiments of the invention, at least one piece of pressure data is collected, the pressure data is partitioned by corresponding region, the pressure change value and the pressure distribution rule for each region are obtained as the collected data changes, and a first model is generated from these per-region values and rules. Because the first model is derived from the pressure change values and the pressure distribution rules of the regions, subsequent gesture recognition based on the first model can accurately identify the current user's gesture change from the pressure changes the model simulates. Recognition is not limited by the size of the terminal screen: the current user's gesture change can be recognized accurately, and operation instructions generated from gesture changes enable a variety of convenient operations on the terminal. The approach is simple and easy to implement.
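Taken together, the collection, partitioning, modeling, and recognition steps summarized above can be sketched end to end as follows; the sine relation and the p_max value are illustrative assumptions of this sketch, not details of the patented implementation.

```python
import math

def axis_pressure(samples, axis):
    """Net signed pressure along one axis (0 = x, 1 = y): readings on the
    positive side of the axis count as positive, readings on the negative
    side as negative, following the sign convention established above."""
    return sum(p if pt[axis] >= 0 else -p
               for *pt, p in samples)

def recognize(before, after, p_max=10.0):
    """Sketch of the full method: compare two pressure snapshots, reduce
    each to net x- and y-direction pressure, and convert the changes into
    the vertical offset angle alpha and horizontal offset angle beta via
    an assumed sine relation. p_max and the sine model are illustrative
    assumptions."""
    def to_angle(delta_p):
        ratio = max(-1.0, min(1.0, delta_p / p_max))
        return math.degrees(math.asin(ratio))
    d_alpha = to_angle(axis_pressure(after, 1) - axis_pressure(before, 1))
    d_beta = to_angle(axis_pressure(after, 0) - axis_pressure(before, 0))
    return d_alpha, d_beta
```

Here each snapshot is a list of (x, y, pressure) samples from the pressure collection points; the returned pair (d_alpha, d_beta) would then be mapped to a concrete gesture.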
Drawings
Fig. 1 is a schematic hardware configuration diagram of an alternative mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a schematic diagram of pressure data zone partitioning according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a process flow of an embodiment of the present invention;
FIG. 5 is a schematic illustration of a further method flow according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating the hardware components of a terminal according to an embodiment of the invention;
FIG. 7 is a schematic diagram of a pressure distribution rule in an application scenario applying an embodiment of the present invention; and
Fig. 8-9 are schematic diagrams of mathematical models in application scenarios to which embodiments of the present invention are applied.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements merely facilitate the description of the embodiments of the present invention and have no specific meaning in themselves; thus, "module" and "component" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminals described in the embodiments of the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a navigation device, and the like, as well as fixed terminals such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobility, the configurations according to the embodiments of the present invention are also applicable to fixed terminals.
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, an acquisition unit 140, a data analysis unit 141, a modeling unit 142, a recognition unit 143, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signals broadcast by various types of broadcasting systems; in particular, it may receive digital broadcasts using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the MediaFLO (Forward Link Only) data broadcasting system, Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like.
The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access for the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technologies involved may include Wireless Local Area Network (WLAN, Wi-Fi), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include bluetooth, Radio Frequency Identification (RFID), infrared data Association (IrDA), Ultra Wideband (UWB), zigbee, and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal; a typical example is the Global Positioning System (GPS). With current technology, the GPS module calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating a three-dimensional current location by longitude, latitude, and altitude. Currently, the method for calculating position and time information uses three satellites and corrects errors in the calculated position and time by using a further satellite. In addition, the GPS module can calculate speed information by continuously calculating the current location in real time.
The A/V input unit 120 is used to receive an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture apparatus in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The acquisition unit 140 is configured to collect at least one piece of pressure data obtained while a first body attached to a first position of the terminal is in different motion states, the pressure data coming from pressure collection points arranged at the first position and a second position of the terminal; the data analysis unit 141 is configured to partition the at least one piece of pressure data by corresponding region and, as the collected pressure data changes, obtain the pressure change value and the pressure distribution rule for each region; the modeling unit 142 is configured to build a first model from the pressure change values and the pressure distribution rules of the regions; and the recognition unit 143 is configured to recognize the current user's gesture change from the pressure changes simulated by the first model.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification Module may store various information for authenticating a User using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The Display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light-Emitting Diode (OLED) Display, a flexible Display, a three-dimensional (3D) Display, and the like. Some of these displays may be configured to be transparent to allow a user to see from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a Transparent Organic Light Emitting Diode (TOLED) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The memory 160 may store software programs or the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, etc.) that has been output or is to be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The Memory 160 may include at least one type of storage medium including a flash Memory, a hard disk, a multimedia card, a card-type Memory (e.g., SD or DX Memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic Memory, a magnetic disk, an optical disk, etc. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to now, the mobile terminal has been described in terms of its functions. Hereinafter, for the sake of brevity, a slide-type mobile terminal will be described as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals. However, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to an embodiment of the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or by an antenna pointing in a particular direction radially away from the BS270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular spectrum (e.g., 1.25MHz, 5MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer collectively to a single BSC275 and at least one BS270. A base station may also be referred to as a "cell". Alternatively, each sector of a particular BS270 may be referred to as a cell site.
As shown in fig. 2, a Broadcast Transmitter (BT) 295 transmits a Broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The location information module 115, which is a GPS as shown in fig. 1, is generally configured to cooperate with satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above mobile terminal hardware structure and communication system, the present invention provides various embodiments of the method.
Example one:
A terminal according to an embodiment of the present invention includes: an acquisition unit, configured to collect at least one piece of pressure data obtained when a first body (such as a wrist) attached to a first position of the terminal is in different motion forms (including the up-down direction or the left-right direction), the pressure data being derived from pressure collection points arranged at the first position of the terminal (such as a mobile wristband) and at a second position (such as the back of the dial); a data analysis unit, configured to divide the at least one piece of pressure data according to the corresponding regions, and obtain, as the collected pressure data changes, the pressure change values in each region and the pressure distribution rules in each region; a modeling unit, configured to perform modeling according to the pressure change values in each region and the pressure distribution rules in each region to generate a first model (such as a trigonometric function model); and a recognition unit, configured to recognize the gesture change of the current user according to the pressure change simulated by the first model.
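The division of pressure data performed by the data analysis unit can be sketched in code: samples are grouped by the region their collection point belongs to, and a per-region change value is computed between two sampling instants. This is a minimal illustration under assumptions of my own (the region labels, the `(region, pressure)` sample format, and the use of a mean), not the patented implementation.

```python
# Hypothetical sketch of the data-analysis step: group pressure samples by
# region, then compute a per-region mean change between sampling instants.

def split_by_region(samples):
    """Group (region, pressure) samples into per-region lists."""
    regions = {}
    for region, pressure in samples:
        regions.setdefault(region, []).append(pressure)
    return regions

def region_change(prev, curr):
    """Mean pressure change per region between two sampling instants."""
    return {r: sum(curr[r]) / len(curr[r]) - sum(prev[r]) / len(prev[r])
            for r in curr if r in prev}

prev = split_by_region([("up", 1.0), ("up", 1.2), ("down", -1.1)])
curr = split_by_region([("up", 1.6), ("up", 1.8), ("down", -0.5)])
print({r: round(v, 3) for r, v in region_change(prev, curr).items()})
# {'up': 0.6, 'down': 0.6}
```

A real terminal would of course read these samples from the pressure sensors rather than literals, and the change values would feed the modeling unit described next.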
With this embodiment of the invention, pressure collection points are distributed over the mobile wristband and pressure data is collected through pressure sensors; at the same time, pressure sensors are distributed over the back of the user's dial, and the pressure data of the collection points at these positions is collected through the pressure sensors. By analyzing the pressure data, the pressure distribution rule of the pressure data in each region is obtained, and a mathematical model is constructed based on the pressure data and the pressure distribution rule. Because the first model is obtained from the pressure change values of the pressure data in each region and the pressure distribution rules in each region, when gesture recognition is subsequently performed based on the first model, the gesture change of the current user can be accurately recognized from the pressure change simulated by the first model. Gesture recognition is therefore not limited by the size of the terminal screen: the gesture change of the current user can be accurately recognized, and various convenient operations on the terminal can be realized according to the operation instruction generated by the gesture change, in a way that is simple and easy to implement. The basis for establishing the mathematical model (such as a trigonometric function model) is that pressure sensors are distributed over the mobile wristband and over the back of the user's dial. The advantage of this is that the pressure experienced across the whole wristband is sensed more sensitively, the pressure distribution is more balanced, and recognition of the user's gesture is easier.
Example two:
A terminal according to an embodiment of the present invention includes: an acquisition unit, configured to collect at least one piece of pressure data obtained when a first body (such as a wrist) attached to a first position of the terminal is in different motion forms (including the up-down direction or the left-right direction), the pressure data being derived from pressure collection points arranged at the first position of the terminal (such as a mobile wristband) and at a second position (such as the back of the dial); a coordinate system establishing unit, configured to establish a coordinate system over the whole area corresponding to the first position of the terminal (such as the mobile wristband), taking the center of the whole area as the coordinate origin and dividing the whole area into four regions with a horizontal axis x and a vertical axis y. For example, the wristband is laid flat, a coordinate system is established, and the pressure sensors are divided into four regions: up, down, left, and right, as shown in fig. 3. The terminal further includes: a first determining unit, configured to take the pressure data collected by the pressure collection points distributed in the positive direction of the y axis as positive values; a second determining unit, configured to take the pressure data collected by the pressure collection points distributed in the negative direction of the y axis as negative values (that is, the pressure sensors sense positive pressure in the positive y-axis direction and negative pressure in the negative y-axis direction); and a data analysis unit, configured to divide the at least one piece of pressure data according to the corresponding regions, and obtain, as the collected pressure data changes, the pressure change values in each region and the pressure distribution rules in each region.
The terminal further includes: a modeling unit, configured to perform modeling according to the pressure change values in each region and the pressure distribution rules in each region to generate a first model (such as a trigonometric function model); and a recognition unit, configured to recognize the gesture change of the current user according to the pressure change simulated by the first model.
With this embodiment of the invention, pressure collection points are distributed over the mobile wristband and pressure data is collected through pressure sensors; at the same time, pressure sensors are distributed over the back of the user's dial, and the pressure data of the collection points at these positions is collected through the pressure sensors. The pressure sensors are divided into four regions: up, down, left, and right. Pressure data collected by collection points distributed in the positive direction of the y axis is taken as positive, and pressure data collected by collection points distributed in the negative direction of the y axis is taken as negative. According to this principle, the pressure data is analyzed to obtain the pressure distribution rules of the four regions, and a mathematical model is constructed based on the pressure data and the pressure distribution rules. Because the first model is obtained from the pressure change values of the pressure data in each region and the pressure distribution rules in each region, when gesture recognition is subsequently performed based on the first model, the gesture change of the current user can be accurately recognized from the pressure change simulated by the first model. Gesture recognition is therefore not limited by the size of the terminal screen: the gesture change of the current user can be accurately recognized, and various convenient operations on the terminal can be realized according to the operation instruction generated by the gesture change, in a way that is simple and easy to implement.
The basis for establishing the mathematical model (such as a trigonometric function model) is that pressure sensors are distributed over the mobile wristband and over the back of the user's dial. The advantage of this is that the pressure experienced across the whole wristband is sensed more sensitively, the pressure distribution is more balanced, and recognition of the user's gesture is easier.
In an implementation manner of the embodiment of the present invention, the data analysis unit is further configured to divide the at least one piece of pressure data into the corresponding positions in the four regions according to the positions of the corresponding pressure collection points.
In an implementation manner of the embodiment of the present invention, the data analysis unit is further configured to: when the first body attached to the first position of the terminal moves in the y-axis direction, record the angle of offset in the up-down direction as α; when the first body attached to the first position of the terminal moves in the x-axis direction, record the angle of offset in the left-right direction as β; and record the changes of the angle values of α and β in the four regions along with the change of the pressure data, to obtain a pressure distribution rule in which the angle value changes conform to a trigonometric function model.
In practical application, when the wrist is laid flat, the pressure sensed by the pressure sensors is balanced. The angle of offset in the up-down direction is recorded as α, with upward positive and downward negative; α ranges from -90 degrees to 90 degrees. The angle of offset in the left-right direction is recorded as β, with leftward negative and rightward positive; β ranges from -90 degrees to 90 degrees. When the wrist moves vertically upward by 90 degrees, i.e., α is 90 degrees and β is 0 degrees, the pressure sensed in the upper region is the largest, while the pressures in the left and right regions are small and essentially symmetric, with the maximum pressure distributed directly above the wrist. The pressure in each region, as it varies with the offset angles, can be fitted with a mathematical model (such as a trigonometric function model), and the gesture change of the current user can then be accurately recognized according to the pressure change simulated by the model.
In an implementation manner of the embodiment of the present invention, the recognition unit is further configured to: obtain the change in the current user gesture and decompose the current user gesture into motion in the y-axis direction and motion in the x-axis direction; obtain a first pressure data change (such as a pressure amplitude) produced by the motion in the y-axis direction and a second pressure data change produced by the motion in the x-axis direction; input the first pressure data change and the second pressure data change into the first model; calculate in real time, according to the first model, the change in the angle value of the angle α by which the current user gesture is offset from the coordinate center in the up-down direction and the change in the angle value of the angle β by which it is offset in the left-right direction; and recognize the current user gesture change according to the changes in the angle values of α and β.
In practical application, according to the up-down change of the wrist and the pressure distribution rule, the pressure distribution y at a collection point varies with the offset angle according to a trigonometric function model: y = -k1·sin α. When the wrist changes left and right in the horizontal plane, the pressure distribution y likewise varies with the offset angle according to the trigonometric function model: y = -k2·sin β. Here k1 is the maximum pressure amplitude sensed for up-down changes, and k2 is the maximum pressure amplitude sensed for left-right changes.
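The trigonometric relations above — y = -k1·sin α for up-down movement and y = -k2·sin β for left-right movement — can be inverted to recover an offset angle from a measured pressure amplitude. A minimal sketch, where the calibration constants K1 and K2 are illustrative values of my own, not values from the patent:

```python
import math

K1 = 4.0  # max amplitude sensed for up-down movement (assumed calibration value)
K2 = 3.0  # max amplitude sensed for left-right movement (assumed)

def angle_from_pressure(y, k):
    """Invert y = -k*sin(angle); clamp the ratio for sensor noise, return degrees."""
    s = max(-1.0, min(1.0, -y / k))
    return math.degrees(math.asin(s))

# A wrist raised vertically (alpha = 90 degrees) gives y = -K1*sin(90) = -K1,
# while a level wrist gives y = 0 on the left-right axis:
alpha = angle_from_pressure(-K1, K1)   # approximately 90.0
beta = angle_from_pressure(0.0, K2)    # approximately 0.0
print(alpha, beta)
```

The clamp matters in practice: a noisy reading slightly larger than k would otherwise push the ratio outside the domain of `asin`.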
Example three:
As shown in fig. 4, a method for recognizing a user gesture according to an embodiment of the present invention includes:
Step 101, collecting at least one piece of pressure data obtained when a first body attached to a first position of the terminal is in different motion forms, wherein the pressure data is derived from pressure collection points arranged at the first position and a second position of the terminal.
Here, at least one piece of pressure data may be collected when the first body (such as a wrist) attached to the first position of the terminal is in different motion forms (including the up-down or left-right direction); the pressure data is derived from pressure collection points arranged at the first position (such as a mobile wristband) and the second position (such as the back of the dial) of the terminal.
Step 102, dividing the at least one piece of pressure data according to the corresponding regions, and obtaining, as the collected pressure data changes, the pressure change values in each region and the pressure distribution rules in each region.
Step 103, modeling according to the pressure change values in each region and the pressure distribution rules in each region to generate a first model.
Step 104, recognizing the gesture change of the current user according to the pressure change simulated by the first model.
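The four steps above can be sketched as a minimal pipeline. Every function body here is an illustrative placeholder — the sensor values, the calibration constants, and the angle thresholds are assumptions of my own, not the patented implementation:

```python
import math

def collect_pressure():
    # Step 101: read the collection points (illustrative readings, not real data)
    return {"up": -3.0, "down": 0.5, "left": 0.1, "right": -0.1}

def analyze(samples):
    # Step 102: reduce per-region readings to a net change per axis
    return {"y": samples["up"] + samples["down"],
            "x": samples["right"] + samples["left"]}

def build_model(k1=4.0, k2=4.0):
    # Step 103: trigonometric model y = -k*sin(angle), inverted to give angles
    def angles(changes):
        def inv(y, k):
            return math.degrees(math.asin(max(-1.0, min(1.0, -y / k))))
        return inv(changes["y"], k1), inv(changes["x"], k2)
    return angles

def recognize(model, changes):
    # Step 104: map the simulated offset angles to a gesture label
    alpha, beta = model(changes)
    if abs(beta) > 30:
        return "wrist tilted sideways"
    return "wrist raised" if alpha > 30 else "wrist level"

print(recognize(build_model(), analyze(collect_pressure())))
```

With the sample readings above, the net y-axis change of -2.5 inverts to roughly a 39-degree upward offset, so the pipeline labels the gesture as a raised wrist.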
With this embodiment of the invention, at least one piece of pressure data is collected when a first body (such as a wrist) attached to a first position of the terminal is in different motion forms (including the up-down or left-right direction), the pressure data being derived from pressure collection points arranged at the first position (such as a mobile wristband) and the second position (such as the back of the dial). Through steps 102 to 104, the pressure change values in each region and the pressure distribution rules in each region are obtained as the collected pressure data changes, and modeling is then performed according to these change values and distribution rules to generate a first model (such as a trigonometric function model) for recognizing gestures. Specifically, pressure collection points cover the mobile wristband and pressure data is collected through pressure sensors; the back of the user's dial is likewise covered with pressure sensors, which collect the pressure data of the collection points at these positions. By analyzing the pressure data, the pressure distribution rule of the pressure data in each region is obtained, and a mathematical model is constructed based on the pressure data and the pressure distribution rule. Because the first model is obtained from the pressure change values of the pressure data in each region and the pressure distribution rules in each region, when gesture recognition is subsequently performed based on the first model, the gesture change of the current user can be accurately recognized from the pressure change simulated by the first model.
Gesture recognition is therefore not limited by the size of the terminal screen: the gesture change of the current user can be accurately recognized, and various convenient operations on the terminal can be realized according to the operation instruction generated by the gesture change, in a way that is simple and easy to implement. The basis for establishing the mathematical model (such as a trigonometric function model) is that pressure sensors are distributed over the mobile wristband and over the back of the user's dial. The advantage of this is that the pressure experienced across the whole wristband is sensed more sensitively, the pressure distribution is more balanced, and recognition of the user's gesture is easier.
Example four:
As shown in fig. 5, a method for recognizing a user gesture according to an embodiment of the present invention includes:
Step 201, collecting at least one piece of pressure data obtained when a first body attached to a first position of the terminal is in different motion forms, wherein the pressure data is derived from pressure collection points arranged at the first position and a second position of the terminal.
Here, at least one piece of pressure data may be collected when the first body (such as a wrist) attached to the first position of the terminal is in different motion forms (including the up-down or left-right direction); the pressure data is derived from pressure collection points arranged at the first position (such as a mobile wristband) and the second position (such as the back of the dial) of the terminal.
Step 202, establishing a coordinate system over the whole area corresponding to the first position of the terminal, and dividing the whole area into four regions, taking the center of the whole area as the coordinate origin with a horizontal axis x and a vertical axis y.
Here, as shown in fig. 3, a coordinate system is established over the whole area corresponding to the first position of the terminal (such as the mobile wristband), and the whole area is divided into four regions, taking the center of the whole area as the coordinate origin with the horizontal axis x and the vertical axis y. For example, the wristband is laid flat, a coordinate system is established, and the pressure sensors are divided into four regions: up, down, left, and right.
Step 203, taking the pressure data collected by the pressure collection points distributed in the positive direction of the y axis as positive values.
Step 204, taking the pressure data collected by the pressure collection points distributed in the negative direction of the y axis as negative values.
Step 205, dividing the at least one piece of pressure data into the corresponding positions in the four regions according to the positions of the corresponding pressure collection points, and obtaining, as the collected pressure data changes, the pressure change values in each region and the pressure distribution rules in each region.
Step 206, modeling according to the pressure change values in each region and the pressure distribution rules in each region to generate a first model.
Step 207, recognizing the gesture change of the current user according to the pressure change simulated by the first model.
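Steps 202 through 205 can be sketched as follows: each collection point is placed in the wristband's coordinate system, assigned to one of the four regions, and its pressure is signed according to its y direction. The point layout and the tie-breaking rule at the axes are assumptions for illustration, not part of the patent:

```python
# Sketch of the coordinate-system steps: classify each collection point into
# one of the four regions and sign its pressure by the y direction.

def classify_point(x, y):
    """Return the region (up/down/left/right) for a collection point at (x, y)."""
    if abs(y) >= abs(x):               # closer to the vertical axis: up or down
        return "up" if y >= 0 else "down"
    return "right" if x > 0 else "left"

def signed_pressure(y, pressure):
    """Pressure from points in the +y direction is positive, -y is negative."""
    return abs(pressure) if y >= 0 else -abs(pressure)

# Illustrative (x, y, raw pressure) triples for three collection points:
points = [(0.0, 1.0, 2.2), (0.0, -1.0, 2.0), (1.0, 0.1, 1.5)]
for x, y, p in points:
    print(classify_point(x, y), signed_pressure(y, p))
```

Running this prints `up 2.2`, `down -2.0`, and `right 1.5`, matching the sign convention of steps 203 and 204.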
With this embodiment of the invention, pressure collection points are distributed over the mobile wristband and pressure data is collected through pressure sensors; at the same time, pressure sensors are distributed over the back of the user's dial, and the pressure data of the collection points at these positions is collected through the pressure sensors. The pressure sensors are divided into four regions: up, down, left, and right. Pressure data collected by collection points distributed in the positive direction of the y axis is taken as positive, and pressure data collected by collection points distributed in the negative direction of the y axis is taken as negative. According to this principle, the pressure data is analyzed to obtain the pressure distribution rules of the four regions, and a mathematical model is constructed based on the pressure data and the pressure distribution rules. Because the first model is obtained from the pressure change values of the pressure data in each region and the pressure distribution rules in each region, when gesture recognition is subsequently performed based on the first model, the gesture change of the current user can be accurately recognized from the pressure change simulated by the first model. Gesture recognition is therefore not limited by the size of the terminal screen: the gesture change of the current user can be accurately recognized, and various convenient operations on the terminal can be realized according to the operation instruction generated by the gesture change, in a way that is simple and easy to implement.
The basis for establishing the mathematical model (such as a trigonometric function model) is that pressure sensors are distributed over the mobile wristband and over the back of the user's dial. The advantage of this is that the pressure experienced across the whole wristband is sensed more sensitively, the pressure distribution is more balanced, and recognition of the user's gesture is easier.
In an embodiment of the present invention, obtaining the pressure change values in each region and the pressure distribution rules in each region along with the change of the collected pressure data includes: when the first body attached to the first position of the terminal moves in the y-axis direction, recording the angle of offset in the up-down direction as α; when the first body attached to the first position of the terminal moves in the x-axis direction, recording the angle of offset in the left-right direction as β; and recording the changes of the angle values of α and β in the four regions along with the change of the pressure data, to obtain a pressure distribution rule in which the angle value changes conform to a trigonometric function model.
In practical application, when the wrist is laid flat, the pressure sensed by the pressure sensors is balanced. The angle of offset in the up-down direction is recorded as α, with upward positive and downward negative; α ranges from -90 degrees to 90 degrees. The angle of offset in the left-right direction is recorded as β, with leftward negative and rightward positive; β ranges from -90 degrees to 90 degrees. When the wrist moves vertically upward by 90 degrees, i.e., α is 90 degrees and β is 0 degrees, the pressure sensed in the upper region is the largest, while the pressures in the left and right regions are small and essentially symmetric, with the maximum pressure distributed directly above the wrist. The pressure in each region, as it varies with the offset angles, can be fitted with a mathematical model (such as a trigonometric function model), and the gesture change of the current user can then be accurately recognized according to the pressure change simulated by the model.
In an embodiment of the present invention, recognizing the gesture change of the current user according to the pressure change simulated by the first model includes: obtaining the change in the current user gesture and decomposing the current user gesture into motion in the y-axis direction and motion in the x-axis direction; obtaining a first pressure data change (such as a pressure amplitude) produced by the motion in the y-axis direction and a second pressure data change produced by the motion in the x-axis direction; inputting the first pressure data change and the second pressure data change into the first model; calculating in real time, according to the first model, the change in the angle value of the angle α by which the current user gesture is offset from the coordinate center in the up-down direction and the change in the angle value of the angle β by which it is offset in the left-right direction; and recognizing the current user gesture change according to the changes in the angle values of α and β.
In practical application, according to the up-down change of the wrist and the pressure distribution rule, the pressure distribution y at a collection point varies with the offset angle according to a trigonometric function model: y = -k1·sin α. When the wrist changes left and right in the horizontal plane, the pressure distribution y likewise varies with the offset angle according to the trigonometric function model: y = -k2·sin β. Here k1 is the maximum pressure amplitude sensed for up-down changes, and k2 is the maximum pressure amplitude sensed for left-right changes.
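Since k1 and k2 are defined as the maximum sensed pressure amplitudes, they could in principle be calibrated from readings taken at a full 90-degree offset, where |y| = k·sin 90° = k. A hypothetical sketch with illustrative sample values (the averaging step and the numbers are assumptions, not from the patent):

```python
import math

def calibrate_k(samples_at_90deg):
    """Estimate k from amplitude readings taken at a full 90-degree offset,
    where |y| = k*sin(90) = k; average several readings to smooth noise."""
    return sum(abs(s) for s in samples_at_90deg) / len(samples_at_90deg)

def predict_pressure(angle_deg, k):
    """Forward model from the text: y = -k * sin(angle)."""
    return -k * math.sin(math.radians(angle_deg))

# Illustrative vertical readings taken with the wrist raised 90 degrees:
k1 = calibrate_k([3.9, 4.1, 4.0])
print(round(k1, 3), round(predict_pressure(90, k1), 3))
```

A per-user calibration of this kind would also absorb differences in wristband tightness, which otherwise shift the absolute pressure levels.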
Example five:
The terminal of the embodiment of the present invention includes a memory and a processor. The memory contains computer-executable code for implementing the method for recognizing a user gesture of the above embodiments, and the processor implements the user gesture recognition scheme by executing the computer-executable code contained in the memory.
The terminal may be an electronic device such as a PC, a tablet computer, a portable electronic device such as a laptop computer, or an intelligent mobile terminal such as a mobile phone, and is not limited to the examples described herein. The processor may be implemented by a microprocessor, a Central Processing Unit (CPU), a DSP, or an FPGA.
The terminal is shown in fig. 6 as an example of a hardware entity. The apparatus comprises a processor 31, a memory 32 and at least one external communication interface 33; the processor 31, the memory 32 and the at least one external communication interface 33 are all connected by a bus 34.
The following describes embodiments of the present invention through specific application scenarios.
Wearable electronic devices such as smart watches and smart bracelets are now in widespread use, and a user can use a wearable device to make calls, surf the internet, navigate, and so on, at any time and in any place. However, the screen size of wearable devices represented by smart watches or bracelets is limited and cannot meet the user's need for quick and convenient operations such as answering a call, making an emergency call, or viewing information. A scheme is therefore needed that can quickly and accurately sense changes in the user's gesture, so as to enable more convenient interaction based on those changes and to make it easy to extend user-defined gestures for convenient processing operations.
For this application scenario, the embodiment of the invention provides a design scheme for recognizing the user gesture based on a trigonometric function model sensing the change of the pressure angle. The scheme is described in detail below.
1. A coordinate system needs to be established:
The basis of the model is that the wristband is covered with pressure sensors, and the back of the user's dial is likewise covered with pressure sensors. The advantage of this is that the pressure sensed over the whole wristband is more sensitive and the pressure distribution is more balanced, which facilitates recognition of the user's gesture. The wristband is laid flat and a coordinate system is established as shown in fig. 3.
2. The pressure sensors are divided into four areas: an upper area, a lower area, a left area, and a right area, each indicated by a shaded filled region as shown in fig. 3.
3. In the coordinate system shown in fig. 3, the pressure sensed by the pressure sensor in the positive y-axis direction is positive, and the pressure sensed in the negative y-axis direction is negative.
4. When the wrist is laid flat, the pressure sensed by the pressure sensors is balanced. The angle of deviation in the vertical direction is recorded as α, upward being positive and downward negative; α takes a value between -90° and 90°. The angle of deviation in the horizontal direction is recorded as β, leftward being negative and rightward positive; β takes a value between -90° and 90°.
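The four-area division and the sign convention of points 2-4 can be sketched as follows. The tie-breaking rule for points lying exactly on a diagonal between areas is not specified in the text, so the rule used here (resolving toward the vertical areas) is an assumption.

```python
def region_of(x, y):
    """Assign a pressure acquisition point (x, y) in the wristband
    coordinate system to one of the four areas of fig. 3."""
    if abs(y) >= abs(x):          # closer to the vertical axis
        return "upper" if y >= 0 else "lower"
    return "left" if x < 0 else "right"

def signed_pressure(y, magnitude):
    """Apply the sign convention of point 3: pressure sensed in the
    positive y-axis direction is positive, in the negative direction
    negative."""
    return magnitude if y >= 0 else -magnitude
```

With this partition, each incoming pressure sample can be bucketed by its acquisition point before the per-area change values and distribution rules are computed.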
5. Description of the principle:
When the wrist moves vertically upward by 90 degrees, i.e. α = 90° and β = 0°, the pressure values sensed in the upper and lower areas are the largest and those in the left and right areas are small. The pressure distribution is as shown in fig. 7, which is a diagram of the pressure distribution rule of the wrist in the vertical direction: the pressure in each area is distributed essentially symmetrically, and the pressure sensed in the upper and lower areas is the largest. The pressure in each area is simulated by a mathematical model, as shown in fig. 8; a schematic diagram of the corresponding downward model, derived from the pressure analysis of the wrist in the vertical direction, is shown in fig. 9. The analysis of figs. 7-9 shows that the pressure distribution y at the point x2 varies with the angle according to a trigonometric function model, y = -k1·sin α; when the wrist moves horizontally from left to right, the pressure distribution likewise varies with the angle according to the trigonometric function model, y = -k2·sin β, where k1 is the maximum pressure amplitude sensed for up-and-down motion and k2 is the maximum pressure amplitude sensed for left-and-right motion. The change of the current user's gesture is then recognized according to the change of the angle values of α and β, and the corresponding processing is carried out.
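The forward direction of the model described above (angles in, model pressures out) can be written directly from the two formulas. This is a minimal sketch under the stated model only; the function name and default amplitudes are assumptions.

```python
import math

def simulate_pressure(alpha_deg, beta_deg, k1=1.0, k2=1.0):
    """Forward trigonometric model from the analysis of figs. 7-9:
    given the wrist offset angles alpha (up-down) and beta (left-right),
    return the model pressures y = -k1*sin(alpha) and y = -k2*sin(beta)."""
    a = math.radians(alpha_deg)
    b = math.radians(beta_deg)
    return -k1 * math.sin(a), -k2 * math.sin(b)
```

At α = 90°, β = 0° (wrist fully up), the vertical-motion pressure reaches its extreme value -k1 while the horizontal-motion pressure is zero, matching the distribution described for fig. 7.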
In this application scenario, by adopting the embodiment of the invention, the pressure change at the pressure sensors can be simulated by a mathematical model (such as a trigonometric function model), so that the current gesture is recognized in real time and the corresponding processing is then carried out. More gestures can be defined by the system and the user, the accuracy of gesture recognition is improved, and a richer interaction mode is thereby provided.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention. The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. A terminal for recognizing a gesture of a user, the terminal comprising:
the acquisition unit is used for acquiring at least one piece of pressure data obtained when a first body attached to a first position of the terminal is in different motion states, wherein the pressure data comes from pressure acquisition points arranged on the first position and a second position of the terminal;
the data analysis unit is used for dividing the at least one pressure data according to each corresponding area, and obtaining pressure change values on each area and pressure distribution rules on each area along with the change of the acquired pressure data;
the modeling unit is used for modeling according to the pressure change values on the areas and the pressure distribution rules on the areas to generate a first model;
the recognition unit is used for recognizing the gesture change of the current user according to the first model simulation pressure change;
the coordinate system establishing unit is used for establishing a coordinate system over the whole area corresponding to the first position of the terminal, taking the center of the whole area as the coordinate center and dividing the whole area into four areas by a horizontal axis x and a vertical axis y;
the first determining unit is used for determining the pressure data acquired in the positive direction of the y axis by the distribution of the pressure acquisition points as a positive value;
the second determining unit is used for determining the pressure data acquired in the negative direction of the y axis by the distribution of the pressure acquisition points as a negative value;
the data analysis unit is further used for recording the angle of deviation in the vertical direction as α when the first body attached to the first position of the terminal moves in the y-axis direction, recording the angle of deviation in the left-right direction as β when the first body attached to the first position of the terminal moves in the x-axis direction, and recording the change of the angle values of α and β in the four areas along with the change of the pressure data, the change of the angle values conforming to the pressure distribution rule of a trigonometric function model.
2. The terminal of claim 1, wherein the data analysis unit is further configured to:
and dividing the at least one pressure data into corresponding positions in the four regions according to the positions of the corresponding pressure acquisition points.
3. The terminal of claim 1, wherein the identification unit is further configured to:
acquiring the gesture change of a current user, and decomposing the gesture of the current user into motion in the y-axis direction and the x-axis direction;
acquiring a first pressure data change obtained by moving in the y-axis direction and a second pressure data change obtained by moving in the x-axis direction;
inputting the first pressure data change and the second pressure data change into the first model, and calculating in real time according to the first model the change of the angle value of the vertical-direction offset angle α and the change of the angle value of the horizontal-direction offset angle β, deviating from the coordinate center, caused by the current user gesture change;
and recognizing the current gesture change of the user according to the change of the angle value of α and the change of the angle value of β.
4. A method of recognizing a gesture of a user, the method comprising:
acquiring at least one piece of pressure data obtained when a first body attached to a first position of a terminal is in different motion states, wherein the pressure data is from pressure acquisition points arranged on the first position and a second position of the terminal;
dividing the at least one pressure data according to each corresponding area, and obtaining pressure change values on each area and pressure distribution rules on each area along with the change of the collected pressure data;
modeling according to the pressure change values on the areas and the pressure distribution rules on the areas to generate a first model;
identifying a current user gesture change according to the first model simulation pressure change;
establishing a coordinate system by using the whole area corresponding to the first position of the terminal, taking the center of the whole area as a coordinate center, and dividing the whole area into four areas by using a horizontal axis x and a vertical axis y; the pressure data acquired by the pressure acquisition points distributed in the positive direction of the y axis is a positive value; the pressure data acquired in the negative direction of the y axis by the pressure acquisition point distribution is a negative value;
recording the angle of deviation in the vertical direction as α when a first body attached to a first position of the terminal moves in the y-axis direction, recording the angle of deviation in the left-right direction as β when the first body attached to the first position of the terminal moves in the x-axis direction, and recording the change of the angle values of α and β in the four areas along with the change of the pressure data, the change of the angle values conforming to the pressure distribution rule of a trigonometric function model.
5. The method of claim 4, wherein dividing the at least one pressure data by its corresponding respective region comprises:
and dividing the at least one pressure data into corresponding positions in the four regions according to the positions of the corresponding pressure acquisition points.
6. The method of claim 4, wherein identifying a current user gesture change based on the first model simulating a pressure change comprises:
acquiring the gesture change of a current user, and decomposing the gesture of the current user into motion in the y-axis direction and the x-axis direction;
acquiring a first pressure data change obtained by moving in the y-axis direction and a second pressure data change obtained by moving in the x-axis direction;
inputting the first pressure data change and the second pressure data change into the first model, and calculating in real time according to the first model the change of the angle value of the vertical-direction offset angle α and the change of the angle value of the horizontal-direction offset angle β, deviating from the coordinate center, caused by the current user gesture change;
and recognizing the current gesture change of the user according to the change of the angle value of α and the change of the angle value of β.
CN201610882626.9A 2016-10-08 2016-10-08 Terminal and method for recognizing user gesture Active CN106354415B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610882626.9A CN106354415B (en) 2016-10-08 2016-10-08 Terminal and method for recognizing user gesture

Publications (2)

Publication Number Publication Date
CN106354415A CN106354415A (en) 2017-01-25
CN106354415B true CN106354415B (en) 2020-05-26

Family

ID=57865764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610882626.9A Active CN106354415B (en) 2016-10-08 2016-10-08 Terminal and method for recognizing user gesture

Country Status (1)

Country Link
CN (1) CN106354415B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915111A (en) * 2012-04-06 2013-02-06 寇传阳 Wrist gesture control system and method
CN103631368A (en) * 2012-08-27 2014-03-12 联想(北京)有限公司 Detection device, detection method and electronic equipment
CN103984416A (en) * 2014-06-10 2014-08-13 北京邮电大学 Gesture recognition method based on acceleration sensor
CN104778746A (en) * 2015-03-16 2015-07-15 浙江大学 Method for performing accurate three-dimensional modeling based on data glove by using natural gestures
CN105022471A (en) * 2014-04-23 2015-11-04 王建勤 Device and method for carrying out gesture recognition based on pressure sensor array
CN105511624A (en) * 2015-12-15 2016-04-20 上海斐讯数据通信技术有限公司 Wearable wireless input system, equipment and method
CN105892881A (en) * 2015-12-11 2016-08-24 乐视移动智能信息技术(北京)有限公司 Human-computer interaction method and device, and mobile equipment

Similar Documents

Publication Publication Date Title
CN104660913B (en) Focus adjustment method and apparatus
CN104750420B (en) Screenshotss method and device
CN105468158B (en) Color adjustment method and mobile terminal
US10587747B2 (en) Method, apparatus, terminal, and storage medium for entering numeric symbols using touch screen frame
CN104767889B (en) Screen state control method and device
CN110928708B (en) Icon display method and device, electronic equipment and computer readable storage medium
CN106897018B (en) Gesture operation method and device and mobile terminal
CN104777982B (en) Method and device for switching terminal input method
CN106598538B (en) Instruction set updating method and system
CN106372264B (en) Map data migration device and method
CN105763847A (en) Monitoring method and monitoring terminal
CN106066766A (en) A kind of mobile terminal and control method thereof
CN105975753A (en) Method for calculating moving speed of user and mobile terminal
CN106648105B (en) Antenna switching method and device for intelligent equipment
CN106851114B (en) Photo display device, photo generation device, photo display method, photo generation method and terminal
CN106371682A (en) Gesture recognition system based on proximity sensor and method thereof
CN106648324B (en) Hidden icon control method and device and terminal
CN105227771B (en) Picture transmission method and device
CN105302441A (en) Screen size adjustment method and terminal device
CN105159571A (en) Page slide control method and mobile terminal
CN106778167B (en) Fingerprint identification device and method
CN106569670B (en) Device and method for processing application
CN104777959A (en) Cursor positioning method and cursor positioning device for mobile terminal
CN107168612A (en) A kind of image acquisition method and terminal
CN106125946A (en) A kind of control method, mobile terminal and helmet

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200414

Address after: 325000 urban and rural distribution center, Yunzhou street, Ruian City, Wenzhou City, Zhejiang Province 523 (Qiaomao town)

Applicant after: Ruian brilliant Network Technology Co., Ltd

Address before: 518000 Guangdong Province, Shenzhen high tech Zone of Nanshan District City, No. 9018 North Central Avenue's innovation building A, 6-8 layer, 10-11 layer, B layer, C District 6-10 District 6 floor

Applicant before: NUBIA TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant