CN109933192A - Air gesture implementation method, terminal, and computer-readable storage medium - Google Patents

Air gesture implementation method, terminal, and computer-readable storage medium Download PDF

Info

Publication number
CN109933192A
CN109933192A CN201910138586.0A
Authority
CN
China
Prior art keywords
event
air
value
action
physical button
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910138586.0A
Other languages
Chinese (zh)
Inventor
张欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201910138586.0A priority Critical patent/CN109933192A/en
Publication of CN109933192A publication Critical patent/CN109933192A/en
Pending legal-status Critical Current

Links

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

This application relates to an air gesture implementation method, a terminal, and a computer-readable storage medium. The method includes: simulating the pulse data captured by a preset light sensor as a physical key value; judging, according to the simulated physical key value, whether the event corresponding to the pulse data is an air gesture; and, if the judgment is affirmative, simulating the event corresponding to the pulse data as a touch-screen operation. By registering the sensor and recognizing the gesture, the application obtains the pulses reported by the light sensor and, after receiving a pulse, converts it into a simulated finger touch-screen operation, thereby implementing air gestures simply and efficiently.

Description

Air gesture implementation method, terminal, and computer-readable storage medium
Technical field
This application relates to the field of terminal devices, and in particular to an air gesture implementation method, a terminal, and a computer-readable storage medium.
Background art
With the development of touch-screen technology, terminals with touch screens have become widespread. In some scenarios, however, such as when the user is holding something in one hand or eating, operating the terminal through the touch screen gives a poor experience. Air gestures, by contrast, allow the terminal to be controlled by swiping up, down, left, or right above it without contact, which effectively improves the user experience in these scenarios.
An air gesture is a gesture motion that operates the terminal's functions without touching the terminal screen. The prior art has not yet proposed a scheme that implements air gestures effectively.
Summary of the invention
In order to solve, or at least partially solve, the above technical problem, this application provides an air gesture implementation method, a terminal, and a computer-readable storage medium.
In a first aspect, this application provides an air gesture implementation method, the method comprising:
simulating the pulse data captured by a preset light sensor as a physical key value;
judging, according to the simulated physical key value, whether the event corresponding to the pulse data is an air gesture;
if the judgment is affirmative, simulating the event corresponding to the pulse data as a touch-screen operation.
Optionally, simulating the pulse data captured by the preset light sensor as a physical key value comprises:
capturing the pulse data with the light sensor;
simulating the pulse data as a physical key event;
obtaining the simulated physical key value corresponding to the simulated physical key event.
Optionally, simulating the pulse data as a physical key event comprises:
in the key-event dispatch function of the root view of the framework layer of a preset operating system, simulating the pulse data as a physical key event.
Optionally, before simulating the pulse data captured by the preset light sensor as a physical key value, the method comprises:
registering the light sensor according to the preset physical key value corresponding to the air gesture.
Optionally, simulating the event corresponding to the pulse data as a touch-screen operation comprises:
obtaining the touch event information corresponding to the event corresponding to the pulse data;
executing the touch event information through the touch-event dispatch function of the operating system, so as to simulate the event corresponding to the pulse data as a touch-screen operation.
Optionally, the touch event information includes at least an ACTION_DOWN event, an ACTION_MOVE event, and an ACTION_UP event;
and executing the touch events through the touch-event dispatch function of the operating system comprises:
executing the ACTION_DOWN event, the ACTION_MOVE events, and the ACTION_UP event through the touch-event dispatch function of the operating system, according to the preset starting coordinate corresponding to the ACTION_DOWN event and the preset movement parameters corresponding to the ACTION_MOVE events; the movement parameters include the number of moves, the time interval between successive moves, and the relationship between the distances of successive moves.
Optionally, when executing the ACTION_MOVE events through the touch-event dispatch function of the operating system, the method further comprises:
obtaining the value assigned to a preset physical key flag;
when the flag is assigned a first value, executing the ACTION_MOVE events through the touch-event dispatch function of the operating system;
when the flag is assigned a second value, stopping executing the ACTION_MOVE events through the touch-event dispatch function of the operating system; the first value identifies the current touch operation as an air gesture, and the second value identifies the current touch operation as not an air gesture.
Optionally, the method further comprises:
when the judgment is affirmative, assigning the first value to the preset physical key flag;
after simulating the event corresponding to the pulse data as a touch-screen operation, assigning the second value to the physical key flag.
In a second aspect, this application provides a terminal, the terminal comprising:
a memory, a processor, and a computer program stored in the memory and executable on the processor;
the steps of any of the methods described above being implemented when the computer program is executed by the processor.
In a third aspect, this application provides a computer-readable storage medium on which an air gesture implementation program is stored; when the air gesture implementation program is executed by a processor, the steps of any of the air gesture implementation methods described above are implemented.
Compared with the prior art, the technical solutions provided by the embodiments of this application have the following advantages:
the air gesture implementation method, terminal, and computer-readable storage medium provided by the embodiments of this application register a sensor and recognize the gesture to obtain the pulses reported by the light sensor and, after a pulse is received, convert it into a simulated finger touch-screen operation, thereby implementing air gestures simply and efficiently.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the description, serve to explain the principles of the invention.
In order to explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; obviously, for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of a wearable device provided by the embodiments of this application;
Fig. 2 is another schematic diagram of the wearable device provided by the embodiments of this application;
Fig. 3 is a schematic diagram of the hardware structure of the wearable device of the embodiments of this application;
Fig. 4 is a flowchart of the air gesture implementation method provided by the embodiments of this application;
Fig. 5 is a schematic diagram of the effect achieved by the air gesture implementation provided by the embodiments of this application.
Specific embodiments
It should be understood that the specific embodiments described here are only intended to illustrate the invention and not to limit it.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are only intended to facilitate the description of the invention and have no specific meaning by themselves; therefore, "module", "component", and "unit" may be used interchangeably.
Terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as wearable devices, smart bracelets, mobile phones, tablet computers, laptops, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, and pedometers. With the continuous development of screen technology and the appearance of screen forms such as flexible and foldable screens, mobile terminals can also be used as wearable devices.
The following description takes a mobile terminal as an example. Those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the constructions according to the embodiments of the present invention can also be applied to fixed-type terminals.
Referring to Figs. 1 and 2, structural schematic diagrams of the wearable device provided by the embodiments of this application, which show wearable device 100, and to Fig. 3, a schematic diagram of the hardware structure of a wearable device implementing the embodiments of the invention: the wearable device 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will understand that the wearable device structure shown in Fig. 3 does not limit the wearable device; the wearable device may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
The components of the wearable device are described below with reference to Fig. 3:
The radio frequency unit 101 may be used for receiving and sending signals during messaging or calls; specifically, downlink information from a base station is received and passed to the processor 110 for processing, and uplink data is sent to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 101 can communicate with the network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the wearable device can help the user send and receive email, browse web pages, and access streaming video; it provides the user with wireless broadband Internet access. Although Fig. 3 shows the WiFi module 102, it is understood that it is not an essential part of the wearable device and can be omitted as needed without changing the essence of the invention.
When the wearable device 100 is in a mode such as a call-signal reception mode, a call mode, a recording mode, a speech recognition mode, or a broadcast reception mode, the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. The audio output unit 103 can also provide audio output related to specific functions performed by the wearable device 100 (for example, a call-signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or other storage medium), or sent via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operational modes such as a phone call mode, a recording mode, or a speech recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The wearable device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the wearable device 100 is moved to the ear. As a motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (usually three axes) and the magnitude and direction of gravity when static; it can be used for applications that recognize the posture of the phone (such as landscape/portrait switching, related games, or magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The phone can also be equipped with other sensors such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which are not described here.
The display unit 106 is used to display information entered by the user or provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 may be used to receive entered numeric or character information and to generate key-signal inputs related to user settings and function control of the wearable device. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connection device according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal produced by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. The touch panel 1071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, without limitation here.
Further, the touch panel 1071 may cover the display panel 1061. After detecting a touch operation on or near it, the touch panel 1071 transmits it to the processor 110 to determine the type of touch event; the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of touch event. Although in Fig. 3 the touch panel 1071 and the display panel 1061 implement the input and output functions of the wearable device as two independent components, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the wearable device, without limitation here.
The interface unit 108 serves as an interface through which at least one external device can be connected to the wearable device 100. For example, the external device may include a wired or wireless headphone port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The interface unit 108 may be used to receive input from an external device (for example, data information or power) and transmit the received input to one or more elements within the wearable device 100, or may be used to transmit data between the wearable device 100 and the external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required by at least one function (such as a sound playback function and an image playback function); the data storage area may store data created according to the use of the phone (such as audio data and a phone book). In addition, the memory 109 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
The processor 110 is the control center of the wearable device. It connects the various parts of the entire wearable device using various interfaces and lines, and performs the various functions of the wearable device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the wearable device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, and application programs, and the modem processor mainly handles wireless communication. It is understood that the modem processor may also not be integrated into the processor 110.
The wearable device 100 may also include a power supply 111 (such as a battery) that supplies power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging, discharging, and power-consumption management are realized through the power management system.
Although not shown in Fig. 3, the wearable device 100 may also include a Bluetooth module and the like, which are not described here.
Based on the above hardware structure of the wearable mobile terminal, the embodiments of the present invention are presented below.
Embodiment One
The embodiment of the present invention provides an air gesture implementation method. As shown in Fig. 1, the method comprises:
S101: simulating the pulse data captured by a preset light sensor as a physical key value;
S102: judging, according to the simulated physical key value, whether the event corresponding to the pulse data is an air gesture;
S103: if the judgment is affirmative, simulating the event corresponding to the pulse data as a touch-screen operation.
An air gesture may include gestures such as swiping up, down, left, or right without touching the terminal screen; as shown in Fig. 5, the page can be scrolled up with an upward air gesture.
The method in this embodiment can be executed on a terminal, and the terminal may include mobile terminals such as wearable devices, smart bracelets, mobile phones, tablet computers, laptops, palmtop computers, personal digital assistants, portable media players, navigation devices, and pedometers.
In this embodiment, the pulse data (that is, an air gesture event) can be captured by the light sensor, and different captured air gesture events are converted into different simulated physical key values. The key value (that is, the physical key value) can be preset to a relatively large value, so that it is a new value that does not collide with other physical key values; the key value is a specified value of type int, which distinguishes it from other types.
In this embodiment, the light sensor can be registered through the sensor manager class SensorManager based on the air gesture sensor type Sensor.TYPE_EXTEND_BASE; in other words, the light sensor is registered according to the preset physical key value corresponding to the air gesture. The pulse data is then captured with the light sensor, the pulse data is simulated as a physical key event, and the simulated physical key value corresponding to the simulated key event is obtained, for example through KeyEvent.getScanCode(). The simulated physical key value can then be compared with the preset physical key value, and the comparison result determines whether the event corresponding to the pulse data is an air gesture. For example, if the simulated physical key value matches one of the preset physical key values, the event corresponding to the pulse data is an air gesture.
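A minimal Java sketch of this registration step is given below. It assumes that Sensor.TYPE_EXTEND_BASE is exposed by the vendor firmware as an integer sensor type that is not part of the public Android SDK, so its value is passed in as a parameter; the class and method names are illustrative only and not taken from the patent.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/** Illustrative helper that registers the vendor gesture sensor. */
public final class AirGestureRegistrar {

    /**
     * @param typeExtendBase integer value of the vendor's Sensor.TYPE_EXTEND_BASE,
     *                       which is not part of the public Android SDK
     */
    public static void register(Context context, int typeExtendBase, SensorEventListener listener) {
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        // Look up the vendor-specific gesture sensor and subscribe to its pulse reports.
        Sensor gestureSensor = sm.getDefaultSensor(typeExtendBase);
        if (gestureSensor != null) {
            sm.registerListener(listener, gestureSensor, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }
}
```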
In this embodiment, the pulse data captured by the preset light sensor is simulated as a physical key value, and when the event corresponding to the pulse data is determined, according to the simulated physical key value, to be an air gesture, the event corresponding to the pulse data is simulated as a touch-screen operation. By registering the sensor and recognizing the gesture, the pulses reported by the light sensor are obtained and, after a pulse is received, converted into a simulated finger touch-screen operation, thereby implementing air gestures simply and efficiently.
In some embodiments, simulating the pulse data as a physical key event may include:
in the key-event dispatch function of the root view of the framework layer of a preset operating system, simulating the pulse data as a physical key event.
The operating system may be Android. That is, to implement air gestures across the whole operating system, the pulse data needs to be handled in dispatchKeyEvent (the key-event dispatch function) of DecorView (the root view) in the framework layer of Android, so that the pulse data can be converted into a KeyEvent (physical key event) and the key value can be obtained through KeyEvent.getScanCode().
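The key-event handling can be sketched as follows. The patent modifies dispatchKeyEvent of DecorView inside the framework layer; since DecorView is not a public API, this sketch overrides dispatchKeyEvent at the Activity level purely for illustration, and the scan-code constants are hypothetical stand-ins for the preset physical key values.

```java
import android.app.Activity;
import android.view.KeyEvent;

public class AirGestureActivity extends Activity {

    // Hypothetical scan codes; the text only says they are relatively large int
    // values that do not collide with existing physical key codes.
    private static final int SCAN_CODE_SWIPE_UP = 0x1001;
    private static final int SCAN_CODE_SWIPE_DOWN = 0x1002;

    @Override
    public boolean dispatchKeyEvent(KeyEvent event) {
        int scanCode = event.getScanCode();
        if (event.getAction() == KeyEvent.ACTION_DOWN
                && (scanCode == SCAN_CODE_SWIPE_UP || scanCode == SCAN_CODE_SWIPE_DOWN)) {
            // The light-sensor pulse arrived as a simulated key event: hand it off
            // to the touch-simulation path instead of normal key handling.
            simulateSwipe(scanCode == SCAN_CODE_SWIPE_UP);
            return true;
        }
        return super.dispatchKeyEvent(event);
    }

    private void simulateSwipe(boolean upwards) {
        // See the Handler-driven swipe sketch later in this description.
    }
}
```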
In some embodiments, simulating the event corresponding to the pulse data as a touch-screen operation may include:
obtaining the touch event information corresponding to the event corresponding to the pulse data;
executing the touch event information through dispatchTouch (the touch-event dispatch function) of the operating system, so as to simulate the event corresponding to the pulse data as a touch-screen operation.
The touch event information can be obtained through the event.getAction function. The touch event information includes at least an ACTION_DOWN event (triggered when the finger first touches the screen), ACTION_MOVE events (triggered while the finger slides on the screen, possibly several times), and an ACTION_UP event (triggered when the finger leaves the screen). In other words, each air gesture can be converted into a series of touch events, and this series of touch events can be executed through dispatchTouch, thereby simulating the event corresponding to the pulse data as a touch-screen operation.
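The sketch below shows how one synthetic touch event can be built and delivered. The patent names the dispatch function "dispatchTouch"; the public-SDK method dispatchTouchEvent of a View is used here as an assumed equivalent.

```java
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

/** Illustrative helper that injects one synthetic touch event into a view hierarchy. */
final class TouchInjector {

    static void inject(View target, int action, float x, float y, long downTime) {
        long eventTime = SystemClock.uptimeMillis();
        MotionEvent event = MotionEvent.obtain(downTime, eventTime, action, x, y, 0 /* metaState */);
        // dispatchTouchEvent is the closest public-SDK equivalent of the "dispatchTouch"
        // function named in the text; the event is recycled once it has been delivered.
        target.dispatchTouchEvent(event);
        event.recycle();
    }
}
```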
Executing the touch events through the touch-event dispatch function of the operating system may include:
executing the ACTION_DOWN event, the ACTION_MOVE events, and the ACTION_UP event through the touch-event dispatch function of the operating system, according to the preset starting coordinate corresponding to the ACTION_DOWN event and the preset movement parameters corresponding to the ACTION_MOVE events; the movement parameters include the number of moves, the time interval between successive moves, and the relationship between the distances of successive moves. That is, when handling the sliding, each executed gesture only needs the value reported by the light sensor; the finger slide is then simulated, and this value determines the ACTION_DOWN event and the subsequent ACTION_MOVE and ACTION_UP events used to operate the interface.
For example, the starting coordinate (x, y) of the ACTION_DOWN event can be chosen as (50, 480), which is a relatively good touch point. When handling ACTION_MOVE, the number of moves per gesture is defined as 15, the interval between two moves is 10 seconds, and the move events are looped through a Handler. To achieve a smooth sliding effect, this embodiment sets the relationship between the distances of successive moves as an arithmetic progression: the first move covers the largest distance and each subsequent move is shorter, which produces a very smooth effect. The Handler is mainly used to update the UI (user interface).
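The Handler-driven move loop can be sketched as follows, reusing the TouchInjector helper above. The starting point (50, 480) and the 15 moves per gesture come from the text; the interval reuses the quoted value of 10, interpreted here as milliseconds, and the first-step and common-difference distances of the arithmetic progression are illustrative assumptions. A cancel() helper is added for the interference handling described further below.

```java
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

/** Illustrative Handler-driven replay of an upward swipe as DOWN, MOVE..., UP events. */
final class SwipeSimulator {
    private static final float START_X = 50f;          // starting touch point from the text
    private static final float START_Y = 480f;
    private static final int MOVE_COUNT = 15;           // moves per gesture, from the text
    private static final long MOVE_INTERVAL_MS = 10;    // assumed to be milliseconds

    private final Handler handler = new Handler(Looper.getMainLooper());

    void swipeUp(final View target) {
        final long downTime = SystemClock.uptimeMillis();
        TouchInjector.inject(target, MotionEvent.ACTION_DOWN, START_X, START_Y, downTime);

        // Arithmetic progression of step distances: the first move is the longest and
        // each later move is a little shorter, which makes the scroll feel smooth.
        float step = 30f;            // illustrative first step
        final float decrement = 2f;  // illustrative common difference
        float y = START_Y;
        for (int i = 1; i <= MOVE_COUNT; i++) {
            y -= step;
            step = Math.max(step - decrement, 1f);
            final float moveY = y;
            final boolean last = (i == MOVE_COUNT);
            handler.postDelayed(() -> {
                TouchInjector.inject(target, MotionEvent.ACTION_MOVE, START_X, moveY, downTime);
                if (last) {
                    TouchInjector.inject(target, MotionEvent.ACTION_UP, START_X, moveY, downTime);
                }
            }, i * MOVE_INTERVAL_MS);
        }
    }

    /** Stops any pending simulated moves (used when a real finger touch arrives). */
    void cancel() {
        handler.removeCallbacksAndMessages(null);
    }
}
```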
In practice, an air gesture and an actual finger touch can interfere with each other. To solve this problem effectively, when executing the ACTION_MOVE events through the touch-event dispatch function of the operating system, the method may further include:
obtaining the value assigned to a preset KeyEventFlag (physical key flag);
when the flag is assigned a first value (true), executing the ACTION_MOVE events through the touch-event dispatch function of the operating system;
when the flag is assigned a second value (false), stopping executing the ACTION_MOVE events through the touch-event dispatch function of the operating system; the first value identifies the current touch operation as an air gesture, and the second value identifies the current touch operation as not an air gesture.
That is, in some embodiments, when the event corresponding to the pulse data is determined to be an air gesture, the preset physical key flag is assigned the first value;
after the event corresponding to the pulse data has been simulated as a touch-screen operation, the physical key flag is assigned the second value.
For example, a KeyEventFlag can be predefined, and its value is obtained in dispatchKeyEvent through MotionEvent.getSource(). MotionEvent.getSource() thus distinguishes whether an event is an air event: if it is, KeyEventFlag is true and the move events posted through the Handler continue to execute; if it is not, KeyEventFlag is false, the move events are stopped immediately, and the actual finger touch event is executed instead, which effectively solves the interference between air gestures and actual touch events.
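Building on the SwipeSimulator sketch above, the flag-based guard against interference between a replayed air gesture and a real finger touch could look like the following. The patent only states that a KeyEventFlag is set and that MotionEvent.getSource() tells the two kinds of event apart; the concrete source test and the method names here are assumptions.

```java
import android.view.InputDevice;
import android.view.MotionEvent;
import android.view.View;

/** Illustrative guard between a replayed air-gesture swipe and a real finger touch. */
public class AirGestureGuard {

    private volatile boolean keyEventFlag = false;   // true while an air gesture is being replayed
    private final SwipeSimulator simulator = new SwipeSimulator();

    /** Called from dispatchKeyEvent once an air-gesture pulse has been recognized. */
    void onAirGestureRecognized(View target) {
        keyEventFlag = true;        // first value: the current operation is an air gesture
        simulator.swipeUp(target);
        // After the simulated ACTION_UP has been dispatched, the flag is reset to the
        // second value (false), marking that no air gesture is in progress any more.
    }

    /** Called from dispatchTouchEvent before normal handling of an incoming event. */
    void onIncomingTouch(MotionEvent event) {
        boolean fromRealTouchscreen =
                (event.getSource() & InputDevice.SOURCE_TOUCHSCREEN) != 0;
        if (fromRealTouchscreen && keyEventFlag) {
            keyEventFlag = false;   // a real finger takes priority over the replay
            simulator.cancel();     // stop the pending simulated ACTION_MOVE events
        }
    }
}
```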
Embodiment Two
Based on the embodiments in Embodiment One, this embodiment of the invention provides an optional air gesture implementation method, the method comprising:
Step 1: registering the sensor through SensorManager according to the air gesture sensor type Sensor.TYPE_EXTEND_BASE.
Step 2: to implement air gestures across the whole operating system, handling the event in dispatchKeyEvent of DecorView in the framework layer; the air event can be converted into a KeyEvent, the key value is obtained through KeyEvent.getScanCode() to determine whether it is an air gesture event, and event.getAction is used to determine whether it is KeyEvent.ACTION_DOWN.
Step 3: executing the air gesture event through the dispatchTouch method, to which the MotionEvent events (that is, the touch events ACTION_DOWN, ACTION_MOVE, and ACTION_UP) are passed, and returning true.
Step 4: in ACTION_DOWN, choosing the coordinate (x, y) as (50, 480), a relatively good touch point. When handling ACTION_MOVE, the number of moves per gesture is defined as 15, the interval between two moves is 10 seconds, and the move events are looped through a Handler. To achieve a smooth sliding effect, an arithmetic-progression array is defined: the first move covers the largest distance and each subsequent move is shorter, which produces a very smooth effect.
Step 5: to solve the interference problem, defining a KeyEventFlag and using MotionEvent.getSource() in dispatchKeyEvent to distinguish whether an event is an air event; if it is, KeyEventFlag is true and the move events posted through the Handler continue to execute; if it is not, KeyEventFlag is false, the move events are stopped immediately, and the actual finger touch event is executed instead, which effectively solves the interference problem.
Embodiment Three
This embodiment of the invention provides a terminal, the terminal comprising:
a memory, a processor, and a computer program stored in the memory and executable on the processor;
the steps of the method according to any one of Embodiment One and Embodiment Two being implemented when the computer program is executed by the processor.
Embodiment Four
This embodiment of the invention provides a computer-readable storage medium on which an air gesture implementation program is stored; when the air gesture implementation program is executed by a processor, the steps of the air gesture implementation method according to any one of Embodiment One and Embodiment Two are implemented.
For the specific implementation of Embodiment Three and Embodiment Four, refer to Embodiment One and Embodiment Two; they have the corresponding technical effects.
It should be noted that, in this document, the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes the element.
The serial numbers of the above embodiments of the invention are for description only and do not represent the superiority or inferiority of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part that contributes to the prior art, can be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the specific embodiments described above. The above embodiments are only illustrative, not restrictive. Inspired by the present invention, those of ordinary skill in the art can also make many other forms without departing from the scope protected by the purpose and claims of the present invention, all of which fall within the protection of the present invention.

Claims (10)

1. An air gesture implementation method, characterized in that the method comprises:
simulating the pulse data captured by a preset light sensor as a physical key value;
judging, according to the simulated physical key value, whether the event corresponding to the pulse data is an air gesture;
if the judgment is affirmative, simulating the event corresponding to the pulse data as a touch-screen operation.
2. The air gesture implementation method according to claim 1, characterized in that simulating the pulse data captured by the preset light sensor as a physical key value comprises:
capturing the pulse data with the light sensor;
simulating the pulse data as a physical key event;
obtaining the simulated physical key value corresponding to the simulated physical key event.
3. The air gesture implementation method according to claim 2, characterized in that simulating the pulse data as a physical key event comprises:
in the key-event dispatch function of the root view of the framework layer of a preset operating system, simulating the pulse data as a physical key event.
4. The air gesture implementation method according to claim 1, characterized in that, before simulating the pulse data captured by the preset light sensor as a physical key value, the method comprises:
registering the light sensor according to the preset physical key value corresponding to the air gesture.
5. The air gesture implementation method according to any one of claims 1 to 4, characterized in that simulating the event corresponding to the pulse data as a touch-screen operation comprises:
obtaining the touch event information corresponding to the event corresponding to the pulse data;
executing the touch event information through the touch-event dispatch function of the operating system, so as to simulate the event corresponding to the pulse data as a touch-screen operation.
6. The air gesture implementation method according to claim 5, characterized in that the touch event information includes at least an ACTION_DOWN event, an ACTION_MOVE event, and an ACTION_UP event;
and executing the touch events through the touch-event dispatch function of the operating system comprises:
executing the ACTION_DOWN event, the ACTION_MOVE events, and the ACTION_UP event through the touch-event dispatch function of the operating system, according to the preset starting coordinate corresponding to the ACTION_DOWN event and the preset movement parameters corresponding to the ACTION_MOVE events; the movement parameters include the number of moves, the time interval between successive moves, and the relationship between the distances of successive moves.
7. The air gesture implementation method according to claim 6, characterized in that, when executing the ACTION_MOVE events through the touch-event dispatch function of the operating system, the method further comprises:
obtaining the value assigned to a preset physical key flag;
when the flag is assigned a first value, executing the ACTION_MOVE events through the touch-event dispatch function of the operating system;
when the flag is assigned a second value, stopping executing the ACTION_MOVE events through the touch-event dispatch function of the operating system; the first value identifies the current touch operation as an air gesture, and the second value identifies the current touch operation as not an air gesture.
8. The air gesture implementation method according to claim 6, characterized in that the method further comprises:
when the judgment is affirmative, assigning the first value to the preset physical key flag;
after simulating the event corresponding to the pulse data as a touch-screen operation, assigning the second value to the physical key flag.
9. A terminal, characterized in that the terminal comprises:
a memory, a processor, and a computer program stored in the memory and executable on the processor;
the steps of the method according to any one of claims 1 to 8 being implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, characterized in that an air gesture implementation program is stored on the computer-readable storage medium; when the air gesture implementation program is executed by a processor, the steps of the air gesture implementation method according to any one of claims 1 to 8 are implemented.
CN201910138586.0A 2019-02-25 2019-02-25 Air gesture implementation method, terminal, and computer-readable storage medium Pending CN109933192A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910138586.0A CN109933192A (en) 2019-02-25 2019-02-25 A kind of implementation method, terminal and the computer readable storage medium of gesture high up in the air

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910138586.0A CN109933192A (en) 2019-02-25 2019-02-25 A kind of implementation method, terminal and the computer readable storage medium of gesture high up in the air

Publications (1)

Publication Number Publication Date
CN109933192A true CN109933192A (en) 2019-06-25

Family

ID=66985916

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910138586.0A Pending CN109933192A (en) 2019-02-25 2019-02-25 A kind of implementation method, terminal and the computer readable storage medium of gesture high up in the air

Country Status (1)

Country Link
CN (1) CN109933192A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112379797A (en) * 2020-12-01 2021-02-19 宁波视睿迪光电有限公司 Key touch method and device, touch equipment and readable storage medium


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090265671A1 (en) * 2008-04-21 2009-10-22 Invensense Mobile devices with motion gesture recognition
CN102609151A (en) * 2012-01-18 2012-07-25 北京北阳电子技术有限公司 E-book reader and page turning method for same
CN103713735A (en) * 2012-09-29 2014-04-09 华为技术有限公司 Method and device of controlling terminal equipment by non-contact gestures
CN104460987A (en) * 2014-11-07 2015-03-25 惠州Tcl移动通信有限公司 Electronic equipment capable of being controlled by non-contact gestures
CN107589832A (en) * 2017-08-01 2018-01-16 深圳市汇春科技股份有限公司 It is a kind of based on optoelectronic induction every empty gesture identification method and its control device
CN108181990A (en) * 2018-01-19 2018-06-19 昆山国显光电有限公司 Control method and smartwatch based on infrared gesture identification

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Anonymous: "An in-depth study of Android event delivery: ACTION_DOWN, ACTION_MOVE, ACTION_UP", CSDN *


Similar Documents

Publication Publication Date Title
CN109032734A (en) A kind of background application display methods and mobile terminal
CN109388304A (en) A kind of screenshotss method and terminal device
CN110413364A (en) A kind of information processing method and terminal
CN109379484A (en) A kind of information processing method and terminal
CN109151367A (en) A kind of video call method and terminal device
CN108366220A (en) A kind of video calling processing method and mobile terminal
CN109582475A (en) A kind of sharing method and terminal
CN107592406A (en) Message display method, mobile terminal and the storage medium of mobile terminal
CN109710165A (en) A kind of drawing processing method and mobile terminal
CN109151162A (en) A kind of multi-panel screen interaction control method, equipment and computer readable storage medium
CN107580350A (en) A kind of signal intensity switching method, equipment and computer-readable recording medium
CN109407948A (en) A kind of interface display method and mobile terminal
CN109799912A (en) A kind of display control method, equipment and computer readable storage medium
CN110096203A (en) A kind of screenshot method and mobile terminal
CN109886686A (en) A kind of safe payment method, equipment and computer readable storage medium
CN109117105A (en) A kind of collaboration desktop interaction regulation method, equipment and computer readable storage medium
CN108958625A (en) A kind of screen interaction regulation method, equipment and computer readable storage medium
CN108744495A (en) A kind of control method of virtual key, terminal and computer storage media
CN109992472A (en) A kind of interface monitoring method, terminal and computer readable storage medium
CN108227842A (en) Wearable pattern switching based reminding method, mobile terminal and readable storage medium storing program for executing
CN109947617A (en) A kind of method, terminal and readable storage medium storing program for executing monitored application interface and show content
CN109947314A (en) A kind of touch controlled key processing method, terminal and computer readable storage medium
CN110096173A (en) Method for controlling mobile terminal, mobile terminal and computer readable storage medium
CN109814793A (en) A kind of interface control method, equipment and computer readable storage medium
CN110225192A (en) A kind of message display method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190625