CN109491584A - Mobile terminal-based screen control method and mobile terminal - Google Patents

Mobile terminal-based screen control method and mobile terminal

Info

Publication number
CN109491584A
Authority
CN
China
Prior art keywords
gesture
screen
user
center
operating gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811209557.0A
Other languages
Chinese (zh)
Inventor
周凡贻
唐僖僖
张韩月
邹成珅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Acoustic Manufacturing Co Ltd
Original Assignee
Shenzhen Acoustic Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Acoustic Manufacturing Co Ltd filed Critical Shenzhen Acoustic Manufacturing Co Ltd
Priority to CN201811209557.0A priority Critical patent/CN109491584A/en
Priority to PCT/CN2018/124023 priority patent/WO2020077852A1/en
Publication of CN109491584A publication Critical patent/CN109491584A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a screen control method based on a mobile terminal, comprising the following steps: judging whether a software interface with vertically extending content is displayed on the screen of the mobile terminal; when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether it matches a preset operating gesture; when the user's operating gesture matches the preset operating gesture, enabling a middle-frame gesture operation function; and, when the user's finger is detected sliding up and down on the sensing region of the middle frame, scrolling the vertically extending content of the software interface up and down in synchrony with the sliding of the finger. The present invention also provides a mobile terminal. With this technical solution, gesture operations on the sensing region of the middle frame control the up-and-down scrolling of the vertically extending content on the screen; operation is simple, the finger does not block the content on the screen, and the user's viewing experience is improved.

Description

Mobile terminal-based screen control method and mobile terminal
Technical field
The present invention relates to the technical field of mobile terminals, and more particularly to a mobile terminal-based screen control method and a mobile terminal.
Background art
Mobile terminals such as mobile phones and tablet computers play an important role in daily life. People often use them to browse content, for example reading news, reading articles or checking e-mail. Such content is usually extensive and displayed in an application interface on the screen, and it usually extends vertically, i.e. it cannot be fully displayed in the current screen view. The user therefore has to interact with the mobile terminal to scroll the content of the application interface up and down and browse to the content of interest. For example, when browsing e-mail the application interface usually shows many messages extending beyond the screen; as the user's finger slides up and down on the screen, the messages in the application interface slide with it, so the user can scroll to the message to be handled and then process it.
In the prior art, the ways in which the user interacts with the mobile terminal to scroll the vertically extending content of an application interface on the screen generally include:
1) Conventional on-screen gesture interaction, for example sliding a finger up and down on the screen, dragging a virtual slider provided on the screen up and down, or tapping a virtual key on the screen.
2) Conventional interaction with off-screen keys, for example pressing physical keys outside the screen such as the volume up/down keys.
Conventional on-screen gesture interaction has the following problems:
1) Whether operated with both hands or with one hand, the finger always blocks part of the displayed content.
2) Mis-operation and accidental touches occur easily; for example, during finger operation the edge of the screen is easily touched by the palm, causing false triggering.
3) The screen of a mobile terminal is usually large; when the terminal is held and operated with one hand, it is difficult for the finger to reach a suitable position on the screen, which is inconvenient.
4) When the terminal is held and operated with one hand, the thumb has to bend considerably, so the finger easily becomes sore after prolonged operation.
Conventional interaction with off-screen keys has the following problem: the off-screen keys already have interaction effects of their own, so using them to scroll the content of the application interface on the screen easily conflicts with those predefined effects and degrades the user experience.
Summary of the invention
In order to overcome the above technical defects, the object of the present invention is to provide a mobile terminal-based screen control method and a mobile terminal that are simple and natural to operate, do not block the screen, and provide a better viewing experience.
The invention discloses a screen control method based on a mobile terminal, comprising the following steps: judging whether a software interface with vertically extending content is displayed on the screen of the mobile terminal;
when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether the user's operating gesture matches a preset operating gesture;
when the user's operating gesture matches the preset operating gesture, enabling a middle-frame gesture operation function;
when the user's finger is detected sliding up and down on the sensing region of the middle frame, scrolling the vertically extending content of the software interface up and down in synchrony with the sliding of the finger.
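For illustration only, the four steps above can be read as a small controller that enables middle-frame scrolling only when scrollable content is shown and the preset gesture has been matched, and then forwards slides on the sensing region to that content. The following Kotlin sketch is a hypothetical reading of the method; the type and member names (ScreenView, FrameScrollController, Gesture) are assumptions and are not taken from the patent.

```kotlin
// Hypothetical sketch of the claimed flow; names and types are illustrative assumptions.
interface ScreenView {
    fun hasVerticallyExtendingContent(): Boolean   // content taller than the visible viewport
    fun scrollBy(deltaPx: Float)                   // scroll the software interface up or down
}

enum class Gesture { DOUBLE_TAP_AND_HOLD_ON_FRAME, PRESS_1S_SLIDE_UP_ON_SCREEN, OTHER }

class FrameScrollController(
    private val screen: ScreenView,
    private val presetGestures: Set<Gesture> = setOf(
        Gesture.DOUBLE_TAP_AND_HOLD_ON_FRAME,
        Gesture.PRESS_1S_SLIDE_UP_ON_SCREEN
    )
) {
    var frameGestureEnabled = false
        private set

    // Steps 1 to 3: enable the function only when vertically extending content is shown
    // and the recognized operating gesture matches a preset one.
    fun onGestureRecognized(gesture: Gesture) {
        if (!screen.hasVerticallyExtendingContent()) return
        if (gesture in presetGestures) frameGestureEnabled = true
    }

    // Step 4: a slide on the middle-frame sensing region scrolls the content with it.
    fun onFrameSlide(deltaPx: Float) {
        if (frameGestureEnabled) screen.scrollBy(deltaPx)
    }
}
```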
Optionally, recognizing the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture comprises:
a sensor arranged at the sensing region of the middle frame recognizing the user's operating gesture on the middle frame, and the processor of the mobile terminal comparing the recognized operating gesture with a preset middle-frame gesture and judging whether the user's operating gesture on the middle frame matches the preset middle-frame gesture;
the middle-frame gesture being double-tapping the sensing region of the middle frame and holding the second tap.
Optionally, recognizing the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture comprises:
a sensor arranged at the screen recognizing the user's operating gesture on the screen, and the processor of the mobile terminal comparing the recognized operating gesture with a preset screen gesture and judging whether the user's operating gesture on the screen matches the preset screen gesture;
the screen gesture comprising pressing the screen for 1 second and sliding upward, or pressing the screen, sliding upward first and then downward.
Preferably, after enabling the middle-frame gesture operation function, the method further comprises the following steps:
judging whether the time for which the finger has left the sensing region exceeds a preset time;
when the preset time is exceeded, disabling the middle-frame gesture operation function, so that when the user's finger slides up and down on the sensing region, the vertically extending content of the software interface no longer follows the sliding of the finger.
Preferably, after enabling the middle-frame gesture operation function, the method further comprises the following steps:
displaying a marker at a fixed position on the screen of the mobile terminal, the marker not changing position as the vertically extending content slides;
judging whether the sensing region recognizes a double-tap operating gesture;
when a double-tap operating gesture is recognized, opening the content of the software interface at the position on the screen corresponding to the marker.
The invention also discloses a mobile terminal comprising a processor, a memory, a screen and a middle frame, a sensing region being arranged on the middle frame and a computer program being stored in the memory, the computer program, when executed by the processor, implementing the following steps:
judging whether a software interface with vertically extending content is displayed on the screen of the mobile terminal;
when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether the user's operating gesture matches a preset operating gesture;
when the user's operating gesture matches the preset operating gesture, enabling a middle-frame gesture operation function;
when the user's finger is detected sliding up and down on the sensing region of the middle frame, scrolling the vertically extending content of the software interface up and down in synchrony with the sliding of the finger.
Preferably, sensors for recognizing the user's operating gestures are respectively arranged at the screen and at the sensing region, the sensors being touch sensors and/or pressure sensors.
Preferably, the middle frame has four lateral faces facing different directions, and the sensing region is arranged on one or more of the lateral faces.
Compared with the prior art, the above technical solution has the following advantages:
1. The screen control method of the present application is simple to operate, and the gesture used is close to the natural holding gesture, so the operation feels natural.
2. The content on the screen is not blocked during operation, so the viewing experience is better.
Detailed description of the invention
Fig. 1 is a schematic flow chart of the screen control method based on a mobile terminal in an embodiment of the present invention;
Fig. 2 is a schematic flow chart of subsequent steps of the method of Fig. 1;
Fig. 3 is a schematic diagram of double-tapping the sensing region on the middle frame;
Fig. 4 is a schematic diagram of controlling, through a middle-frame gesture operation, the up-and-down scrolling of the vertically extending content of the software interface on the screen;
Fig. 5 is a schematic structural diagram of the mobile terminal in an embodiment of the invention.
Reference numerals:
a: marker; 10: mobile terminal; 11: memory; 12: processor; 13: screen; 14: middle frame.
Specific embodiment
The advantages of the present invention are further explained below with reference to the accompanying drawings and specific embodiments.
Exemplary embodiments are described in detail here, and examples of them are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless indicated otherwise. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; they are merely examples of devices and methods consistent with some aspects of the disclosure as detailed in the appended claims.
The terms used in the present disclosure serve only to describe particular embodiments and are not intended to limit the disclosure. The singular forms "a", "said" and "the" used in the disclosure and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third and so on may be used in the disclosure to describe various items of information, this information should not be limited by these terms; the terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the disclosure, first information could also be called second information and, similarly, second information could be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon" or "in response to determining".
In the description of the present invention, it should be understood that orientation or position terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positional relationships shown in the drawings, are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be understood as limiting the invention.
In the description of the present invention, unless otherwise specified and limited, it should be noted that the terms "mounted", "connected" and "coupled" are to be understood broadly: a connection may, for example, be mechanical or electrical, may be an internal connection between two elements, may be direct, or may be indirect through an intermediate medium. For a person of ordinary skill in the art, the specific meaning of these terms can be understood according to the specific circumstances.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are only intended to facilitate the explanation of the invention and have no specific meaning in themselves; "module" and "component" may therefore be used interchangeably.
Fig. 1 shows a schematic flow chart of the screen control method based on a mobile terminal in an embodiment of the present invention. The mobile terminal may include, but is not limited to, a mobile phone, a tablet computer, a laptop computer, an e-book reader and the like.
The mobile terminal has a middle frame, i.e. the metal or non-metallic frame between the screen and the rear cover of the phone, also referred to as the bezel. The middle frame includes a sensing region in which a sensor is arranged. The sensor is a touch sensor and/or a pressure sensor; the two may be provided together or only one of them may be provided. The sensor senses the movement of the user's finger in the sensing region and is connected to the processor of the mobile terminal, so that from the sensed information the processor can determine whether the finger is in contact with the sensing region, as well as the dwell time, movement speed, movement distance and pressing pressure of the finger in the sensing region, and thereby judge the user's operating gesture on the sensing region. These operating gestures include tapping, double-tapping, pressing and holding, pressing hard, and so on.
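As a hypothetical sketch of this gesture judgment (the thresholds, sample format and class names below are illustrative assumptions, not values from the patent), the processor could classify raw sensing-region samples roughly as follows:

```kotlin
// Illustrative only: thresholds and names are assumptions.
data class FrameSample(val timeMs: Long, val touching: Boolean, val pressure: Float)

enum class FrameGesture { NONE, TAP, DOUBLE_TAP, PRESS_AND_HOLD, HARD_PRESS }

class FrameGestureClassifier(
    private val tapMaxMs: Long = 200,          // assumed: contacts shorter than this count as taps
    private val doubleTapGapMaxMs: Long = 300, // assumed: maximum gap between two taps
    private val hardPressThreshold: Float = 3.0f
) {
    private var contactStartMs: Long? = null
    private var lastTapEndMs: Long? = null

    fun onSample(s: FrameSample): FrameGesture {
        if (s.pressure >= hardPressThreshold) return FrameGesture.HARD_PRESS
        if (s.touching) {
            val start = contactStartMs ?: s.timeMs.also { contactStartMs = it }
            return if (s.timeMs - start > tapMaxMs) FrameGesture.PRESS_AND_HOLD else FrameGesture.NONE
        }
        val start = contactStartMs ?: return FrameGesture.NONE     // release without an active contact
        contactStartMs = null
        if (s.timeMs - start > tapMaxMs) return FrameGesture.NONE  // long contact already reported
        val prevTapEnd = lastTapEndMs
        lastTapEndMs = s.timeMs
        return if (prevTapEnd != null && s.timeMs - prevTapEnd <= doubleTapGapMaxMs)
            FrameGesture.DOUBLE_TAP else FrameGesture.TAP
    }
}
```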
It should be noted that the user mentioned here refers to the person using the mobile terminal.
The screen of the mobile terminal is a touch screen, which receives the user's touch operations on or near it made with a finger, a stylus or any other suitable object, and displays the software interfaces of the application programs.
In the present embodiment, the screen control method includes the following steps:
S101: judging whether a software interface with vertically extending content is displayed on the screen of the mobile terminal.
In this step, the processor of the mobile terminal analyses the application program currently running and the content displayed on the screen, and judges whether the software interface of the application program shown on the screen contains vertically extending content. Vertically extending content is content of the software interface that cannot be fully displayed on the current screen; the screen has to be slid (or a similar operation performed) to scroll the content of the software interface so that the parts not yet shown are brought into the current view. "Up and down" here refers to the up and down directions of the mobile terminal as the user holds it facing the screen; depending on how the user holds and uses the terminal, "up and down" may in some cases also be understood as left and right.
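A minimal sketch of this check, assuming the processor can query the foreground interface for its total content height and the visible viewport height (the data class and field names are hypothetical):

```kotlin
// Hypothetical representation of the foreground software interface.
data class SoftwareInterface(val contentHeightPx: Int, val viewportHeightPx: Int)

// S101: the interface has vertically extending content when it cannot be shown in full.
fun hasVerticallyExtendingContent(ui: SoftwareInterface): Boolean =
    ui.contentHeightPx > ui.viewportHeightPx
```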
S102: when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture.
This step is performed when the judgment of step S101 holds; when it does not hold, the subsequent steps are not performed.
In this step, the mobile terminal recognizes the user's operating gesture and compares the recognition result with the preset operating gesture, judging whether the two are the same operating gesture, i.e. whether the recognized operating gesture matches the preset operating gesture.
Specifically, in one case this comprises the following process:
the sensor arranged at the sensing region of the middle frame recognizes the user's operating gesture on the middle frame, and the processor of the mobile terminal compares the recognized operating gesture with the preset middle-frame gesture, judging whether the user's operating gesture on the middle frame matches the preset middle-frame gesture.
The preset middle-frame gesture is double-tapping the sensing region of the middle frame and holding the second tap; any other suitable operating gesture may also be used. Fig. 3 schematically shows double-tapping the sensing region on the middle frame: while the phone is held naturally in one hand, the thumb double-taps the sensing region located on the middle frame at the right edge of the phone.
In another case, this comprises the following process:
the sensor arranged at the screen recognizes the user's operating gesture on the screen, and the processor of the mobile terminal compares the recognized operating gesture with the preset screen gesture, judging whether the user's operating gesture on the screen matches the preset screen gesture;
the preset screen gesture includes pressing the screen for 1 second and sliding upward, or pressing the screen, sliding upward first and then downward; any other suitable operating gesture may also be used.
In step S102 the two cases above may exist at the same time, i.e. the user's operating gestures are recognized both at the middle frame and at the screen, and the subsequent steps are performed as long as the gesture at either of them matches its preset gesture; alternatively, only one of the two cases may exist, i.e. only the operating gesture at the middle frame or at the screen is recognized, and the subsequent steps are performed when it matches the preset gesture.
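Sketched in code, and assuming each recognizer reports a symbolic gesture (the enum values and the helper name are illustrative assumptions), the matching of step S102 with one or both channels present could look like this; a match on either channel is enough to proceed to step S103:

```kotlin
// Illustrative only: gesture names and the preset values are assumptions.
enum class FrameGesture { DOUBLE_TAP_AND_HOLD, OTHER }
enum class ScreenGesture { PRESS_1S_SLIDE_UP, PRESS_SLIDE_UP_THEN_DOWN, OTHER }

val presetFrameGesture = FrameGesture.DOUBLE_TAP_AND_HOLD
val presetScreenGestures = setOf(
    ScreenGesture.PRESS_1S_SLIDE_UP,
    ScreenGesture.PRESS_SLIDE_UP_THEN_DOWN
)

// Either recognizer may be absent (null); matching on either channel is sufficient.
fun shouldEnableFrameGestureFunction(
    frameGesture: FrameGesture?,
    screenGesture: ScreenGesture?
): Boolean =
    frameGesture == presetFrameGesture ||
        (screenGesture != null && screenGesture in presetScreenGestures)
```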
S103: when the user's operating gesture matches the preset operating gesture, enabling the middle-frame gesture operation function.
This step is performed when the judgment of step S102 holds; when it does not hold, the subsequent steps are not performed.
When the recognized operating gesture of the user matches the preset operating gesture, the middle-frame gesture operation function is enabled, i.e. a link is established between the sensing region on the middle frame and the software interface on the screen. When the sensor at the sensing region detects the user's finger sliding up and down on the sensing region of the middle frame, the vertically extending content of the software interface scrolls up and down in synchrony with the sliding of the finger. Fig. 4 shows a schematic scenario: while the phone is held naturally in one hand, the thumb slides up and down on the sensing region located on the middle frame at the right edge of the phone, and the vertically extending content of the software interface on the screen scrolls up and down with it.
In the present application, sliding operations on the sensing region of the middle frame control the scrolling of the vertically extending content of the software interface, so the finger does not need to move onto the screen to slide and does not block the content displayed on the screen, which improves the user's viewing experience. The operating gesture is close to the natural holding gesture, so the operation is more natural, more convenient and more user-friendly.
Preferably, a correspondence is preset between the speed and distance of the finger sliding on the sensing region of the middle frame and the speed and distance by which the vertically extending content of the software interface on the screen scrolls, so that the user can browse the vertically extending content more comfortably through sliding operations on the sensing region.
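One hypothetical form of such a correspondence is a simple proportional gain between the slide distance on the sensing region and the scroll distance of the content; the gain value and names below are assumptions:

```kotlin
// Illustrative only: the gain and the clamping behaviour are assumptions.
class SyncScroller(
    private val scrollGain: Float = 2.5f   // screen pixels scrolled per pixel slid on the frame
) {
    var contentOffsetPx = 0f
        private set

    // Each slide increment on the sensing region moves the content proportionally,
    // clamped to the extent of the vertically extending content.
    fun onFrameSlide(frameDeltaPx: Float, maxOffsetPx: Float) {
        contentOffsetPx = (contentOffsetPx + frameDeltaPx * scrollGain).coerceIn(0f, maxOffsetPx)
    }
}
```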
In the present application, the purpose of step S102 is to act as the enabling switch of the middle-frame gesture operation function: the user enables the function by making a gesture identical to the preset gesture. In this way the user can enable the function autonomously whenever middle-frame gesture operation is needed, and when it is not needed, accidental touches of the sensing region on the middle frame cannot scroll the content being browsed and spoil the viewing experience.
Preferably, referring to Fig. 2, in the above embodiment the method further includes the following steps after step S103:
S104: judging whether the time for which the finger has left the sensing region exceeds a preset time.
After the middle-frame gesture operation function is enabled, the sensor at the sensing region senses the contact between the finger and the sensing region, and the processor determines how long the finger has been away from the sensing region, i.e. no longer in contact with it, and judges whether this time exceeds the preset time. The preset time may be 1 second or any other suitable duration.
S105: when the preset time is exceeded, disabling the middle-frame gesture operation function.
This step is performed when the judgment of step S104 holds.
When the preset time is exceeded, the middle-frame gesture operation function is disabled, i.e. the link between the sensing region on the middle frame and the software interface on the screen is broken; when the user's finger then slides up and down on the sensing region, the vertically extending content of the software interface no longer follows the sliding of the finger.
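The timeout behaviour of steps S104 and S105 can be sketched as follows; the session class, its method names and the 1-second default are illustrative assumptions drawn from the description above:

```kotlin
// Illustrative only: names and the default timeout are assumptions.
class FrameGestureSession(private val timeoutMs: Long = 1_000) {
    var enabled = false
        private set
    private var lastLiftOffMs: Long? = null

    fun start() { enabled = true; lastLiftOffMs = null }        // S103: function enabled

    fun onFingerTouched() { lastLiftOffMs = null }              // finger back on the sensing region

    fun onFingerLifted(nowMs: Long) { lastLiftOffMs = nowMs }   // start counting the time away

    // S104/S105: once the finger has been away longer than the preset time, close the function.
    fun update(nowMs: Long) {
        val liftedAt = lastLiftOffMs ?: return
        if (nowMs - liftedAt > timeoutMs) { enabled = false; lastLiftOffMs = null }
    }

    // Slides only scroll the content while the function is still enabled.
    fun onFrameSlide(deltaPx: Float, scroll: (Float) -> Unit) {
        if (enabled) scroll(deltaPx)
    }
}
```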
Preferably, in the present application, after the middle-frame gesture operation function is enabled, a marker a is displayed at a fixed position on the screen of the mobile terminal (see Fig. 4); the marker a does not change position as the vertically extending content scrolls. The marker a is preferably made transparent so as not to block the content of the software interface, and is preferably located at the edge of the middle part of the screen.
It is judged whether the sensing region recognizes a double-tap operating gesture, i.e. whether the user double-taps the sensing region on the middle frame.
When a double-tap operating gesture is recognized, the content of the software interface at the position on the screen corresponding to the marker is opened. The region corresponding to the marker is preset and is located at or near the marker. Opening the content of the software interface at the corresponding position on the screen means opening whatever can be entered at the region corresponding to the marker, such as a submenu or a link that can be tapped open. This operation is equivalent to tapping, on the screen, the region corresponding to the marker as in the prior art.
Further, it is judged whether the sensing region recognizes a hard-press operating gesture. Preferably, a pressure threshold is preset and it is judged whether the pressure at the sensing region exceeds this threshold; when the threshold is exceeded, a hard-press operating gesture is considered to have been recognized. Of course, the hard-press operating gesture may also be recognized in any other suitable way. When a hard-press operating gesture is recognized, the interface returns to the previous menu.
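A minimal sketch of this marker-based navigation, assuming the double tap and the pressure reading come from the sensing-region recognizer and the open/back actions are supplied as callbacks (all names and the threshold value are hypothetical):

```kotlin
// Illustrative only: callbacks and the pressure threshold are assumptions.
class MarkerNavigator(
    private val openContentAtMarker: () -> Unit,
    private val goBackToPreviousMenu: () -> Unit,
    private val pressureThreshold: Float = 3.0f
) {
    // Double tap on the sensing region: open the content under the fixed marker.
    fun onDoubleTap() = openContentAtMarker()

    // Pressure above the threshold counts as a hard press: return to the previous menu.
    fun onPressure(pressure: Float) {
        if (pressure >= pressureThreshold) goBackToPreviousMenu()
    }
}
```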
By recognizing double-taps and hard presses at the sensing region and acting on them, the user can open specific content on the screen and return to the previous menu without moving the finger onto the screen. The operation is simple and natural, the content displayed on the screen is not blocked, and the user's viewing and operating experience is improved.
Referring to Fig. 5, the invention also discloses a mobile terminal 10 comprising a processor 12, a memory 11, a screen 13 and a middle frame 14. A sensing region is arranged on the middle frame 14 and a computer program is stored in the memory 11; when executed by the processor 12, the computer program implements the following steps:
judging whether a software interface with vertically extending content is displayed on the screen 13 of the mobile terminal 10;
when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether the user's operating gesture matches a preset operating gesture;
when the user's operating gesture matches the preset operating gesture, enabling the middle-frame gesture operation function;
when the user's finger is detected sliding up and down on the sensing region of the middle frame 14, scrolling the vertically extending content of the software interface up and down in synchrony with the sliding of the finger.
That is, when executed by the processor 12, the computer program implements the steps of the above embodiments, which are not repeated here.
The mobile terminal 10 may include, but is not limited to, a mobile phone, a tablet computer, a laptop computer, an e-book reader and the like, and in some cases may also include a fixed terminal.
In the present application, the processor 12 may include one or more processing cores. The processor 12 connects the various parts of the mobile terminal 10 through various interfaces and lines, and executes the various functions of the terminal 10 and processes data by running or executing the computer program (including instructions, programs, code sets or instruction sets, etc.) stored in the memory 11 and by calling the data stored in the memory 11. Optionally, the processor 12 may be implemented in hardware using at least one of Digital Signal Processing (DSP), a Field-Programmable Gate Array (FPGA) and a Programmable Logic Array (PLA). The processor 12 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem and the like, wherein the CPU mainly handles the operating system, the user interface and the application programs, and the GPU is responsible for rendering and drawing the content to be displayed on the screen 13.
The memory 11 may include Random Access Memory (RAM) and may also include Read-Only Memory (ROM). Optionally, the memory 11 comprises a non-transitory computer-readable storage medium. The memory 11 may be used to store the computer program (including instructions, programs, code, code sets or instruction sets, etc.).
The screen 13 is a touch screen, which receives the user's touch operations on or near it made with a finger, a stylus or any other suitable object, and displays the user interfaces of the application programs.
The middle frame 14 is the metal or non-metallic frame between the screen 13 and the rear cover of the phone, also referred to as the bezel. The middle frame 14 includes a sensing region in which a sensor is arranged. The sensor is a touch sensor and/or a pressure sensor; the two may be provided together or only one of them may be provided. The sensor senses the movement of the user's finger in the sensing region and is connected to the processor 12 of the mobile terminal 10, so that from the sensed information the processor 12 can determine whether the finger is in contact with the sensing region, as well as the dwell time, movement speed, movement distance and pressing pressure of the finger in the sensing region, and thereby judge the user's operating gesture on the sensing region; these operating gestures include tapping, double-tapping, pressing and holding, pressing hard, and so on. The middle frame 14 has four lateral faces facing different directions, and the sensing region is arranged on one or more of these lateral faces, so that the user can use the middle-frame gesture operation function while holding the phone in various postures.
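Purely as a hypothetical sketch, input from sensing regions on one or more lateral faces could be routed into a single gesture pipeline as follows (the face names and the routing class are assumptions, not taken from the patent):

```kotlin
// Illustrative only: which faces carry a sensing region is configured per device.
enum class FrameFace { LEFT, RIGHT, TOP, BOTTOM }

class MultiFaceFrameInput(
    private val activeFaces: Set<FrameFace> = setOf(FrameFace.LEFT, FrameFace.RIGHT),
    private val onSlide: (face: FrameFace, deltaPx: Float) -> Unit
) {
    // Raw slide events from faces without a sensing region (or disabled faces) are ignored,
    // so the same downstream pipeline works regardless of how the phone is held.
    fun onRawSlide(face: FrameFace, deltaPx: Float) {
        if (face in activeFaces) onSlide(face, deltaPx)
    }
}
```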
It should be noted that the above embodiments are preferred implementations of the present invention and do not limit the present invention in any form. Any person skilled in the art may use the technical content disclosed above to make changes or modifications resulting in equivalent effective embodiments; any modification, equivalent variation or adaptation of the above embodiments made according to the technical essence of the present invention without departing from the content of the technical solution of the present invention still falls within the scope of the technical solution of the present invention.

Claims (10)

1. A screen control method based on a mobile terminal, characterized by comprising the following steps:
judging whether a software interface with vertically extending content is displayed on the screen of the mobile terminal;
when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether the user's operating gesture matches a preset operating gesture;
when the user's operating gesture matches the preset operating gesture, enabling a middle-frame gesture operation function;
when the user's finger is detected sliding up and down on the sensing region of the middle frame, scrolling the vertically extending content of the software interface up and down in synchrony with the sliding of the finger.
2. The screen control method according to claim 1, characterized in that recognizing the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture comprises:
a sensor arranged at the sensing region of the middle frame recognizing the user's operating gesture on the middle frame, and the processor of the mobile terminal comparing the recognized operating gesture with a preset middle-frame gesture and judging whether the user's operating gesture on the middle frame matches the preset middle-frame gesture;
the middle-frame gesture being double-tapping the sensing region of the middle frame and holding the second tap.
3. The screen control method according to claim 1, characterized in that recognizing the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture comprises:
a sensor arranged at the screen recognizing the user's operating gesture on the screen, and the processor of the mobile terminal comparing the recognized operating gesture with a preset screen gesture and judging whether the user's operating gesture on the screen matches the preset screen gesture;
the screen gesture comprising pressing the screen for 1 second and sliding upward, or pressing the screen, sliding upward first and then downward.
4. The screen control method according to claim 1, characterized by further comprising, after enabling the middle-frame gesture operation function, the following steps:
judging whether the time for which the finger has left the sensing region exceeds a preset time;
when the preset time is exceeded, disabling the middle-frame gesture operation function, so that when the user's finger slides up and down on the sensing region, the vertically extending content of the software interface does not follow the sliding of the finger.
5. The screen control method according to claim 1, characterized by further comprising, after enabling the middle-frame gesture operation function, the following steps:
displaying a marker at a fixed position on the screen of the mobile terminal, the marker not changing position as the vertically extending content slides;
judging whether the sensing region recognizes a double-tap operating gesture;
when a double-tap operating gesture is recognized, opening the content of the software interface at the position on the screen corresponding to the marker.
6. A mobile terminal comprising a processor, a memory, a screen and a middle frame, characterized in that a sensing region is arranged on the middle frame and a computer program is stored in the memory, the computer program, when executed by the processor, implementing the following steps:
judging whether a software interface with vertically extending content is displayed on the screen of the mobile terminal;
when a software interface with vertically extending content is displayed, recognizing the user's operating gesture and judging whether the user's operating gesture matches a preset operating gesture;
when the user's operating gesture matches the preset operating gesture, enabling a middle-frame gesture operation function;
when the user's finger is detected sliding up and down on the sensing region of the middle frame, scrolling the vertically extending content of the software interface up and down in synchrony with the sliding of the finger.
7. The mobile terminal according to claim 6, characterized in that sensors for recognizing the user's operating gestures are respectively arranged at the screen and at the sensing region, the sensors being touch sensors and/or pressure sensors.
8. The mobile terminal according to claim 6, characterized in that the middle frame has four lateral faces facing different directions, and the sensing region is arranged on one or more of the lateral faces.
9. The mobile terminal according to claim 6, characterized in that obtaining the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture comprises:
a sensor arranged at the sensing region of the middle frame recognizing the user's operating gesture on the middle frame, and the processor of the mobile terminal comparing the recognized operating gesture with a preset middle-frame gesture and judging whether the user's operating gesture on the middle frame matches the preset middle-frame gesture;
the middle-frame gesture being double-tapping the sensing region of the middle frame and holding the second tap.
10. The mobile terminal according to claim 6, characterized in that obtaining the user's operating gesture and judging whether the user's operating gesture matches the preset operating gesture comprises:
a sensor arranged at the screen recognizing the user's operating gesture on the screen, and the processor of the mobile terminal comparing the recognized operating gesture with a preset screen gesture and judging whether the user's operating gesture on the screen matches the preset screen gesture;
the screen gesture comprising pressing the screen for 1 second and sliding upward, or pressing the screen, sliding upward first and then downward.
CN201811209557.0A 2018-10-17 2018-10-17 A kind of screen control method and a kind of mobile terminal based on mobile terminal Pending CN109491584A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811209557.0A CN109491584A (en) 2018-10-17 2018-10-17 A kind of screen control method and a kind of mobile terminal based on mobile terminal
PCT/CN2018/124023 WO2020077852A1 (en) 2018-10-17 2018-12-26 Mobile terminal-based screen control method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811209557.0A CN109491584A (en) 2018-10-17 2018-10-17 A kind of screen control method and a kind of mobile terminal based on mobile terminal

Publications (1)

Publication Number Publication Date
CN109491584A true CN109491584A (en) 2019-03-19

Family

ID=65691484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811209557.0A Pending CN109491584A (en) 2018-10-17 2018-10-17 A kind of screen control method and a kind of mobile terminal based on mobile terminal

Country Status (2)

Country Link
CN (1) CN109491584A (en)
WO (1) WO2020077852A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105589639A (en) * 2014-11-14 2016-05-18 中兴通讯股份有限公司 Control method and device for realizing page scrolling, and terminal
CN106020692A (en) * 2016-05-20 2016-10-12 广东小天才科技有限公司 Operation method and system of touch screen mobile phone
CN106484288A (en) * 2016-09-28 2017-03-08 依偎科技(南昌)有限公司 A kind of terminal control method and mobile terminal
CN108156320A (en) * 2017-12-27 2018-06-12 奇酷互联网络科技(深圳)有限公司 Icon auto arranging method, icon automatic arranging device and terminal device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103809875A (en) * 2012-11-14 2014-05-21 韩鼎楠 Human-computer interaction method and human-computer interaction interface
CN103118166B (en) * 2012-11-27 2014-11-12 广东欧珀移动通信有限公司 Method of realizing single hand operation of mobile phone based on pressure sensing
CN103019565B (en) * 2012-12-17 2016-07-20 广东欧珀移动通信有限公司 A kind of mobile terminal by the side frame slip page and method
CN104182164A (en) * 2013-05-27 2014-12-03 赛龙通信技术(深圳)有限公司 Electronic device and method for operating interface through side frame
CN106843739B (en) * 2017-02-28 2018-11-30 维沃移动通信有限公司 A kind of display control method and mobile terminal of mobile terminal

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112394811A (en) * 2019-08-19 2021-02-23 华为技术有限公司 Interaction method for air-separating gesture and electronic equipment
CN112394811B (en) * 2019-08-19 2023-12-08 华为技术有限公司 Interaction method of air-separation gestures and electronic equipment
US12001612B2 (en) 2019-08-19 2024-06-04 Huawei Technologies Co., Ltd. Air gesture-based interaction method and electronic device
CN112445409A (en) * 2019-09-03 2021-03-05 中移(苏州)软件技术有限公司 Waterfall flow information display method and device, terminal and storage medium
CN112445409B (en) * 2019-09-03 2022-03-29 中移(苏州)软件技术有限公司 Waterfall flow information display method and device, terminal and storage medium

Also Published As

Publication number Publication date
WO2020077852A1 (en) 2020-04-23

Similar Documents

Publication Publication Date Title
US10768804B2 (en) Gesture language for a device with multiple touch surfaces
US10261630B2 (en) Input device, input support method, and program
CN102981768B (en) A kind of method and system realizing floated overall situation button at touch screen terminal interface
CN102789360B (en) A kind of intelligent terminal text input display methods and device
CN103218138B (en) Touch control terminal and application programe switch-over method thereof
JP6319298B2 (en) Information terminal, display control method and program thereof
CN102629164B (en) A kind of multi-point touch equipment and method for information display and apply processing unit
JP5846129B2 (en) Information processing terminal and control method thereof
WO2018133285A1 (en) Display method and terminal
CN103345312A (en) System and method with intelligent terminal as host, mouse and touch panel at the same time
CN103197885A (en) Method for controlling mobile terminal and mobile terminal thereof
CN101931687A (en) Method for realizing split-screen gesture operation at mobile communication terminal
CN109491584A (en) A kind of screen control method and a kind of mobile terminal based on mobile terminal
CN107870705A (en) A kind of change method and device of the picture mark position of application menu
CN105630369B (en) Method for realizing one-hand operation of mobile terminal and mobile terminal
CN104636017A (en) Wireless intelligent terminal control equipment and control method thereof
CN106293051B (en) Gesture-based interaction method and device and user equipment
CN104714676A (en) Wearable electronic device
CN111078032B (en) Finger ring type mouse capable of prejudging two-finger merging trend and prejudging method
TWM416137U (en) Wireless touch screen multi-media interactive device
US20150153925A1 (en) Method for operating gestures and method for calling cursor
KR101230982B1 (en) Chopsticks type touch-pen
CN109885170A (en) Screenshotss method, wearable device and computer readable storage medium
CN104866228B (en) System and method for holding portable intelligent device for operation
CN105183353B (en) Multi-touch input method for touch equipment

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190319)