CN105278853A - Mobile terminal manipulation method and mobile terminal

Info

Publication number: CN105278853A
Authority: CN (China)
Prior art keywords: region, manipulation, mobile terminal, slide, map tag
Legal status: Pending (an assumption, not a legal conclusion)
Application number: CN201410779283.4A
Other languages: Chinese (zh)
Inventor: 杨章
Assignee (original and current): Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd; priority to CN201410779283.4A.

Landscapes: User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention provides a mobile terminal manipulation method. The method comprises the following steps: determining the position and area of a manipulation region according to the user's operation on the screen of a mobile terminal; displaying a mapping mark; obtaining manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data; moving the mapping mark by the corresponding displacement to a target position according to the sliding-displacement data; and operating the target at the target position according to the target-operation data. The embodiment of the invention also provides a mobile terminal. The method and the mobile terminal allow the user to operate the whole screen conveniently while holding the device in one hand, giving a good user experience.

Description

Mobile terminal control method and mobile terminal
Technical field
The present invention relates to the field of electronic devices, and in particular to a mobile terminal control method and a mobile terminal.
Background art
For large-screen mobile devices, existing approaches to the inconvenience of one-handed operation include: 1. shrinking the input-method keyboard, the dial pad, and similar controls, to ease the conflict between one-handed operation and the grip a large screen requires; and 2. shrinking the screen content proportionally, so that operations on the reduced screen are applied to the full screen. The prior art above still has the following technical problem: one-handed operation remains inconvenient for the user, and a good, convenient one-handed user experience cannot be achieved.
Summary of the invention
To overcome the prior-art problem that one-handed operation is inconvenient and cannot deliver a good one-handed user experience, an embodiment of the present invention provides, in one aspect, a mobile terminal control method, comprising:
determining the position and area of a manipulation region according to the user's operation on the screen of the mobile terminal;
displaying a mapping mark;
obtaining manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data;
moving the mapping mark by the corresponding displacement to a target position according to the sliding-displacement data; and
operating the target at the target position according to the target-operation data.
In another aspect, an embodiment of the invention provides a mobile terminal, comprising:
a manipulation-region determining module, configured to determine the position and area of a manipulation region according to the user's operation on the screen of the mobile terminal;
a mapping-mark display module, configured to display a mapping mark;
an acquisition module, configured to obtain manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data;
a moving module, configured to move the mapping mark by the corresponding displacement to a target position according to the sliding-displacement data; and
an operation module, configured to operate the target at the target position according to the target-operation data.
In the embodiment of the present invention, the mapping mark is moved to the target position according to the user's manipulation data in the manipulation region, and the target at the target position is operated. The user can thereby control a target object at any position on the full screen while holding and operating the terminal with one hand, without gripping it with both hands and without changing grip. Manipulation is quick and convenient, giving the user a good, convenient one-handed experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. Apparently, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the first embodiment of the mobile terminal control method of the present invention;
Fig. 2 is a schematic flowchart of the second embodiment of the mobile terminal control method of the present invention;
Fig. 3 is a schematic flowchart of the third embodiment of the mobile terminal control method of the present invention;
Fig. 4 is a schematic structural diagram of the first embodiment of the mobile terminal of the present invention;
Fig. 5 is a schematic structural diagram of an embodiment of the operation module of the present invention;
Fig. 6 is a schematic structural diagram of the second embodiment of the mobile terminal of the present invention;
Fig. 7 is a schematic structural diagram of the third embodiment of the mobile terminal of the present invention.
Detailed description of the embodiments
To make the technical problem solved by the present invention, the technical solutions, and the beneficial effects clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.
Fig. 1 is a schematic flowchart of the first embodiment of the mobile terminal control method of the present invention. The control method comprises:
Step S101: determine the position and area of a manipulation region according to the user's operation on the screen of the mobile terminal.
In this step, the screen of the mobile terminal comprises a touch screen.
In this step, the user's operation on the screen of the mobile terminal may be an operation of circling out a closed region on the screen, or a touch operation at an arbitrary position on the screen.
In this step, the position of the manipulation region may be determined from the position of the closed region circled by the user on the screen, or from the position of the touch point when the user performs a touch operation on the screen.
In this embodiment, the position of the manipulation region may also be determined from the position of the gripping palm during one-handed operation. For example, if the user grips the phone with the right hand, the right thumb can reach the right half of the screen, and the position of that region is the position of the manipulation region; if the user grips the phone with the left hand, the left thumb can reach the left half of the screen, and the position of that region is the position of the manipulation region. The user may of course hold the phone with both hands, in which case the position of the region reachable by the left or right index finger, or by another finger, is identified as the position of the manipulation region.
In this embodiment, the position of the manipulation region may also be a position, convenient for the manipulating finger to touch, that is allocated automatically after the position of the palm gripping the terminal during one-handed operation is identified. The position of the gripping palm may be obtained by detection units including, but not limited to, pressure sensors.
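The grip-based allocation described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the function name, the "left"/"right" grip encoding, and the half-screen split are all assumptions, and real grip detection would come from hardware such as the pressure sensors mentioned.

```python
def allocate_manipulation_region(screen_w, screen_h, grip_side):
    """Return (x, y, width, height) of a region reachable by the gripping thumb.

    grip_side is 'left' or 'right', as might be reported by pressure sensors
    along the bezel (the detection itself is hardware-dependent).
    """
    half = screen_w // 2
    if grip_side == "right":
        # the right thumb reaches the right half of the screen
        return (half, 0, screen_w - half, screen_h)
    # the left thumb reaches the left half of the screen
    return (0, 0, half, screen_h)
```

For a hypothetical 1080x1920 screen gripped in the right hand, this yields the right half as the manipulation region.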
Step S102: display a mapping mark.
In this step, the mapping mark may be displayed at a preset position on the screen; or the position of the user's touch point on the screen is determined and the mapping mark is displayed at that touch point; or the position of the user's touch point within the manipulation region is determined and the mapping mark is displayed at the corresponding position on the screen; or the display position of the mapping mark is determined from the position of the manipulation region, and the mapping mark is then displayed there.
In this step, the mapping mark is associated with the manipulation region in advance.
In this step, the mapping mark may be a circular icon, a mouse-pointer-like icon, or an icon of another custom shape.
Step S103: obtain the manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data.
In this step, the sliding-displacement data may be the change in displacement coordinates as the user performs a sliding operation in the manipulation region.
In this step, the target-operation data may be a single-tap instruction, a double-tap instruction, or a touch-slide gesture instruction directed at the screen.
Step S104: move the mapping mark by the corresponding displacement to the target position according to the sliding-displacement data.
In this step, whether the mapping mark has reached the target position can be judged visually by the user.
Step S105: operate the target at the target position according to the target-operation data.
This step comprises: identifying the category of the target-operation data; and performing, on the target at the target position, the operation corresponding to that category. The category of the target-operation data may be a single-tap instruction, a double-tap instruction, or a touch-slide gesture instruction. The target at the target position may be an application icon, an interactive interface, and so on. The operation corresponding to the category may be, for a single-tap instruction, entering the program behind an application icon or closing an interactive interface; or, for a touch-slide gesture instruction, scrolling the interactive interface up or down or turning its pages.
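The category dispatch in step S105 might look like the following sketch. The category strings and the returned action descriptions are illustrative assumptions; the patent only lists single-tap, double-tap, and touch-slide instructions as example categories.

```python
def operate_target(category, target):
    """Perform the operation matching the identified target-operation category."""
    if category == "single_tap":
        return f"open {target}"    # e.g. enter the program behind an app icon
    if category == "double_tap":
        return f"close {target}"   # e.g. dismiss an interactive interface
    if category == "slide":
        return f"scroll {target}"  # e.g. scroll or page-turn the interface
    raise ValueError(f"unknown category: {category}")
```

A real terminal would invoke platform gesture handlers here; the strings merely stand in for those actions.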
In the embodiment of the present invention, the mapping mark is moved to the target position according to the user's manipulation data in the manipulation region, and the target at the target position is operated. The user can thereby control a target object at any position on the full screen while holding and operating the terminal with one hand, without gripping it with both hands and without changing grip. Manipulation is quick and convenient, giving the user a good, convenient one-handed experience.
Fig. 2 is a schematic flowchart of the second embodiment of the mobile terminal control method of the present invention. The control method comprises:
Step S201: detect the user's operation of circling out a closed region on the screen, and form the manipulation region from the circled closed region.
In this step, the screen of the mobile terminal comprises a touch screen.
In this step, the closed region may be circled by the user directly inputting a closed track on the screen with a finger or a stylus.
Step S202: determine the position of the manipulation region from the position of the circled closed region.
In this step, the position of the manipulation region is taken to be the position of the circled closed region.
In this embodiment, the position of the manipulation region may also be determined from the position of the gripping palm during one-handed operation. For example, if the user grips the phone with the right hand, the right thumb can reach the right half of the screen, and the position of that region is the position of the manipulation region; if the user grips the phone with the left hand, the left thumb can reach the left half of the screen, and the position of that region is the position of the manipulation region. The user may of course hold the phone with both hands, in which case the position of the region reachable by the left or right index finger, or by another finger, is identified as the position of the manipulation region.
In this embodiment, the position of the manipulation region may also be a position, convenient for the manipulating finger to touch, that is allocated automatically after the position of the palm gripping the terminal during one-handed operation is identified. The position of the gripping palm may be obtained by detection units including, but not limited to, pressure sensors.
Step S203: determine the area of the manipulation region from the extent of the circled closed region.
In this step, the area of the manipulation region is taken to be the extent of the circled closed region.
Step S204: display a mapping mark.
In this step, the mapping mark may be displayed at a preset position on the screen; or the position of the user's touch point on the screen is determined and the mapping mark is displayed at that touch point; or the position of the user's touch point within the manipulation region is determined and the mapping mark is displayed at the corresponding position on the screen; or the display position of the mapping mark is determined from the position of the manipulation region, and the mapping mark is then displayed there.
In this step, the mapping mark is associated with the manipulation region in advance.
In this step, the mapping mark may be a circular icon, a mouse-pointer-like icon, or an icon of another custom shape.
Step S205: obtain the manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data.
In this step, the sliding-displacement data may be the change in displacement coordinates as the user performs a sliding operation in the manipulation region.
In this step, the target-operation data may be a single-tap instruction, a double-tap instruction, or a touch-slide gesture instruction directed at the screen.
Step S206: detect a sliding operation in the manipulation region.
In this step, the sliding operation may be a touch-slide performed by the user directly with a finger in the manipulation region on the screen.
Step S207: obtain the start position and the end position of the sliding operation, and obtain the displacement coordinate change of the sliding operation from the coordinates of the start position and of the end position. In this embodiment, the start position coordinates are denoted (A0x, A0y), the end position coordinates are denoted (A0x+a0x, A0y+a0y), and the displacement coordinate change is denoted ΔS = (a0x, a0y).
Step S208: obtain the current position coordinates of the mapping mark, denoted (Bx, By) in this embodiment.
Step S209: superpose the displacement coordinate change of the sliding operation on the current position coordinates of the mapping mark to obtain the object coordinates of the mapping mark, denoted (Bx+a0x, By+a0y) in this embodiment.
Step S210: move the mapping mark to the position corresponding to the object coordinates, that is, to the position with coordinates (Bx+a0x, By+a0y). If target-operation data are received, perform step S211; if further sliding-displacement data are received, return to step S206.
In this step, whether the mapping mark has reached the target position can be judged visually by the user.
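Steps S207 through S210 reduce to vector addition: the slide's coordinate change ΔS = (a0x, a0y) is added to the mark's current position (Bx, By). A minimal sketch, with function names that are our assumptions rather than the patent's:

```python
def slide_displacement(start, end):
    """Change ΔS of a slide from (A0x, A0y) to (A0x+a0x, A0y+a0y)."""
    return (end[0] - start[0], end[1] - start[1])

def superpose(mark_pos, delta):
    """Object coordinates of the mapping mark: (Bx+a0x, By+a0y)."""
    return (mark_pos[0] + delta[0], mark_pos[1] + delta[1])
```

For example, a slide from (100, 100) to (130, 60) gives ΔS = (30, -40), moving a mark at (500, 800) to (530, 760).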
Step S211: operate the target at the target position according to the target-operation data.
This step comprises: identifying the category of the target-operation data; and performing, on the target at the target position, the operation corresponding to that category. The category of the target-operation data may be a single-tap instruction, a double-tap instruction, or a touch-slide gesture instruction. The target at the target position may be an application icon, an interactive interface, and so on. The operation corresponding to the category may be, for a single-tap instruction, entering the program behind an application icon or closing an interactive interface; or, for a touch-slide gesture instruction, scrolling the interactive interface up or down or turning its pages.
In the embodiment of the present invention, the mapping mark is moved to the target position according to the user's manipulation data in the manipulation region, and the target at the target position is operated. The user can thereby control a target object at any position on the full screen while holding and operating the terminal with one hand, without gripping it with both hands and without changing grip. Manipulation is quick and convenient, giving the user a good, convenient one-handed experience.
Fig. 3 is a schematic flowchart of the third embodiment of the mobile terminal control method of the present invention. The control method comprises:
Step S301: detect the user's operation of circling out a closed region on the screen, and form the manipulation region from the circumscribed rectangle of the circled closed region.
In this step, the screen of the mobile terminal comprises a touch screen.
In this step, the closed region may be circled by the user directly inputting a closed track on the screen with a finger or a stylus.
Step S302: determine the position of the manipulation region from the position of the circled closed region.
In this step, the position of the manipulation region is taken to be the position of the circled closed region.
In this embodiment, the position of the manipulation region may also be determined from the position of the gripping palm during one-handed operation. For example, if the user grips the phone with the right hand, the right thumb can reach the right half of the screen, and the position of that region is the position of the manipulation region; if the user grips the phone with the left hand, the left thumb can reach the left half of the screen, and the position of that region is the position of the manipulation region. The user may of course hold the phone with both hands, in which case the position of the region reachable by the left or right index finger, or by another finger, is identified as the position of the manipulation region.
In this embodiment, the position of the manipulation region may also be a position, convenient for the manipulating finger to touch, that is allocated automatically after the position of the palm gripping the terminal during one-handed operation is identified. The position of the gripping palm may be obtained by detection units including, but not limited to, pressure sensors.
Step S303: determine the area of the manipulation region from the extent of the circumscribed rectangle of the circled closed region.
In this step, the area of the manipulation region is taken to be the extent of the circumscribed rectangle of the circled closed region.
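Forming the manipulation region from the circumscribed rectangle, as in steps S301 and S303, can be sketched as follows, assuming (this is our assumption about the input format, not stated in the patent) that the circled track arrives as a list of (x, y) touch points:

```python
def bounding_rectangle(points):
    """Circumscribed rectangle of a closed track, as (x, y, width, height)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```

The rectangle's position then serves as the manipulation region's position and its extent as the region's area.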
Step S304: map the manipulation region onto the screen in equal proportion.
In this step, a proportional mapping relationship is established between the manipulation region and the screen. In this embodiment, a 1:2 proportional mapping between the manipulation region and the screen is assumed.
Step S305: display a mapping mark.
In this step, the mapping mark may be displayed at a preset position on the screen; or the position of the user's touch point on the screen is determined and the mapping mark is displayed at that touch point; or the position of the user's touch point within the manipulation region is determined and the mapping mark is displayed at the corresponding position on the screen; or the display position of the mapping mark is determined from the position of the manipulation region, and the mapping mark is then displayed there.
In this step, the mapping mark may be a circular icon, a mouse-pointer-like icon, or an icon of another custom shape.
Step S306: obtain the manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data.
In this step, the target-operation data may be a single-tap instruction, a double-tap instruction, or a touch-slide gesture instruction directed at the screen.
Step S307: display a manipulation mark in the manipulation region.
In this step, the manipulation mark may be a circular icon, a mouse-pointer-like icon, or an icon of another custom shape.
In this step, the mapping mark is associated with the manipulation mark in advance.
Step S308: obtain the coordinates of the current position to which the manipulation mark has moved under a sliding operation.
In this step, the sliding operation may be the user directly touching the manipulation mark with a finger and dragging or sliding it within the manipulation region.
In this step, the coordinates of the current position are the position coordinates of the manipulation mark relative to the screen.
Step S309: obtain the relative position, within the manipulation region, of the manipulation mark's current coordinates.
In this step, the relative position within the manipulation region refers to the position coordinates of the manipulation mark relative to the manipulation region, which can be derived from its position coordinates relative to the screen. In this embodiment, the position coordinates of the manipulation mark relative to the manipulation region are denoted (A0x, A0y).
Step S310: obtain the target position of the mapping mark from the relative position. In this embodiment, because a 1:2 proportional mapping is established between the manipulation region and the screen, the target position of the mapping mark is (2A0x, 2A0y).
Step S311: move the mapping mark to the target position, that is, to the position with coordinates (2A0x, 2A0y). If target-operation data are received, perform step S312; if further sliding-displacement data are received, return to step S308.
In this step, whether the mapping mark has reached the target position can be judged visually by the user.
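Steps S309 and S310 can be sketched as converting the manipulation mark's screen coordinates to region-relative coordinates (A0x, A0y) and scaling by the mapping ratio; under the assumed 1:2 mapping the target is (2A0x, 2A0y). The function names and the region-origin representation are our assumptions:

```python
def region_relative(mark_screen_pos, region_origin):
    """(A0x, A0y): manipulation-mark coordinates relative to the region origin."""
    return (mark_screen_pos[0] - region_origin[0],
            mark_screen_pos[1] - region_origin[1])

def mapping_mark_target(rel_pos, scale=2):
    """Target position of the mapping mark under a 1:scale proportional mapping."""
    return (scale * rel_pos[0], scale * rel_pos[1])
```

For example, a manipulation mark at screen position (600, 1500) in a region whose origin is (540, 1200) has relative position (60, 300), so the mapping mark's target is (120, 600).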
Step S312: operate the target at the target position according to the target-operation data.
This step comprises: identifying the category of the target-operation data; and performing, on the target at the target position, the operation corresponding to that category. The category of the target-operation data may be a single-tap instruction, a double-tap instruction, or a touch-slide gesture instruction. The target at the target position may be an application icon, an interactive interface, and so on. The operation corresponding to the category may be, for a single-tap instruction, entering the program behind an application icon or closing an interactive interface; or, for a touch-slide gesture instruction, scrolling the interactive interface up or down or turning its pages.
In the embodiment of the present invention, the mapping mark is moved to the target position according to the user's manipulation data in the manipulation region, and the target at the target position is operated. The user can thereby control a target object at any position on the full screen while holding and operating the terminal with one hand, without gripping it with both hands and without changing grip. Manipulation is quick and convenient, giving the user a good, convenient one-handed experience.
The embodiments of the mobile terminal control method of the present invention have been described in detail above. The device corresponding to the method, that is, the mobile terminal, is further described below. The mobile terminal may be a mobile phone, a tablet computer, an MP3 or MP4 player, a notebook computer, or the like.
Fig. 4 is a schematic structural diagram of the first embodiment of the mobile terminal of the present invention. The mobile terminal 100 may comprise a manipulation-region determining module 110, a mapping-mark display module 120, an acquisition module 130, a moving module 140, and an operation module 150.
The manipulation-region determining module 110 is configured to determine the position and area of a manipulation region according to the user's operation on the screen of the mobile terminal.
The mapping-mark display module 120 is configured to display a mapping mark.
The acquisition module 130 is configured to obtain manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data.
The moving module 140 is configured to move the mapping mark by the corresponding displacement to a target position according to the sliding-displacement data.
The operation module 150 is configured to operate the target at the target position according to the target-operation data.
Further, referring to Fig. 5, a schematic structural diagram of an embodiment of the operation module of the present invention, the operation module 150 comprises:
a category identifying unit 151, configured to identify the category of the target-operation data; and
an operation executing unit 152, configured to perform, on the target at the target position, the operation corresponding to the category of the target-operation data.
In the embodiment of the present invention, the mapping mark is moved to the target position according to the user's manipulation data in the manipulation region, and the target at the target position is operated. The user can thereby control a target object at any position on the full screen while holding and operating the terminal with one hand, without gripping it with both hands and without changing grip. Manipulation is quick and convenient, giving the user a good, convenient one-handed experience.
Fig. 6 is a schematic structural diagram of the second embodiment of the mobile terminal of the present invention. The mobile terminal 200 may comprise a manipulation-region determining module 210, a mapping-mark display module 220, an acquisition module 230, a moving module 240, and an operation module 250.
The manipulation-region determining module 210 is configured to determine the position and area of a manipulation region according to the user's operation on the screen of the mobile terminal. The manipulation-region determining module 210 may comprise:
a manipulation-region forming unit 211, configured to detect the user's operation of circling out a closed region on the screen and form the manipulation region from the circled closed region;
a position determining unit 212, configured to determine the position of the manipulation region from the position of the circled closed region; and
an area determining unit 213, configured to determine the area of the manipulation region from the extent of the circled closed region.
The mapping-mark display module 220 is configured to display a mapping mark.
The acquisition module 230 is configured to obtain manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data.
The moving module 240 is configured to move the mapping mark by the corresponding displacement to a target position according to the sliding-displacement data. The moving module 240 may comprise:
a detecting unit 241, configured to detect a sliding operation in the manipulation region;
a displacement-change obtaining unit 242, configured to obtain the start position and end position of the sliding operation and, from their coordinates, the displacement coordinate change of the sliding operation;
a current-position obtaining unit 243, configured to obtain the current position coordinates of the mapping mark;
a superposing unit 244, configured to superpose the displacement coordinate change of the sliding operation on the current position coordinates of the mapping mark to obtain the object coordinates of the mapping mark; and
a moving unit 245, configured to move the mapping mark to the position corresponding to the object coordinates.
The operation module 250 is configured to operate the target at the target position according to the target-operation data.
Further, the operation module 250 may comprise:
a category identifying unit, configured to identify the category of the target-operation data; and
an operation executing unit, configured to perform, on the target at the target position, the operation corresponding to the category of the target-operation data.
In the embodiment of the present invention, the mapping mark is moved to the target position according to the user's manipulation data in the manipulation region, and the target at the target position is operated. The user can thereby control a target object at any position on the full screen while holding and operating the terminal with one hand, without gripping it with both hands and without changing grip. Manipulation is quick and convenient, giving the user a good, convenient one-handed experience.
Fig. 7 is a schematic structural diagram of the third embodiment of the mobile terminal of the present invention. The mobile terminal 300 may comprise a manipulation-region determining module 310, a mapping-mark display module 320, an acquisition module 330, a moving module 340, and an operation module 350.
The manipulation-region determining module 310 is configured to determine the position and area of a manipulation region according to the user's operation on the screen of the mobile terminal. The manipulation-region determining module 310 may comprise:
a manipulation-region forming unit 311, configured to detect the user's operation of circling out a closed region on the screen and form the manipulation region from the circumscribed rectangle of the circled closed region;
a position determining unit 312, configured to determine the position of the manipulation region from the position of the circled closed region;
an area determining unit 313, configured to determine the area of the manipulation region from the extent of the circumscribed rectangle of the circled closed region; and
a proportional mapping unit 314, configured to map the manipulation region onto the screen in equal proportion.
The mapping-mark display module 320 is configured to display a mapping mark.
The acquisition module 330 is configured to obtain manipulation data in the manipulation region, the manipulation data comprising sliding-displacement data and target-operation data.
The moving module 340 is configured to move the mapping mark by the corresponding displacement to a target position according to the sliding-displacement data. The moving module 340 may comprise:
a display unit 341, configured to display a manipulation mark in the manipulation region;
a position-coordinates obtaining unit 342, configured to obtain the coordinates of the current position to which the manipulation mark has moved under a sliding operation;
a relative-position obtaining unit 343, configured to obtain the relative position, within the manipulation region, of the manipulation mark's current coordinates;
a target-position obtaining unit 344, configured to obtain the target position of the mapping mark from the relative position; and
a moving unit 345, configured to move the mapping mark to the target position.
The operation module 350 is configured to operate the target at the target position according to the target-operation data.
Further, the operation module 350 may comprise:
a category identifying unit, configured to identify the category of the target-operation data; and
an operation executing unit, configured to perform, on the target at the target position, the operation corresponding to the category of the target-operation data.
The embodiment of the present invention is according to the manipulation data of user in described manipulation region, map tag is made to move to target location, and the target of Action Target position, user is by be hold by one hand and the mode operated realizes controlling the destination object of arbitrary target location in full frame, without the need to hands grasping, without the need to converting holding mode, manipulating convenient and swift, can bring to user and being hold by one hand good Consumer's Experience more convenient to operate.
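The equal-proportion mapping of this embodiment works like an absolute-positioning miniature of the screen: the manipulation region is the bounding rectangle of the delineated stroke, and the manipulation mark's relative position inside that rectangle is scaled up in equal proportion to an absolute position on the full screen. A minimal sketch of both steps in Python (function and variable names are illustrative assumptions):

```python
def bounding_rect(points):
    """Bounding rectangle (x, y, w, h) of a delineated closed stroke."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))

def map_to_screen(mark_pos, region, screen_size):
    """Map the manipulation mark's position in the region onto the screen.

    mark_pos: (x, y) of the manipulation mark, in screen pixels.
    region: (x, y, w, h) rectangle of the manipulation region.
    screen_size: (width, height) of the screen in pixels.
    Returns the map tag's target position in screen pixels.
    """
    rx, ry, rw, rh = region
    # Relative position of the mark inside the region, each in [0, 1].
    u = (mark_pos[0] - rx) / rw
    v = (mark_pos[1] - ry) / rh
    # Scale up in equal proportion to the whole screen.
    return (round(u * screen_size[0]), round(v * screen_size[1]))
```

For example, a stroke circled near the bottom-left corner yields a 300×300 region at (100, 1400); placing the mark at the region's centre (250, 1550) puts the map tag at the screen centre (540, 960) on a 1080×1920 display. Unlike the superposition scheme, every point of the region corresponds to a fixed point of the screen.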
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may comprise the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), or the like.
The above is one or more embodiments provided in conjunction with specific content, and it is not asserted that the specific implementation of the present invention is limited to these descriptions. Any method, structure or the like that is similar or identical to those of the present invention, or any technical deduction or substitution made under the concept of the present invention, should be considered as falling within the protection scope of the present invention.

Claims (12)

1. A manipulation method for a mobile terminal, characterized by comprising:
determining the position and the area of a manipulation region according to an operation of a user on a screen of the mobile terminal;
displaying a map tag;
acquiring manipulation data in the manipulation region, the manipulation data comprising sliding displacement data and target operation data;
controlling the map tag to move correspondingly to a target position according to the sliding displacement data;
operating on a target at the target position according to the target operation data.
2. The manipulation method for a mobile terminal according to claim 1, characterized in that the step of determining the position and the area of the manipulation region according to the operation of the user on the screen of the mobile terminal comprises:
detecting an operation of the user delineating a closed region on the screen, and forming the manipulation region from the delineated closed region;
determining the position of the manipulation region according to the position of the delineated closed region;
determining the area of the manipulation region according to the extent of the delineated closed region.
3. The manipulation method for a mobile terminal according to claim 1, characterized in that the step of determining the position and the area of the manipulation region according to the operation of the user on the screen of the mobile terminal comprises:
detecting an operation of the user delineating a closed region on the screen, and forming the manipulation region from the bounding rectangle of the delineated closed region;
determining the position of the manipulation region according to the position of the delineated closed region;
determining the area of the manipulation region according to the extent of the bounding rectangle of the delineated closed region;
mapping the manipulation region onto the screen in equal proportion.
4. The manipulation method for a mobile terminal according to claim 2, characterized in that the step of controlling the map tag to move correspondingly to the target position according to the sliding displacement data comprises:
detecting a sliding operation in the manipulation region;
acquiring the start position and the end position of the sliding operation, and obtaining the displacement coordinate variation of the sliding operation based on the coordinates of the start position and the end position of the sliding operation;
acquiring the current position coordinates of the map tag;
superposing the displacement coordinate variation of the sliding operation onto the current position coordinates of the map tag to obtain the target coordinates of the map tag;
moving the map tag to the position corresponding to the target coordinates.
5. The manipulation method for a mobile terminal according to claim 3, characterized in that the step of controlling the map tag to move correspondingly to the target position according to the sliding displacement data comprises:
displaying a manipulation mark in the manipulation region;
acquiring the coordinates of the current location to which the manipulation mark moves under the sliding operation;
acquiring the relative position, within the manipulation region, of the coordinates of the current location of the manipulation mark;
obtaining the target position of the map tag according to the relative position;
moving the map tag to the target position.
6. The manipulation method for a mobile terminal according to any one of claims 1 to 5, characterized in that the step of operating on the target at the target position according to the target operation data comprises:
identifying the classification of the target operation data;
performing, on the target at the target position, the operation corresponding to the classification of the target operation data.
7. A mobile terminal, characterized by comprising:
a manipulation region determination module, configured to determine the position and the area of a manipulation region according to an operation of a user on a screen of the mobile terminal;
a map tag display module, configured to display a map tag;
an acquisition module, configured to acquire manipulation data in the manipulation region, the manipulation data comprising sliding displacement data and target operation data;
a moving module, configured to control the map tag to move correspondingly to a target position according to the sliding displacement data;
an operation module, configured to operate on a target at the target position according to the target operation data.
8. The mobile terminal according to claim 7, characterized in that the manipulation region determination module comprises:
a manipulation region formation unit, configured to detect an operation of the user delineating a closed region on the screen, and to form the manipulation region from the delineated closed region;
a position determination unit, configured to determine the position of the manipulation region according to the position of the delineated closed region;
an area determination unit, configured to determine the area of the manipulation region according to the extent of the delineated closed region.
9. The mobile terminal according to claim 7, characterized in that the manipulation region determination module comprises:
a manipulation region formation unit, configured to detect an operation of the user delineating a closed region on the screen, and to form the manipulation region from the bounding rectangle of the delineated closed region;
a position determination unit, configured to determine the position of the manipulation region according to the position of the delineated closed region;
an area determination unit, configured to determine the area of the manipulation region according to the extent of the bounding rectangle of the delineated closed region;
an equal-proportion mapping unit, configured to map the manipulation region onto the screen in equal proportion.
10. The mobile terminal according to claim 8, characterized in that the moving module comprises:
a detection unit, configured to detect a sliding operation in the manipulation region;
a displacement coordinate variation acquisition unit, configured to acquire the start position and the end position of the sliding operation, and to obtain the displacement coordinate variation of the sliding operation based on the coordinates of the start position and the end position of the sliding operation;
a current position coordinate acquisition unit, configured to acquire the current position coordinates of the map tag;
a superposing unit, configured to superpose the displacement coordinate variation of the sliding operation onto the current position coordinates of the map tag to obtain the target coordinates of the map tag;
a moving unit, configured to move the map tag to the position corresponding to the target coordinates.
11. The mobile terminal according to claim 9, characterized in that the moving module comprises:
a display unit, configured to display a manipulation mark in the manipulation region;
a position coordinate acquisition unit, configured to acquire the coordinates of the current location to which the manipulation mark moves under the sliding operation;
a relative position acquisition unit, configured to acquire the relative position, within the manipulation region, of the coordinates of the current location of the manipulation mark;
a target position acquisition unit, configured to obtain the target position of the map tag according to the relative position;
a moving unit, configured to move the map tag to the target position.
12. The mobile terminal according to any one of claims 7 to 11, characterized in that the operation module comprises:
a classification identification unit, configured to identify the classification of the target operation data;
an operation execution unit, configured to perform, on the target at the target position, the operation corresponding to the classification of the target operation data.
CN201410779283.4A 2014-12-16 2014-12-16 Mobile terminal manipulation method and mobile terminal Pending CN105278853A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410779283.4A CN105278853A (en) 2014-12-16 2014-12-16 Mobile terminal manipulation method and mobile terminal


Publications (1)

Publication Number Publication Date
CN105278853A true CN105278853A (en) 2016-01-27

Family

ID=55147947

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410779283.4A Pending CN105278853A (en) 2014-12-16 2014-12-16 Mobile terminal manipulation method and mobile terminal

Country Status (1)

Country Link
CN (1) CN105278853A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108595241A (en) * 2018-04-24 2018-09-28 苏州蜗牛数字科技股份有限公司 A method of touch screen is virtualized
CN111343341A (en) * 2020-05-20 2020-06-26 北京小米移动软件有限公司 One-hand mode implementation method based on mobile equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102129312A (en) * 2010-01-13 2011-07-20 联想(新加坡)私人有限公司 Virtual touchpad for a touch device
CN102855066A (en) * 2012-09-26 2013-01-02 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN103593136A (en) * 2013-10-21 2014-02-19 广东欧珀移动通信有限公司 Touch terminal, and one-hand operation method and device of large-screen touch terminal
CN103809888A (en) * 2012-11-12 2014-05-21 北京三星通信技术研究有限公司 Mobile terminal and manipulation method thereof
US8826187B2 (en) * 2007-12-20 2014-09-02 Google Inc. Method and system for moving a cursor and selecting objects on a touchscreen using a finger pointer



Similar Documents

Publication Publication Date Title
US9304656B2 (en) Systems and method for object selection on presence sensitive devices
US8638303B2 (en) Stylus settings
CN105117056B (en) A kind of method and apparatus of operation touch-screen
US20090251441A1 (en) Multi-Modal Controller
EP2770443B1 (en) Method and apparatus for making contents through writing input on touch screen
US9778780B2 (en) Method for providing user interface using multi-point touch and apparatus for same
JP2011186550A (en) Coordinate input device, coordinate input method, and computer-executable program
JP5951886B2 (en) Electronic device and input method
US20120062520A1 (en) Stylus modes
KR101901735B1 (en) Method and system for providing user interface, and non-transitory computer-readable recording medium
CN105549813A (en) Mobile terminal control method and mobile terminal
CN109885222A (en) Icon processing method, device, electronic equipment and computer-readable medium
US20150355769A1 (en) Method for providing user interface using one-point touch and apparatus for same
CN107728910A (en) Electronic installation, display screen control system and method
JP2014006654A (en) Information display device, providing method of user interface and program
US20140359541A1 (en) Terminal and method for controlling multi-touch operation in the same
CN105278853A (en) Mobile terminal manipulation method and mobile terminal
CN103809912A (en) Tablet personal computer based on multi-touch screen
CN103809794A (en) Information processing method and electronic device
KR101422447B1 (en) Method and apparatus for changing page of e-book using pressure modeling
CN103677616A (en) Operating method of electronic device
KR20140083300A (en) Method for providing user interface using one point touch, and apparatus therefor
US20170075453A1 (en) Terminal and terminal control method
CN108008819A (en) A kind of page map method and terminal device easy to user's one-handed performance
CN105320424B (en) A kind of control method and mobile terminal of mobile terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160127