CN104423870A - Control in graphical user interface, display method as well as method and device for operating control - Google Patents


Info

Publication number
CN104423870A
CN104423870A (application CN201310409569.9A)
Authority
CN
China
Prior art keywords
control
display screen
user
target location
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310409569.9A
Other languages
Chinese (zh)
Inventor
赵子鹏
杨帆
曹炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Samsung Telecom R&D Center
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd, Samsung Electronics Co Ltd filed Critical Beijing Samsung Telecommunications Technology Research Co Ltd
Priority to CN201310409569.9A
Publication of CN104423870A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 — Eye tracking input arrangements
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention provides a control in a graphical user interface, a display method, and a method and device for operating the control. The control display method in the graphical user interface comprises the steps of: receiving movement information from a user operating the control; moving the control to a target position in the graphical user interface according to the movement information; and displaying the control at the target position in the graphical user interface. According to the embodiments of the invention, the movement information from the user operating the control is first received, the control is then moved to the target position according to the movement information, and the control is displayed at the target position in the graphical user interface. The control can thus be moved to a target position determined by the user's operation, giving the control the ability to move. This improves the convenience and flexibility with which the user operates every control on the display screen and simplifies the user's mode of operation.

Description

Control in a graphical user interface, display method, and method and apparatus for operating the control
Technical field
The present invention relates to the technical field of terminal devices, and in particular to a control in a graphical user interface, a display method, and a method and apparatus for operating the control.
Background technology
Terminal devices such as mobile phones and tablet computers have now spread into many aspects of people's lives. People use various mobile terminals to communicate, watch video, read, and so on; their functions are numerous and powerful. Because terminal devices are easy to carry, users can use them while riding or walking.
At present, the display screen of a terminal device usually supports contact touch control: when the user needs to trigger a control's function, the user taps the control on the display screen with a finger. The display screen may also support hover touch control, in which the user triggers a control by hovering a pointer over it without contact. Both contact touch and hover touch usually require the user to hold the terminal device in one hand and trigger the control on the display screen with the other hand. For example, when the user holds the terminal device in the left hand while carrying an article in the right hand, the user must put the article down to free the right hand to operate the control on the display screen, which is very inconvenient. Likewise, when the user stands on a bus, one hand must grip the rail to avoid falling while the other holds the terminal device, so the user cannot operate the controls on the terminal device at all.
In the prior art, a user can also hold the terminal device in one hand and operate the controls on its display screen with the thumb of that hand. However, the thumb can only reach part of the display screen, and most controls lie in positions the thumb cannot touch, so it is difficult for the user to hold the terminal device in one hand and operate every control on its display screen.
Summary of the invention
The invention provides a control in a graphical user interface, a display method, and a method and apparatus for operating the control, to solve the prior-art problem that a user cannot conveniently hold a terminal device in one hand and operate every control on its display screen.
To solve the above problem, the invention provides a control display method in a graphical user interface, comprising the following steps:
Receiving movement information from a user operating the control;
Moving the control to a target position in the graphical user interface according to the movement information;
Displaying the control at the target position in the graphical user interface.
The invention also provides a control in a graphical user interface, comprising:
A receiving module, a moving module and a display module, wherein:
The receiving module is configured to receive movement information for the control;
The moving module is configured to move the control to a target position in the graphical user interface according to the movement information;
The display module is configured to display the control at the target position in the graphical user interface.
The invention also provides a method of operating a control, comprising:
Detecting an operating article in the sensing region of a display screen;
After the operating article is detected, locating a control on the display screen according to the user's positioning operation;
After the user's confirmation operation in the sensing region of the display screen is detected, triggering the located control.
The invention also provides a device for operating a control, comprising a second detection module, a locating module and a trigger module, wherein:
The second detection module is configured to detect an operating article in the sensing region of a display screen;
The locating module is configured to, after the operating article is detected, locate a control on the display screen according to the user's positioning operation;
The trigger module is configured to, after the user's confirmation operation in the sensing region of the display screen is detected, trigger the located control.
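The detect → locate → trigger sequence carried out by the three modules above can be sketched as follows. This is a minimal illustration under stated assumptions: the patent does not give an API, so the method signature, the control list, and the use of an index to represent the located control are all hypothetical.

```java
import java.util.List;

// Illustrative sketch of the operating sequence: detection of an operating
// article, location of a control by the positioning operation, and triggering
// on a confirmation operation. All names here are assumptions for clarity.
public class ControlOperator {
    // Returns the name of the triggered control, or null if any of the
    // three steps (detect, locate, confirm) fails.
    public static String operate(List<String> controls,
                                 boolean articleDetected,
                                 int locatedIndex,
                                 boolean confirmed) {
        if (!articleDetected) {
            return null;                               // step 1: no operating article detected
        }
        if (locatedIndex < 0 || locatedIndex >= controls.size()) {
            return null;                               // step 2: positioning operation located nothing
        }
        String located = controls.get(locatedIndex);   // step 2: control located
        return confirmed ? located : null;             // step 3: trigger only on confirmation
    }
}
```

A control is only triggered when all three steps succeed in order, which mirrors the division of labor between the second detection module, the locating module and the trigger module.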
Beneficial effects of the embodiments provided by the invention:
In the embodiments provided by the invention, movement information from the user operating the control is first received; the control is then moved to the target position according to the movement information and displayed at that position in the graphical user interface. The control can thus be moved to a target position determined by the user's operation, giving the control the ability to move. This improves the convenience and flexibility with which the user operates every control on the display screen and simplifies the user's mode of operation.
Brief description of the drawings
The above and/or additional aspects and advantages of the invention will become apparent and readily understood from the following description of the embodiments with reference to the accompanying drawings, in which:
Fig. 1 is a flowchart of a first embodiment of the control display method in a graphical user interface of the invention;
Fig. 2 is a schematic structural diagram of a control in the idle state in this embodiment;
Fig. 3 is a schematic structural diagram of a control in the activated state in this embodiment;
Fig. 4 is a schematic structural diagram of a control in the moving state in this embodiment;
Fig. 5 is a schematic structural diagram of a control in the ready (in-place) state in this embodiment;
Fig. 6 is a schematic structural diagram of a first embodiment of the control in a graphical user interface of the invention;
Fig. 7 is a schematic structural diagram of a second embodiment of the control in a graphical user interface of the invention;
Fig. 8 is a schematic diagram of the tracking-control class hierarchy in this embodiment;
Fig. 9 is a schematic diagram of the tracking-control class hierarchy in the Android system in this embodiment;
Fig. 10 is a schematic diagram of the logical structure of the tracking-control root class in this embodiment;
Fig. 11 is a schematic diagram of the logical structure of the tracking-control item-group class in this embodiment;
Fig. 12 is a flowchart of a control tracking a user operation in this embodiment;
Fig. 13 is a flowchart of a first embodiment of the method of operating a control of the invention;
Fig. 14 is a flowchart of a second embodiment of the method of operating a control of the invention;
Fig. 15 is a schematic diagram of locating a control according to the moving direction of a motion track in this embodiment;
Fig. 16 is a schematic diagram of a display effect moving from the control's initial position to the target position in this embodiment;
Fig. 17 is a flowchart of a third embodiment of the method of operating a control of the invention;
Fig. 18 is a schematic diagram of locating a control according to the position of the eyeball's focus in this embodiment;
Fig. 19 is a schematic structural diagram of a first embodiment of the device for operating a control of the invention;
Fig. 20 is a schematic structural diagram of a second embodiment of the device for operating a control of the invention.
Detailed description of the embodiments
Embodiments of the invention are described in detail below; examples of the embodiments are shown in the drawings, in which the same or similar reference numerals throughout denote the same or similar elements, or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary, intended only to explain the invention, and should not be construed as limiting it.
Those skilled in the art will appreciate that, unless expressly stated otherwise, the singular forms "a", "an", "the" and "said" used herein may also include plural forms. It should be further understood that the word "comprise" used in this specification means that the stated features, integers, steps, operations, elements and/or components are present, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It should be understood that when an element is said to be "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intermediate elements may be present. Furthermore, "connected" or "coupled" as used herein may include a wireless connection or coupling. The word "and/or" as used herein includes any unit of, and all combinations of, one or more of the associated listed items.
Those skilled in the art will appreciate that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the invention belongs. It should also be understood that terms such as those defined in general dictionaries should be understood to have meanings consistent with their meanings in the context of the prior art and, unless defined as they are here, will not be interpreted in an idealized or overly formal sense.
Those skilled in the art will appreciate that "terminal" and "terminal device" as used here include both devices that possess only a wireless signal receiver with no transmitting capability, and devices with receiving and transmitting hardware capable of two-way communication over a bidirectional communication link. Such a device may include: a cellular or other communication device with or without a multi-line display; a personal communications system (PCS) that may combine voice and data processing, fax and/or data communication capability; a personal digital assistant (PDA) that may include a radio-frequency receiver and a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a Global Positioning System (GPS) receiver; and/or a conventional laptop and/or palmtop computer or other device including a radio-frequency receiver. "Terminal" and "terminal device" as used here may be portable, transportable, installed in a vehicle (aviation, maritime and/or land), or adapted and/or configured to run locally and/or in distributed form at any other location on earth and/or in space. A "terminal" or "terminal device" may also be a communication terminal, an Internet access terminal, or a music/video playback terminal, for example a PDA, a MID (mobile Internet device) and/or a mobile phone with a music/video playing function, or a device such as a smart television or a set-top box.
At present, when the applications run by a mobile terminal are designed and developed, the position of a control on the display screen is fixed; the control has no dynamic tracking capability and cannot follow the user's operation during use. In existing mobile terminals, the way of operating a control is therefore always for the user to actively search for and approach the control on the display screen, then tap it to start its function. This mode of operation has existed for a long time, and users find nothing fresh in it. Moreover, as the display screens of mobile terminals grow ever larger, the convenience of operating on-screen controls with one hand declines sharply. For example, when a user holds a mobile phone in one hand and operates the controls on its display screen, only the thumb can be used, and the region the thumb can tap is very limited.
To improve the convenience of using a mobile terminal, the embodiments of the invention propose a control display method in a graphical user interface, a control capable of tracking user operations, and a method and apparatus for operating a control.
Fig. 1 is a flowchart of the first embodiment of the control display method in a graphical user interface of the invention. As shown in Fig. 1, the flow of the method in this embodiment comprises the following steps:
Step 101: receive movement information for the control;
Step 102: move the control to a target position in the graphical user interface according to the movement information;
Step 103: display the control at the target position in the graphical user interface.
In computer programming, a control is a graphical user-interface element, essentially a visual building block. A control is contained in an application and manages the data processed by the application as well as the interaction between the data and the user. For convenience of explaining and understanding the proposed method, the invention classifies the states a control can be in and describes them in connection with the steps above. The states of a control are: idle (Idle), activating (Activating), moving (Moving) and in place (Ready). It should be understood that these state definitions serve only to illustrate the technical solution of the invention and should not be construed as limiting it.
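The four states above and the transitions between them described in the following paragraphs can be sketched as a small state machine. This is a minimal illustration; the class name, method names and transition guards are assumptions for clarity, not the patent's actual implementation.

```java
// Sketch of the four control states (Idle, Activating, Moving, Ready)
// and their transitions. All names are illustrative assumptions.
public class ControlState {
    public enum State { IDLE, ACTIVATING, MOVING, READY }

    private State state = State.IDLE;

    public State current() { return state; }

    // Selecting (choosing) the control activates it from the idle state.
    public void select() {
        if (state == State.IDLE) state = State.ACTIVATING;
    }

    // An activated control starts moving toward the target position.
    public void startMove() {
        if (state == State.ACTIVATING) state = State.MOVING;
    }

    // Arriving at the target position puts the control in the ready state.
    public void arrive() {
        if (state == State.MOVING) state = State.READY;
    }

    // A cancel operation, or a timeout with no user operation, returns
    // the control to the idle state at its initial position.
    public void cancel() { state = State.IDLE; }
}
```

One full cycle is select → startMove → arrive, with cancel available from any state to return the control to idle.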
Fig. 2 is a schematic structural diagram of a control in the idle state in this embodiment. As shown in Fig. 2, a control in the idle state behaves like a prior-art idle control: it is displayed while waiting for a user operation and has not yet been operated. At this point the control is displayed at its initial position in the graphical user interface; the idle state is the state before step 101.
In this embodiment, to enable the control to track or follow the user, step 101 is first performed to receive the movement information for the control; then, in step 102, the terminal moves the control from its initial position to the target position according to the movement information; and in step 103 the control is displayed at the target position in the graphical user interface, thus achieving the goal of moving the control to the target position. In this embodiment, the target position may be a position determined from the position of the operating article, for example the position over which the operating article hovers above the display screen, or the position where the operating article touches the display screen. When the target position corresponds to the user's operation, the effect of the control tracking the operating article is obtained, greatly improving the convenience with which the user selects a control. The target position may also be a position the user has preset in the graphical user interface, which makes it easy for the user to start the control.
With the control display method proposed by the invention, a display operation can be performed on the control directly in the graphical user interface, and the control itself can also respond to the user's operation, so that the control tracks the user operation. Unless otherwise specified, the technical content of the various control display methods disclosed in the embodiments of the invention should not be understood as limiting the implementing entity: as long as the corresponding steps can be performed and the object of the invention achieved, equivalent technical content should be regarded as covered by this disclosure. Specifically, the movement information received for the control includes, but is not limited to, the coordinate information of the control at the target position in the graphical user interface. In step 103 of this embodiment, the control is displayed at the target position in the graphical user interface according to this coordinate information.
In this embodiment, the movement information comprises the coordinates of the control's initial position and the coordinates of its target position, and may further comprise the control's movement speed, its movement route, and so on. The control's display parameters at the target position and/or during movement can also be obtained from the movement information; display parameters define how the control is displayed in its different states. Specifically, when the control's initial position is the same as the target position, the control is displayed at the initial position in the graphical user interface according to a first preset display mode.
Fig. 3 is a schematic structural diagram of a control in the activated state in this embodiment. As shown in Fig. 3, the activated state typically means the control has been chosen by the user, or the user is performing some operation on it; operations on a control may include clicking, entering information, and so on. The operation type is determined by the control's own function or type, and the user's operations differ accordingly with the control type. The activated state of a control differs from its idle state. As shown in Fig. 3, when the user's operation chooses the current control, the control transitions from idle to activated; likewise, when the user's operation triggers some application of the control, the control transitions from idle to activated. A control in the activated state is displayed according to the first preset display mode, to distinguish it from the remaining controls in the idle state. In practice, when the initial position of an activated control differs from the target position, the control is moved to the target position in the graphical user interface and displayed according to a second preset display mode.
Fig. 4 is a schematic structural diagram of a control in the moving state in this embodiment. As shown in Fig. 4, after control Button2 is selected, it moves in a straight line from its initial position to a target position determined by the user's operation; the target position may also be a preset position, and is normally a position convenient for the user's touch operation.
In step 102, after the position of the operating article's dwell point is detected, that position can be set as the control's target position, and control Button2 is then moved to the dwell point. The ways Button2 can move from its initial position to the target position in the graphical user interface include, but are not limited to: moving the control from the initial position to the target position in a straight line at uniform velocity, or moving it from the initial position to the target position along a curve. While the control is being moved, it is displayed according to a third preset display mode. For example, the third preset display mode may include the control's flicker frequency, color, brightness and/or size, so that the moving control is displayed dynamically; this enhances the animation effect of the control during movement and makes operating the control more engaging.
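The straight-line, uniform-velocity movement described above amounts to linear interpolation between the initial and target coordinates. The following sketch shows one way to compute the control's path; the coordinates, class name and step count are illustrative assumptions, not part of the patent.

```java
// Sketch of moving a control from its initial position (x0, y0) to the
// target position (x1, y1) in a straight line at uniform velocity.
public class ControlMover {
    // Position at fraction t (0.0 .. 1.0) along the straight-line path.
    public static double[] positionAt(double x0, double y0,
                                      double x1, double y1, double t) {
        return new double[] { x0 + (x1 - x0) * t, y0 + (y1 - y0) * t };
    }

    // Samples the whole path in `steps` equal time slices; equal spatial
    // spacing per slice is what makes the velocity uniform.
    public static double[][] path(double x0, double y0,
                                  double x1, double y1, int steps) {
        double[][] points = new double[steps + 1][];
        for (int i = 0; i <= steps; i++) {
            points[i] = positionAt(x0, y0, x1, y1, (double) i / steps);
        }
        return points;
    }
}
```

A curved route could be obtained the same way by replacing the linear `positionAt` with, for example, a Bézier evaluation, without changing the rest of the flow.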
Fig. 5 is a schematic structural diagram of a control in the ready (in-place) state in this embodiment. As shown in Fig. 5, after the control is moved to the target position in the graphical user interface, the control at the target position is displayed according to the second preset display mode, to facilitate the user's subsequent trigger operation. The target position may be the position determined by the user's operation or a preset position, where the control waits for the user's further operation.
In each embodiment provided by the invention, the control can be displayed according to the first, second or third preset display mode depending on its state. By changing display effects such as the brightness, size, flicker frequency and/or color of controls in different states, the user can clearly observe and operate the control. The display parameters in the first, second and third preset display modes may be identical or different.
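One plausible way to organize the three preset display modes is a lookup table from state to display parameters. The concrete parameter values, field names and mode names below are illustrative assumptions; the patent only specifies that brightness, color, size and/or flicker frequency may differ between modes.

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch: mapping each control state to preset display parameters.
// All concrete values are illustrative assumptions.
public class DisplayModes {
    public enum Mode { IDLE, MOVING, AT_TARGET }

    public static class Params {
        public final double brightness;  // 0.0 .. 1.0
        public final double sizeScale;   // 1.0 = normal size
        public final double flickerHz;   // 0 = no flicker

        public Params(double brightness, double sizeScale, double flickerHz) {
            this.brightness = brightness;
            this.sizeScale = sizeScale;
            this.flickerHz = flickerHz;
        }
    }

    private static final Map<Mode, Params> PRESETS = new EnumMap<>(Mode.class);
    static {
        PRESETS.put(Mode.IDLE,      new Params(0.6, 1.0, 0.0)); // first preset mode
        PRESETS.put(Mode.MOVING,    new Params(1.0, 1.2, 2.0)); // third: flickers while moving
        PRESETS.put(Mode.AT_TARGET, new Params(1.0, 1.2, 0.0)); // second: emphasized, steady
    }

    public static Params forMode(Mode mode) { return PRESETS.get(mode); }
}
```

Keeping the parameters in one table makes it easy for the three modes to share or differ in individual values, as the paragraph above allows.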
As shown in Fig. 5, after control Button2 reaches the target position, it is in a state of waiting for the user's further operation; when the user performs the next operation on Button2, the control runs accordingly based on the user's operation type. In this embodiment, after the control is moved to the target position in the graphical user interface, the following steps are also performed:
Detect the user's operation;
When the user's cancel operation is detected, or when no user operation is detected within a preset time, cancel the display of the control at the target position; the control then returns to its initial position.
The cancel operation may be defined as a rectilinear or circular motion of the operating article. When the cancel operation is detected in the display screen's sensing region, or when no user operation is detected within the preset time, the display of the control at the target position is cancelled and the control returns to its initial position: control Button2 in Fig. 5 disappears from the target position and returns to its initial position, where it is again in the idle state shown in Fig. 2.
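The return-to-initial-position decision above depends on two inputs: an explicit cancel gesture and a timeout with no user operation. A minimal sketch of that decision logic follows; the class name, millisecond timestamps and the 3-second timeout used in the usage below are assumptions, since the patent does not specify a concrete preset time.

```java
// Sketch of the cancel behavior: the control leaves the target position
// and returns to its initial position on an explicit cancel gesture, or
// when no operation arrives within a preset timeout.
public class CancelWatcher {
    private final long timeoutMs;  // the preset time (value is an assumption)
    private final long shownAtMs;  // when the control arrived at the target

    public CancelWatcher(long timeoutMs, long shownAtMs) {
        this.timeoutMs = timeoutMs;
        this.shownAtMs = shownAtMs;
    }

    // True when the display at the target position should be cancelled:
    // either a cancel gesture was detected, or the preset time elapsed
    // without any user operation.
    public boolean shouldReturn(boolean cancelGesture,
                                boolean operated, long nowMs) {
        if (cancelGesture) return true;
        return !operated && (nowMs - shownAtMs) >= timeoutMs;
    }
}
```

For example, with a 3000 ms timeout, a control shown at time 0 returns to its initial position at 3000 ms unless the user has operated it or an explicit cancel gesture arrives earlier.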
In this embodiment of the control display method in a graphical user interface, movement information from the user operating the control is first received; the control is then moved to the target position according to the movement information and displayed at that position in the graphical user interface. The control can thus move to a target position according to the user's operation, i.e. it gains the ability to move in response to the user's operation or the user's input. This simplifies how the user locates and triggers a control, and improves the convenience and flexibility of operating every control on the display screen.
Fig. 6 is a schematic structural diagram of the first embodiment of the control in a graphical user interface of the invention. As shown in Fig. 6, the control in this embodiment comprises a first receiving module 601, a moving module 602 and a first display module 603, wherein: the first receiving module 601 receives the movement information for the control; the moving module 602 moves the control to the target position in the graphical user interface according to the movement information; and the first display module 603 displays the control at the target position in the graphical user interface.
Further, the first display module 603 can also obtain, from the movement information received by the first receiving module 601, the control's display parameters at the target position and/or during movement. The target position the control moves to may be one of: a position preset by the user, or a position determined from the position of the operating article. In this embodiment, when the control's initial position is the same as the target position, the first display module 603 displays the control at the initial position according to the first preset display mode; when the initial position differs from the target position, the moving module 602 moves the control from the initial position to the target position, and the first display module 603 displays the control at the target position according to the second preset display mode. While the moving module 602 moves the control from the initial position to the target position, the first display module 603 displays the moving control according to the third preset display mode. Each of the first, second and third preset display modes comprises at least one of the following display parameters: the brightness, color, size and flicker frequency of the control's display. Displaying the selected control with these predetermined display parameters, and changing effects such as brightness, size, flicker frequency and/or color, lets the user clearly observe and operate the control. The display parameters in the first, second and third preset display modes may be identical or different.
In this embodiment of the control in a graphical user interface, the first receiving module receives the movement information for the control; the moving module then moves the control to the target position according to the movement information; and the first display module displays the control at the target position in the graphical user interface. The control can thus be moved to the target position according to the movement information, giving it the ability to move, which improves the convenience and flexibility with which the user operates every control on the display screen and simplifies the user's mode of operation.
Fig. 7 is a schematic structural diagram of the second embodiment of the control in a graphical user interface of the invention. As shown in Fig. 7, the control in this embodiment further comprises a first detection module 604 and a first cancel module 605, wherein the first detection module 604 detects the user's operation; when the first detection module 604 detects the user's cancel operation, or detects no user operation within a preset time, the first cancel module 605 cancels the display of the control at the target position, and the control returns to its initial position.
In this embodiment, the control possesses the ability to move. When the first detection module detects a cancellation operation by the user, or detects no user operation within a preset time, the cancellation module cancels the display of the control at the target position and the control returns to its initial position. This ensures that the control can follow the user's operations, gives the control the ability to move according to the user's operation or input information, simplifies the user's manner of locating and triggering a control, and improves the convenience and flexibility with which the user operates each control on the display screen.
In the embodiments provided by the present invention, in order to provide a control with a tracking function in a graphical user interface, any of the following approaches, among others, may be adopted:
A. Using an object-oriented programming approach, the tracking function can be obtained directly by the terminal's original controls or by newly added controls. In an object-oriented software development platform, there is a base class at the top of the control hierarchy; every other class in the platform inherits from this base class and thus possesses its attributes, events and methods. In the implementation of this embodiment, a subclass inheriting from this base class, named the "tracking control class", can be encapsulated below the base class, and further subclasses can in turn inherit from the tracking control class. The attributes, events and methods required for control tracking are designed and implemented in the tracking control class and its subclasses. Existing software controls in the development platform, as well as newly added ones, can all inherit the required attributes, events and methods from the tracking control class and its subclasses, and can thereby realize the tracking function.
B. An application developer using object-oriented programming in a control can directly implement the technical solutions disclosed in the above embodiments; alternatively, the developer can copy the attributes, events and methods described for the "tracking control class" of approach A and implement a similar tracking function directly in the control.
C. An application developer can also implement the technical solutions disclosed in the above embodiments in a non-object-oriented programming paradigm, for example a procedure-oriented one, by reproducing the attributes, events and methods described above for the "tracking control class".
Below, the realization of the object of the present invention is illustrated with approach A.
Fig. 8 is a schematic diagram of the tracking control class inheritance in this embodiment. As shown in Fig. 8, below the control base class a subclass named the "tracking control class" is encapsulated, inheriting from the control base class; below the tracking control class, three further subclasses are designed to inherit from it, namely the "tracking control root class", the "tracking control item class" and the "tracking control item group class". These four classes design and implement the attributes, events and methods a control requires for the tracking function. Existing software controls in the development platform, as well as newly added ones, can inherit the required attributes, events and methods from these three subclasses according to their category, and can thereby realize the tracking function.
In this embodiment, the composition and inheritance structure of the tracking control classes are described with the Android system. In Android, the control class View is the base class of all control classes; it represents the most basic user interface (UI) building block and is the basis for creating interactive graphical user interfaces. Therefore, in the Android system the control base class is View: all other controls inherit from View, and each control has the attributes and methods of View, occupies a display area on the screen, and is responsible for drawing and event handling.
Fig. 9 is a schematic diagram of the tracking control class inheritance in the Android system in this embodiment. As shown in Fig. 9, according to the technical solutions provided in the above embodiments of the present invention, a direct subclass TrackingView of the control class View is first created, and then three subclasses inheriting from TrackingView are created: TrackingViewRoot, TrackingViewItem and TrackingViewItemGroup. The relationship shown in Fig. 9 is the concrete embodiment, in the Android system, of the inheritance shown in Fig. 8.
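The inheritance described above can be sketched in plain Java. Note that `View` below is a stand-in for Android's `android.view.View` (the real Android class is not used here), and `TrackingButton` is a hypothetical name for a Button-like control made trackable; the empty class bodies are placeholders, not the patented implementation.

```java
// Stand-in for android.view.View, the base class of all controls.
class View {}

// "Tracking control class": abstract root of the tracking hierarchy (Fig. 8/9).
abstract class TrackingView extends View {}

class TrackingViewRoot extends TrackingView {}      // per-form tracking logic host
class TrackingViewItem extends TrackingView {}      // a single trackable control
class TrackingViewItemGroup extends TrackingView {} // a group of trackable controls

// A hypothetical button control given tracking ability by inheritance.
class TrackingButton extends TrackingViewItem {}
```

Through inheritance, `TrackingButton` automatically carries whatever attributes, events and methods the tracking classes define, which is the mechanism approach A relies on.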
In practical applications, if a control in the graphical user interface needs the tracking function, the application developer can make an existing or newly added control inherit from a tracking control class to realize it. For example, if the developer needs a button control Button to possess the tracking function, Button can directly inherit from the tracking control item class TrackingViewItem; through object-oriented inheritance, the button control then also possesses the tracking-related capabilities.
In this embodiment, the tracking control class (TrackingView), the tracking control root class (TrackingViewRoot), the tracking control item class (TrackingViewItem) and the tracking control item group class (TrackingViewItemGroup) are described in turn.
1. Tracking control class.
In this embodiment, the tracking control class generally serves as an abstract base class for the other three classes to inherit. During software development, the tracking control class is not itself designed and shown on the graphical user interface; it is constructed solely for the purpose of creating controls with the tracking function. The tracking control class represents all classes that have the tracking function or tracking-action processing capability. It inherits from the control class View, and in addition to the general functions of View it has functions related to control tracking. The main functional goals the tracking control class fulfils include: 1. abstractly specifying the characteristics and functions of tracking controls, for other subclasses to inherit; 2. providing a tracking attribute flag indicating whether a control has the tracking capability; 3. providing a tracking control type attribute flag for dividing control types; 4. providing a tracking control status attribute flag for dividing control states; 5. providing methods for setting and reading the tracking attribute flags.
Because the tracking control class is an abstract class, it can only be inherited by control subclasses and cannot be instantiated. In this embodiment, the detailed design of the tracking control class comprises designing the tracking attributes and designing the methods for setting and reading their values. The tracking attribute `tracking` identifies whether a control has the tracking function; its data type is Boolean (True or False), with a default value of True. The tracking control type attribute `trackingViewType` identifies the type of the control; its data type is an enumeration comprising trackingViewRoot, trackingViewItem and trackingViewItemGroup, with a default value of trackingViewItem. The tracking control status attribute `trackingViewStatus` identifies the current state of the control; its data type is an enumeration comprising idle, activating, moving and ready, with a default value of idle.
The method setTracking() sets the tracking attribute value of the current control. If the application developer sets the tracking attribute to True, the control is identified as able to be tracked; if it is set to False, the control is identified as unable to be tracked. The method getTracking() obtains the tracking attribute value (True or False) of the current control.
The method setTrackingViewType() sets the type attribute value of the current control. If the type attribute is set to trackingViewRoot, the control is identified as a trackingViewRoot-type control; if set to trackingViewItem, as a trackingViewItem-type control; if set to trackingViewItemGroup, as a trackingViewItemGroup-type control.
The method getTrackingViewType() obtains the type attribute value of the current control.
The method setTrackingViewStatus() sets the status attribute value of the current control. If the status attribute is set to idle, the current control instance is in the idle state; if set to activating, in the activated state; if set to moving, in the moving state; if set to ready, in the in-place state. In this embodiment, the method getTrackingViewStatus() obtains the current status attribute value of the current control.
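The attributes and accessors described above can be sketched as follows. The attribute names, enumeration values and defaults follow the text; the plain-Java `View` stand-in and the concrete class shapes are illustrative assumptions, not the patented or Android implementation.

```java
// Stand-in for android.view.View.
class View {}

// Enumerations for the type and status attributes described in the text.
enum TrackingViewType { TRACKING_VIEW_ROOT, TRACKING_VIEW_ITEM, TRACKING_VIEW_ITEM_GROUP }
enum TrackingViewStatus { IDLE, ACTIVATING, MOVING, READY }

abstract class TrackingView extends View {
    private boolean tracking = true;                                      // default: True
    private TrackingViewType type = TrackingViewType.TRACKING_VIEW_ITEM;  // default type
    private TrackingViewStatus status = TrackingViewStatus.IDLE;          // default state

    public void setTracking(boolean t) { tracking = t; }
    public boolean getTracking() { return tracking; }
    public void setTrackingViewType(TrackingViewType t) { type = t; }
    public TrackingViewType getTrackingViewType() { return type; }
    public void setTrackingViewStatus(TrackingViewStatus s) { status = s; }
    public TrackingViewStatus getTrackingViewStatus() { return status; }
}

// A concrete subclass, since the abstract base cannot be instantiated.
class TrackingViewItem extends TrackingView {}
```

Because `TrackingView` is abstract, only subclasses such as `TrackingViewItem` can be instantiated, matching the usage rule stated above.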
2. Tracking control root class.
In this embodiment, the tracking control classes are constructed to achieve the goal of generating a series of controls with the tracking function, and the tracking control root class is the main functional class for the logical processing of the tracking function. Fig. 10 is a schematic diagram of the logical structure of the tracking control root class in this embodiment. As shown in Fig. 10, in the graphical user interface of the software, each software form instantiates one tracking control root class to form a software display form, but this software display form is invisible to the user. The tracking control root class handles, in a unified manner, functions such as the action listening and response of the tracking controls in the current software form and the movement of the tracking controls. The tracking control root class inherits from the tracking control class TrackingView, and in addition to the functions of the tracking control class TrackingView and the control class View, it has the following functions:
Receiving the registration information of other controls in the current software form that are set as tracking controls, or actively querying the information of other trackable controls that have the tracking function, and saving and managing this information; listening for and responding to the external events of user operations in the current software form, such as finger-touch events on the display screen, finger hovering events, eye-trajectory tracking events and mouse movement events, and obtaining from these external events the position coordinates of the operating object, such as the current user's finger; judging and finding, according to the external events of user operations, the trackable control in the current form that the user wants to operate; dynamically changing the state of the trackable control according to the user's operation; and updating the position of the trackable control according to the user's operation so that it achieves the goal of tracking the user's finger.
In this embodiment, the usage rule of the tracking control root class is: if an application developer needs to use the tracking operation function in the current graphical user interface, one control of the tracking control root class must be instantiated in the software form corresponding to the current graphical user interface. The design scheme of the tracking control root class is as follows:
A member array variable is designed to record the trackable control array TrackingViewArray. The trackable control array saves the registration information of the other controls in the current software form that are set as tracking controls, or saves the information of other trackable controls that the tracking control root class actively queries. The data of the trackable control array comprise: the ID of the trackable control (TrackingView_ID), the coordinates of the trackable control (TrackingView_Coordinate), and the length and width of the trackable control (TrackingView_LengthWidth). The ID identifies each trackable software control in the current software form, the coordinates identify the display screen position of each trackable software control in the current software form, and the length and width identify the length and width of each trackable software control in the current software form.
In this embodiment, action listening and response functions can also be designed in the tracking control root class, such as: finger-touch event listening and response (TouchEventListener, onTouchEvent), finger hovering event listening and response (HoveringEventListener, onHoveringEvent), eye-trajectory tracking event listening and response (GazingTrackingEventListener, onGazingTrackingEvent), and mouse movement event listening and response (MouseMovingEventListener, onMouseMovingEvent). The finger-touch event listener and response monitor the events the application system sends to the current application program, such as the user's finger sliding on or clicking the display screen. The finger hovering event listener and response monitor events such as the user's finger hovering statically or hovering while moving above the display screen, and process these events in subsequent steps. The eye-trajectory tracking event listener and response monitor events such as the movement of the user's eye trajectory on the display screen, and process these events in subsequent steps. The mouse movement event listener and response monitor events such as the mouse sliding, resting or clicking on the display screen, and process these events in subsequent steps.
In this embodiment, the action listening and response functions in the tracking control root class can also be carried out by the following methods. By registering trackable control information (addTrackingView()), other trackable controls in the current graphical user interface actively register their own information into the trackable control array. By traversing for trackable control information (traverseTrackingView()), the root class actively traverses the current graphical user interface for all controls whose tracking attribute is True and writes their information into the trackable control array of the tracking control root class. By judging the direction of the user's action (getGestureDirection()), the direction of movement of the current user's operating object, such as a finger, is determined for use by "finding the next trackable control", thereby determining which software control the user wants to select for the tracking operation; operations include contact operations and hovering operations. By finding the next trackable control (findNextTrackingView()), the software control the user wants to select for the tracking operation is found and determined according to the direction of movement obtained by the method "judging the direction of the user's action". By activating the current trackable control (activateTrackingView()), the trackable control chosen by the current user operation is activated and displayed highlighted, flickering, etc. By updating the position of the activated trackable control (updateTrackingViewLocation()), the activated control is moved to the screen position of the current operating object.
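The root-class bookkeeping just described, registration into a trackable control array plus a direction-based lookup, can be sketched as follows. The field and method names follow the text; the geometry rule (nearest control lying in the gesture direction) is an illustrative assumption, not the patented algorithm.

```java
import java.util.ArrayList;
import java.util.List;

// One entry of TrackingViewArray: ID, coordinates, and length/width.
class TrackingViewInfo {
    final int id;                 // TrackingView_ID
    final int x, y;               // TrackingView_Coordinate
    final int length, width;      // TrackingView_LengthWidth
    TrackingViewInfo(int id, int x, int y, int length, int width) {
        this.id = id; this.x = x; this.y = y; this.length = length; this.width = width;
    }
}

class TrackingViewRoot {
    private final List<TrackingViewInfo> trackingViewArray = new ArrayList<>();

    // addTrackingView(): a trackable control registers its own information.
    void addTrackingView(TrackingViewInfo info) { trackingViewArray.add(info); }

    // findNextTrackingView(): nearest registered control on the given side
    // of (fromX, fromY); returns its id, or -1 if none lies in that direction.
    int findNextTrackingView(int fromX, int fromY, boolean rightward) {
        int bestId = -1, bestDist = Integer.MAX_VALUE;
        for (TrackingViewInfo v : trackingViewArray) {
            int dx = v.x - fromX;
            if (rightward ? dx <= 0 : dx >= 0) continue;   // wrong side of the gesture
            int dist = Math.abs(dx) + Math.abs(v.y - fromY);
            if (dist < bestDist) { bestDist = dist; bestId = v.id; }
        }
        return bestId;
    }
}
```

In this sketch, `getGestureDirection()` is reduced to the boolean `rightward` parameter; a full implementation would derive it from the listened-to touch, hover, gaze or mouse events.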
3. Tracking control item class.
In this embodiment, the tracking control item class represents a single tracking control item that the software presents on the graphical display interface and that the application developer sets up to undergo the tracking operation. For example, a tracking control item can be set as a button control Button. All existing or newly added single software control classes that are to possess the tracking function need to inherit from the tracking control item class to obtain the attributes, events and methods necessary for the tracking function. The tracking control item class inherits from the tracking control class TrackingView and has the functions of both the tracking control class TrackingView and the control class View. The functions of the tracking control item class include: the characteristics and functions of a tracking control, the status display function of a tracking control, and the state recording function of a tracking control. The characteristics and functions of a tracking control can be inherited and used by new control subclasses; with the status display function, different display effects can be presented to prompt the user according to the different states the tracking control is in; with the state recording function, the control's coordinate position information, display attribute information, image resource information, operation logic processing functions and so on are recorded and saved.
In this embodiment, the usage rule of the tracking control item class is: the tracking control item class can be inherited and used by programmers to transform existing controls or create new control subclasses. When using a tracking control, the tracking control needs to be instantiated, its tracking attribute set to True, and its control type attribute set to the tracking control item type. The design scheme of the tracking control item class is as follows:
The use of the tracking control can be declared in a static interface layout file, with the basic attributes of the tracking control set in that file; the application program reads the static interface layout file when loading, and when it reads a tracking control item class, the application system automatically instantiates a tracking control of that type. Alternatively, the tracking control item class can be instantiated at run time, and the interface functions provided by the tracking control base class can be used to dynamically set the tracking control attribute information. After a tracking control item class is instantiated, its instance information (for example, position information) is stored in the member variables of the tracking control root class.
When the tracking control root class tracks the finger, it retrieves all tracking control instances from its member variables and, according to the direction of motion of the finger, notifies the tracking control in that direction. After the finger has stopped moving for longer than a set time, the moving state of the tracking control is triggered; after the finger stops moving, the state of the corresponding tracking control is switched from "moving" to "in place", and the control then waits for the user's click event. After receiving an event, the tracking control root class notifies the tracking controls it holds to switch to the corresponding state so as to execute the response processing logic. According to the different states a tracking control is in, its display state is updated, including highlighted display, flickering display, etc., to notify the user.
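The state switching described above, idle, activating, moving and in-place, can be sketched as a small state machine. The transition names and guards are an illustrative reading of the text, not the patented logic.

```java
// States of a tracking control, as enumerated for trackingViewStatus.
enum Status { IDLE, ACTIVATING, MOVING, READY }

class TrackingControlStateMachine {
    private Status status = Status.IDLE;
    Status status() { return status; }

    void onLocated()   { if (status == Status.IDLE) status = Status.ACTIVATING; }   // control located
    void onDwell()     { if (status == Status.ACTIVATING) status = Status.MOVING; } // finger held still past set time
    void onArrived()   { if (status == Status.MOVING) status = Status.READY; }      // control reached target
    void onCancelled() { status = Status.IDLE; }                                    // no confirmation; reset
}
```

A display layer would observe `status()` and switch between normal, highlighted and flickering rendering as each state is entered.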
4. Tracking control item group class.
The tracking control item group class represents a group of tracking control items that the software presents on the graphical display interface and that the application developer sets up to undergo the tracking operation. Fig. 11 is a schematic diagram of the logical structure of the tracking control item group class in this embodiment. As shown in Fig. 11, a tracking control item group can contain an operation panel of two button controls Button. In practical applications, a tracking control item group class can comprise several tracking control item classes. In this embodiment, all existing or newly added software control group classes that are to possess the tracking function need to inherit from the tracking control item group class to obtain the attributes, events and methods necessary for the tracking function. The tracking control item group class inherits from the tracking control class and has the functions of the tracking control class TrackingView and the control class View. In this embodiment, the functional goals of the tracking control item group class include: the characteristics and functions of a tracking control, the status display function of a tracking control, and the state recording function of a tracking control. The characteristics and functions of a tracking control can be inherited and used by new control subclasses; with the status display function, different display effects can be presented to prompt the user according to the different states the tracking control is in; with the state recording function, the control's coordinate position information, display attribute information, image resource information, operation logic processing functions and so on are recorded and saved. A tracking control item group control can contain tracking control item group controls, tracking control item controls and other non-tracking controls; meanwhile, an embedded tracking control item group control can in turn continue to contain nested child controls.
In this embodiment, the usage rule of the tracking control item group class is: the tracking control item group class can be inherited and used by programmers to transform existing controls or create new control subclasses. When using a tracking control, the tracking control needs to be instantiated, its tracking attribute set to True, and its type attribute set to the tracking control item group type. The design scheme of the tracking control item group class is as follows:
The use of the tracking control can be declared in a static interface layout file, with the basic attributes of the tracking control set in that file; the application program reads the static interface layout file when loading, and when it reads a tracking control item group class, the application system automatically instantiates a tracking control of that type. Alternatively, the class can be instantiated at run time, and the interface functions provided by the tracking control base class can be used to dynamically set the tracking control attribute information. After a tracking control item group class is instantiated, new child controls can be added to it; a child control can be of the tracking control item type, the tracking control item group type, or another non-tracking control type. The instance information of the tracking control item group class (for example, position information and child control information) is stored in the member variables of the tracking control root class.
When the control moves, the tracking control item group moves all the child controls it carries along with it; when it receives a user operation, it passes the user's operation on to the corresponding child control it carries.
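The group behavior just described, carrying children along on a move and forwarding operations to the hit child, can be sketched as follows. The class and field names and the bounds-based hit test are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// A minimal child control: position, size, and a flag for received clicks.
class Child {
    int x, y; final int w, h; boolean clicked = false;
    Child(int x, int y, int w, int h) { this.x = x; this.y = y; this.w = w; this.h = h; }
    boolean contains(int px, int py) { return px >= x && px < x + w && py >= y && py < y + h; }
}

class TrackingViewItemGroupSketch {
    final List<Child> children = new ArrayList<>();

    // Moving the group carries every child control along.
    void moveBy(int dx, int dy) {
        for (Child c : children) { c.x += dx; c.y += dy; }
    }

    // A user operation at (px, py) is forwarded to the child whose bounds
    // contain the point; returns false if no child was hit.
    boolean dispatchClick(int px, int py) {
        for (Child c : children)
            if (c.contains(px, py)) { c.clicked = true; return true; }
        return false;
    }
}
```

This mirrors how a container forwards input to children; a nested group would itself implement the same `moveBy`/`dispatchClick` pair and recurse.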
Fig. 12 is a flow chart of the control tracking a user operation in this embodiment. As shown in Fig. 12, the flow of the control tracking a user operation in this embodiment comprises the following steps:
Step 1201: a software form loads and instantiates one tracking control root class.
In this step, each software form instantiates one tracking control root class to form a software display form; this software display form is invisible to the user. After the software form loads and instantiates a tracking control root class, the flow enters step 1202.
Step 1202: the tracking control root class instance traverses the current software form for controls possessing the tracking function.
In this step, the tracking control root class instance traverses and checks the controls possessing the tracking function in the current software form, records the information obtained by the traversal in the trackable control array, and then enters step 1203.
Step 1203: the tracking control root class listens for operation events.
In this step, the tracking control root class listens for operation events. The user can use an operating object such as a finger to perform contact operations and hovering operations on the display screen; if the tracking control root class hears a user operation event, the flow enters step 1204.
Step 1204: the tracking control root class detects the moving direction of the operating object.
In this step, while the user uses the operating object to perform a hovering operation or a contact operation, the moving direction of the operating object is detected in order to reach the control the user wants to locate; the flow then enters step 1205.
Step 1205: locating the control displayed on the display screen according to the positioning operation of the operating object.
In this step, according to the user's positioning operation, a control displayed at some position on the display screen is located; the state of the located control is switched from the idle state to the activated state, and the located control is displayed according to the first preset display mode. The positioning operation comprises: the user clicking the display screen sensing region with the operating object, or hovering the operating object above the display screen sensing region, etc. After the control on the display screen is located through the user's positioning operation, the flow enters step 1206.
Step 1206: judging whether the operating object moves within the third set duration.
In this step, a timer is started and it is judged whether the operating object moves within the third set duration. If the judgment is yes, that is, the operating object continues to move within the third set duration, the location of the control is cancelled, the state of the control whose location was cancelled is switched from the activated state back to the idle state, and the flow returns to step 1203. If the judgment is no, that is, the operating object does not move within the third set duration, the flow enters step 1207.
Step 1207: moving the located control from its initial position to the target position.
In this step, if the operating object does not move within the third set duration, the state of the located control is switched from the activated state to the moving state, and the located control is moved from its initial position to the target position. After the control arrives at the target position, its state switches from the moving state to the in-place state, and the control at the target position is displayed according to the second preset display mode. The target position can be a position preset by the user, or a position determined according to the operation position of the operating object, for example the position where the operating object's contact stops or where the operating object hovers. After the located control has moved from the initial position to the target position, the flow enters step 1208.
Step 1208: judging whether the operating object makes a confirmation operation.
In this step, after the control has moved from the initial position to the target position, it is judged whether the operating object makes a confirmation operation on the control at the target position. If no confirmation operation by the operating object is detected, the state of the control switches from the in-place state back to the idle state, the control returns from the target position to the initial position, and the flow returns to step 1203. If a confirmation operation by the operating object is detected, the flow enters step 1209. In this embodiment, the confirmation operation of the operating object comprises: the operating object clicking the display screen, or the operating object resting in the display screen sensing region for more than the first set duration, etc., where the first set duration can be 1 s or 1.5 s, etc.
Step 1209: triggering the function of the located control.
In this step, after the confirmation operation of the operating object is detected, the function of the located control is triggered, and the control's human-machine interaction window is displayed on the display screen; the user exchanges information with the control through this interaction window. After the function of the control has finished executing and the user closes the control, the flow enters step 1210.
Step 1210: moving the control from the target position back to the initial position.
In this step, after the control is closed, its state is switched from the in-place state back to the idle state, and the control is moved from the target position back to its initial position.
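Steps 1201-1210 can be condensed into a single end-to-end sketch: locate a control, wait out the dwell timer, move it to the target, and either trigger it on confirmation or return it to its initial position. The class name, the one-dimensional position, and the boolean inputs standing in for the timer and confirmation checks are all illustrative assumptions.

```java
class TrackingFlowSketch {
    int position;                  // current screen position (1-D for brevity)
    final int initialPosition;

    TrackingFlowSketch(int initialPosition) {
        this.initialPosition = initialPosition;
        this.position = initialPosition;
    }

    // Runs one pass of the located control's flow; returns true if the
    // control's function was triggered.
    boolean run(boolean movedDuringDwell, boolean confirmed, int targetPosition) {
        if (movedDuringDwell) return false;   // step 1206: movement cancels the location
        position = targetPosition;            // step 1207: move to the target position
        if (!confirmed) {                     // step 1208: no confirmation detected
            position = initialPosition;       //   return to the initial position
            return false;
        }
        position = initialPosition;           // steps 1209-1210: trigger, close, move back
        return true;
    }
}
```

In every branch the control ends up back at its initial position, matching the flow chart's return paths to steps 1203 and 1210.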
In this embodiment, the control on the display screen is located according to the positioning operation of the operating object and triggered according to the confirmation operation of the operating object, so that the user can operate each control on the display screen from within the display screen sensing region alone. This ensures that the user can conveniently operate each control on the display screen while holding the terminal device with only one hand, improves the convenience and flexibility with which the user operates each control on the display screen, and simplifies the user's manner of operation.
Figure 13 is a flowchart of a first embodiment of the control operating method of the present invention. As shown in Figure 13, the workflow of this embodiment includes the following steps:
Step 131: detect an operating object in the sensing region of the display screen.
In the present embodiment, a sensing region is provided on the display screen of the terminal. The sensing region is the area of the display screen that can sense external operations, and it is placed where the user can operate it easily; by operating within the sensing region, the operating object can select and/or trigger a control at any position on the display screen, making control operation more convenient. For example, for users accustomed to holding the terminal device in the right hand, the sensing region may be placed in the lower-right area of the display screen, where the right thumb can comfortably reach; for users accustomed to holding the device in the left hand, it may be placed in the lower-left area, which the left thumb can operate. The sensing region may be a hover sensing region that senses hover operations, used to detect whether an operating object is hovering above it; or it may be a contact sensing region that senses contact operations, used to detect whether an operating object touches it. The operating object may be a stylus, a finger, or the like. When an operating object is detected in the sensing region, the flow proceeds to step 132.
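The handedness-dependent placement of the sensing region can be sketched as a simple geometric computation. This is a minimal illustration under assumed parameters: the function name, the rectangle convention, and the coverage fraction `frac` are all hypothetical and not specified by the embodiment.

```python
def sensing_region(screen_w, screen_h, handedness, frac=0.4):
    """Return (x0, y0, x1, y1) of a thumb-reachable sensing region.

    Right-handed users get the lower-right corner of the screen,
    left-handed users the lower-left, as described in the embodiment.
    `frac` (fraction of each screen dimension covered) is assumed.
    """
    w, h = int(screen_w * frac), int(screen_h * frac)
    y0 = screen_h - h                       # region hugs the bottom edge
    if handedness == "right":
        return (screen_w - w, y0, screen_w, screen_h)
    return (0, y0, w, screen_h)
```

For a 1080 x 1920 screen and a right-handed user this yields a region anchored at the lower-right corner, within comfortable reach of the right thumb.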
Step 132: after the operating object is detected, locate a control displayed on the display screen according to a positioning operation of the user.
In this step, after the operating object is detected, the terminal device locates a control displayed on the display screen according to the positioning operation of the user. The located control may be displayed on the display screen in a set manner so that the user can observe or operate it; for example, the located control may be displayed according to a first preset display mode. In the present embodiment, detecting the operating object includes: detecting that the operating object touches the display screen, including tapping, double-tapping, or long-pressing the display screen beyond a preset time, where the preset time may be 1 s or 2 s; or detecting that the operating object touches the display screen and moves across it in a set manner, such as moving in a straight line, moving along a curve, or moving back and forth. For a display screen with a hover-touch function, detecting the operating object further includes: detecting that the operating object hovers in the hover sensing region of the display screen; hovers there beyond a set time, which may be 1 s or 2 s; or hovers in the hover sensing region and moves within it in a set manner, such as moving in a straight line, moving along a curve, or moving back and forth.
After the operating object is detected, a control on the display screen is located according to the positioning operation of the user; for example, the user may locate a control with the eyes, or with the operating object itself. The flow then proceeds to step 133.
Step 133: after a confirmation operation of the operating object in the sensing region is detected, trigger the function of the located control.
In each embodiment of the application, the display screen area in which the user performs the confirmation operation may be the same as, or different from, the area in which the located control resides. When the two areas differ, one-handed operation is easier: unlike the existing mode of operation, the user does not need to reach the screen position of the control itself, but only performs the confirmation operation in the display screen area under the finger.
In each embodiment of the application, the located control may be triggered in various ways. For example, an instruction may be sent directly to the control to trigger its function. Alternatively, the identification information of each control may be stored in advance; after the confirmation operation of the user is detected, a touch event is generated and sent to the located control, or to the application program containing the located object, where the touch event carries the identification information of the located control. This identification information instructs the receiving end of the touch event to trigger the function of the corresponding control; that is, the object receiving the touch event triggers the control identified by the information carried in the event. The identification information of a control may include its position information and/or its ID. In the prior art, a touch event is generally generated when the user taps the display screen and carries the coordinates of the tap position; according to the embodiments of the application, those coordinates are replaced, in the manner described above, with the identification information of the located control.
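The identifier-based dispatch described above, where the touch event carries a control's identification information rather than tap coordinates, can be sketched as a small registry. The registry, event dictionary shape, and function names below are hypothetical illustrations, not the disclosed implementation.

```python
# Hypothetical sketch: events carry a control identifier, not coordinates.
registry = {}                       # control id -> callback (its "function")

def register(control_id, callback):
    registry[control_id] = callback

def dispatch(event):
    """Trigger the control named by the event's identification information.

    The tap position is irrelevant: the receiving end triggers whichever
    control the identifier names, however far from it the tap landed.
    """
    cb = registry.get(event["control_id"])
    if cb is None:
        return False                # no control matches the identifier
    cb()
    return True
```

This mirrors the key departure from the prior art: the coordinates in the touch event are replaced by the identification information of the located control, so a confirmation tap inside the sensing region can trigger a control anywhere on the screen.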
In this step, the terminal detects whether the operating object performs a confirmation operation in the sensing region. The confirmation operation of the operating object includes: the operating object tapping the display screen, or dwelling in the sensing region for longer than a first set duration, where the first set duration may be, for example, 1 s or 1.5 s. After the confirmation operation of the operating object in the sensing region is detected, the function of the located control is triggered, an interaction window of the control is displayed on the display screen, and the user interacts with the control through the interaction window.
In the present embodiment, after the terminal detects the operating object in the sensing region, it locates a control on the display screen according to the positioning operation of the user, and after detecting a confirmation operation of the operating object in the sensing region, it triggers the function of the located control. The user can therefore operate every control on the display screen from within the sensing region alone, which ensures convenient one-handed operation of the terminal device, improves the convenience and flexibility of operating the controls, and simplifies the user's mode of operation.
Figure 14 is a flowchart of a second embodiment of the control operating method of the present invention; Figure 15 is a schematic diagram of locating a control along the moving direction of a movement track in this embodiment; and Figure 16 is a schematic diagram of the display effect of moving a control from its initial position to the target position in this embodiment. As shown in Figure 14, the flow of this embodiment includes the following steps:
Step 141: detect an operating object in the sensing region of the display screen.
In the present embodiment, the display screen of the terminal device may support contact touch, in which case the operating object is a finger or the like; it may also support hover touch, including electromagnetic-capacitive display screens and self-capacitance/mutual-capacitance display screens, where the operating object of an electromagnetic-capacitive display screen is an electromagnetic stylus and that of a self-capacitance/mutual-capacitance display screen is a finger or the like. While the terminal device is running, it monitors the sensing region in real time; once an operating object is detected, the flow proceeds to step 142.
Step 142: after the operating object is detected, locate a control displayed on the display screen according to a positioning operation of the user.
In this step, when an operating object is detected in the sensing region, if the terminal receives a user-triggered operation for locating controls with the operating object, or detects preset configuration information specifying that controls are located with the operating object, it detects the positioning operation of the operating object and locates a control accordingly. The user-triggered operation for locating controls with the operating object includes: tapping the sensing region with the operating object, or holding the operating object hovering above the sensing region. The preset configuration information includes: the user moving the operating object back and forth, in a straight line, or along a curve above the sensing region.
In the present embodiment, detecting the positioning operation of the operating object includes detecting its state: if no movement of the operating object within the sensing region is detected, a default control is located; if movement is detected, a control displayed on the screen is located according to the movement track. When locating a control by the movement track, the control may be located along the moving direction of the track, taking the currently located control as the reference; if this movement is the first detected after the operating object appeared, the currently located control is the default control. Alternatively, the control nearest the end position of the movement track may be located.
In practical applications, the default control may be any one of the following: the most frequently used control on the display screen, the most recently used control, the control associated with the current operation, a preset control, or the control nearest the operating object.
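The default-control policies listed above can be illustrated with a small selection helper. The dictionary keys (`use_count`, `last_used`, `pos`) and policy names are assumptions made for the sketch; the embodiment does not prescribe any data model.

```python
import math

def pick_default_control(controls, policy, pointer_xy=None):
    """Pick a default control per one of the policies listed above.

    `controls` is a list of dicts with assumed keys 'id', 'use_count',
    'last_used', and 'pos'; policy names are illustrative only.
    """
    if policy == "most_used":                 # highest usage frequency
        return max(controls, key=lambda c: c["use_count"])
    if policy == "most_recent":               # most recently used
        return max(controls, key=lambda c: c["last_used"])
    if policy == "nearest":                   # nearest the operating object
        return min(controls, key=lambda c: math.dist(c["pos"], pointer_xy))
    raise ValueError(f"unknown policy: {policy}")
```

The "control associated with the current operation" and "preset control" policies would require application-specific context, so they are omitted from this sketch.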
In the present embodiment, the control along the moving direction of the movement track of the operating object is located, taking the currently located control as the reference. As shown in Figure 15, a two-dimensional coordinate system may be set on the display screen, with the X axis along the lower edge of the display screen and the Y axis along its left edge, so that any position on the display screen can be identified by coordinates (x, y) in this system. When the operating object has located a control G, its first position A(x1, y1) in the coordinate system is recorded. If the operating object then moves to a second position B(x2, y2) and locates a control N, its absolute displacement in the X direction is Jx = abs(x2 - x1) and its signed displacement is Wx = x2 - x1; in the Y direction, the signed displacement is Wy = y2 - y1 and the absolute displacement is Jy = abs(y2 - y1). In the present embodiment, whether the operating object has moved is judged by the absolute displacement: if Jx exceeds a threshold Jxo, the operating object is deemed to have moved in the X direction; if Jx is equal to or less than Jxo, it is deemed not to have moved in the X direction. Similarly, if Jy exceeds a threshold Jyo, the operating object is deemed to have moved in the Y direction, and otherwise not. Setting displacement thresholds reduces or avoids operations produced by minute movements of the operating object. When the operating object is judged to have moved only in the X direction, the moving direction is judged by the signed displacement: if Wx is greater than 0, the operating object moved right; if Wx is less than 0, it moved left. When the operating object is judged to have moved only in the Y direction: if Wy is greater than 0, it moved up; if Wy is less than 0, it moved down. When the operating object is judged to have moved in both the X and Y directions, Jx and Jy are compared: if Jx is less than Jy, the X-direction movement is ignored and the operating object is deemed to have moved only in the Y direction; if Jx is greater than Jy, the Y-direction movement is ignored and the operating object is deemed to have moved only in the X direction. After a control displayed on the screen has been located according to the positioning operation of the user, the flow proceeds to step 143.
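The displacement-threshold logic above maps directly onto a short classifier. The function below is an illustrative sketch: `jxo`/`jyo` correspond to the thresholds Jxo and Jyo, the signed displacements to Wx/Wy, and the absolute displacements to Jx/Jy; the function name and return values are assumptions made for the example.

```python
def movement_direction(p1, p2, jxo, jyo):
    """Classify the movement from point p1 to p2 per the threshold logic.

    Returns 'none', 'left', 'right', 'up', or 'down'. When both axes
    exceed their thresholds, the axis with the larger absolute
    displacement wins, matching the comparison of Jx and Jy above.
    """
    (x1, y1), (x2, y2) = p1, p2
    wx, wy = x2 - x1, y2 - y1      # signed displacements Wx, Wy
    jx, jy = abs(wx), abs(wy)      # absolute displacements Jx, Jy
    moved_x = jx > jxo             # moved in X only if threshold exceeded
    moved_y = jy > jyo
    if moved_x and moved_y:
        if jx > jy:
            moved_y = False        # ignore the smaller-displacement axis
        else:
            moved_x = False
    if moved_x:
        return "right" if wx > 0 else "left"
    if moved_y:
        return "up" if wy > 0 else "down"
    return "none"                  # minute movement, below both thresholds
```

Note that with the Y axis along the left edge of the screen, a positive Wy corresponds to moving up, as the embodiment states.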
Step 143: move the located control to the target position.
In this step, the located control is moved to the target position; if the initial position is identical to the target position, the control is displayed at its initial position in the graphical user interface according to the first preset display mode. As shown in Figure 16, to make the located control convenient for the user to operate, it may be moved to the target position, where it is displayed according to a second preset display mode. The target position may be determined by an operation of the user, or it may be a preset position. While the control moves from the initial position to the target position, it may be displayed dynamically according to a third preset display mode to enhance the animation effect and increase the user's enjoyment.
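The dynamic display during the move can be illustrated as interpolation between the initial and target positions, one point per animation frame. This is a minimal sketch under assumed linear easing; the embodiment's three preset display modes are not modelled, and the function name is hypothetical.

```python
def animation_path(initial, target, steps):
    """Linearly interpolate a control's position from its initial
    position to the target position, yielding one point per frame."""
    (x0, y0), (x1, y1) = initial, target
    return [
        (x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
        for i in range(steps + 1)
    ]
```

Replacing the linear factor `i / steps` with an easing curve would give a smoother, more pronounced animation effect without changing the endpoints.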
In addition, in practical applications, besides moving the control itself as disclosed above, the icon of the control may be copied on the display screen and the copied icon moved to the target position; the user can then trigger the function of the control by triggering the copied icon.
Specifically, to make the located control convenient for the user to manipulate, the control may be displayed in the screen region corresponding to the position of the operating object, so that the user can manipulate the control in that region directly with the operating object. For example, the display effect may be copied directly into the region corresponding to the operating object, or moved into that region from the original position of the control. The display effect of the control in that region may be its display effect before being located or after being located.
In practical applications, a fully transparent window may also be provided on the screen to implement the above functions: it detects the operating object in the sensing region, locates the control displayed on the screen according to the positioning operation of the user, and triggers the function of the located control. Specifically, the fully transparent window and the application program run on the terminal device simultaneously, with the fully transparent window parallel to the window containing the control. After the operating object is detected, the located control is copied, according to the operation of the user, to the position in the fully transparent window corresponding to the operating object; or the control is copied at the position of the located control within the fully transparent window and the copy is then moved into the region corresponding to the operating object, to enhance the animation effect and increase the user's enjoyment.
In this step, after the located control has been displayed on the display screen according to predetermined display parameters, if the operation of the operating object in the sensing region is detected to be a confirmation operation, the flow proceeds to step 144; if it is detected to be a cancellation operation, the flow proceeds to step 145.
Step 144: after the confirmation operation of the operating object is detected, trigger the function of the located control.
In this step, after the operation of the operating object is detected to be a confirmation operation, the function of the control at the target position is triggered, an interaction window of the control is displayed on the display screen, and the user interacts with the control through the interaction window. In the present embodiment, the confirmation operation includes: the operating object tapping the display screen, or dwelling in the sensing region for longer than the first set duration.
Step 145: after the cancellation operation of the operating object is detected, cancel the display of the control at the target position.
In this step, when the cancellation operation of the operating object is detected, if the initial position and the target position differ, the control returns to the initial position and to its state before being located; if they are identical, the cancelled control returns to its state before being located. Detecting the cancellation operation of the operating object includes: detecting that the operating object leaves the sensing region, detecting no confirmation operation within a second set duration, or detecting that the operating object has moved again. In practical applications, the second set duration may be 1 s, 1.5 s, or another user-defined or default duration.
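The three cancellation conditions just listed can be expressed as a single predicate. The event keys below (`left_region`, `moved_again`, `elapsed`, `confirmed`) are assumptions made for the sketch; the embodiment describes conditions, not a data format.

```python
def is_cancel(event, second_set_duration=1.5):
    """Return True if the event matches any cancellation condition:
    the operating object left the sensing region, moved again, or no
    confirmation arrived within the second set duration (assumed 1.5 s).
    """
    if event.get("left_region"):
        return True
    if event.get("moved_again"):
        return True
    if event.get("elapsed", 0) > second_set_duration and not event.get("confirmed"):
        return True
    return False
```

On a True result the control would be returned to its initial position and restored to its pre-locating state, per step 145.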
In the present embodiment, after the terminal detects the operating object in the sensing region, it moves the located control to the target position according to the positioning operation of the user, and after detecting a confirmation operation of the operating object in the sensing region, it triggers the function of the located control. The user can therefore operate every control on the display screen from within the sensing region alone, which ensures that the user can conveniently locate and trigger each control while holding the terminal device with one hand, and improves the flexibility and enjoyment of operating the controls on the display screen.
Figure 17 is a flowchart of a third embodiment of the control operating method of the present invention, and Figure 18 is a schematic diagram of locating a control by the focus of the eyeball in this embodiment. As shown in Figure 17, the flow of this embodiment includes the following steps:
Step 171: detect an operating object in the sensing region of the display screen.
In this step, the terminal monitors the sensing region in real time. When an operation is detected in the sensing region, if the terminal receives a user-triggered operation for locating controls with the eyes, or detects preset configuration information specifying eye-based locating, this indicates that the user wishes to perform the positioning operation with the eyes to locate a control on the display screen; the flow then proceeds to step 172.
Step 172: detect the eyeball of the user and locate the control at the position of the eyeball focus on the display screen.
In this step, as shown in Figure 18, the terminal detects the eyeball of the user and locates the control at the position of the eyeball focus on the display screen, which increases the enjoyment of operating the controls. After the user has located the desired control with the eyes, the flow proceeds to step 173.
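Mapping the eyeball focus to a control can be sketched as a hit test with a nearest-control fallback, which tolerates small gaze-estimation error. The control records and their keys (`id`, `box`) are hypothetical; the embodiment does not specify how gaze coordinates are resolved to a control.

```python
import math

def control_at_gaze(controls, gaze_xy):
    """Locate the control whose bounding box contains the gaze focus;
    fall back to the control with the nearest centre otherwise.

    Each control is an assumed dict {'id', 'box': (x0, y0, x1, y1)}.
    """
    gx, gy = gaze_xy
    for c in controls:
        x0, y0, x1, y1 = c["box"]
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return c                       # direct hit
    def centre(c):
        x0, y0, x1, y1 = c["box"]
        return ((x0 + x1) / 2, (y0 + y1) / 2)
    # fallback: nearest control centre to the estimated focus
    return min(controls, key=lambda c: math.dist(centre(c), gaze_xy))
```

The fallback matters in practice because gaze estimation is noisier than touch: a focus that lands just outside a control's bounds still locates the nearest control rather than nothing.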
Step 173: move the located control to the target position.
In this step, the located control is moved to the target position; if the initial position is identical to the target position, the control is displayed at its initial position in the graphical user interface according to the first preset display mode, so that the user can clearly observe and/or operate the located control. To make the located control convenient to operate, it may also be moved to a target position determined by an operation of the user or preset in advance; for example, it may be moved into the screen region corresponding to the position of the operating object, so that the user can manipulate it there directly. The control at the target position is displayed according to the second preset display mode. While the control moves from the initial position to the target position, it may be displayed dynamically according to the third preset display mode to enhance the animation effect and increase the user's enjoyment. After the located control has been moved to the target position, when the operation of the operating object in the sensing region is detected to be a confirmation operation, the flow proceeds to step 174; when it is detected to be a cancellation operation, the flow proceeds to step 175.
Step 174: after the confirmation operation of the operating object is detected, trigger the function of the located control.
In this step, after the operation of the operating object is detected to be a confirmation operation, the function of the located control is triggered, an interaction window of the control is displayed on the display screen, and the user interacts with the control through the interaction window. In the present embodiment, the confirmation operation includes: the operating object tapping the display screen, or dwelling in the sensing region for longer than the first set duration.
Step 175: after the cancellation operation of the operating object is detected, cancel the display of the control at the target position.
In this step, if a cancellation operation of the operating object is detected and the initial position differs from the target position, the control returns to the initial position and to its state before being located; if the positions are identical, the cancelled control returns to its state before being located.
In the present embodiment, the positioning operation is completed by the focus of the eyeball, the located control is moved to the target position, and a confirmation operation is then performed on the control at the target position by the operating object in the sensing region to trigger the function of the located control. The user can therefore operate every control on the display screen from within the sensing region alone, which ensures convenient one-handed operation of each control on the display screen of the terminal device and improves the flexibility and enjoyment of operating the controls.
Figure 19 is a structural schematic diagram of a first embodiment of the control operating apparatus of the present invention. As shown in Figure 19, the operating apparatus of this embodiment includes: a second detection module 191, a locating module 192, and a trigger module 193. The second detection module 191 detects an operating object in the sensing region of the display screen; the locating module 192 locates a control on the display screen according to the positioning operation of the user after the operating object is detected; and the trigger module 193 triggers the located control after a confirmation operation of the user in the sensing region is detected. The confirmation operation of the operating object in the sensing region specifically includes: the operating object tapping the display screen, or the dwell time of the operating object in the sensing region reaching the first set duration.
In the present embodiment, after the second detection module detects the operating object in the sensing region, the locating module locates a control displayed on the screen according to the positioning operation of the user; after a confirmation operation of the operating object in the sensing region is detected, the trigger module triggers the function of the located control. The user can therefore locate and trigger every control on the display screen from within the sensing region alone, which ensures convenient one-handed operation of each control on the display screen and also increases the enjoyment of operating the controls.
Figure 20 is a structural schematic diagram of a second embodiment of the control operating apparatus of the present invention. As shown in Figure 20, the operating apparatus of this embodiment further includes: a second receiving module 194, a second display module 195, and a second cancelling module 196. The second receiving module 194 receives a user-triggered operation for locating controls with the eyes, or detects preset configuration information specifying eye-based locating. The second display module 195 displays the control located by the locating module 192 on the display screen in a set manner; for example, it displays the located control with a changed display effect, or displays the display effect of the located control in the screen region corresponding to the position of the operating object. The second display module 195 may copy the display effect directly into the region corresponding to the operating object, or move it into that region from the position of the control. When the second detection module 191 detects a cancellation operation of the user on the currently located control, the second cancelling module 196 cancels the locating of the control; the cancellation operation of the user on the currently located control includes any one of the following: the second detection module 191 detecting that the operating object leaves the sensing region, detecting no confirmation operation of the operating object within the second set duration, or detecting that the operating object has moved.
Further, the second detection module 191 detecting the operating object may include any one of the following cases: detecting that the operating object touches the display screen; detecting that the operating object hovers in the hover sensing region of the display screen; detecting that the operating object touches the display screen and moves across it in a set manner; or detecting that the operating object hovers in the hover sensing region and moves within it in a set manner. The second detection module 191 also detects the eyeball of the user and, according to the eyeball focus on the display screen, locates the control at the focus position. The second detection module 191 detects the state of the operating object: if no movement of the operating object within the sensing region is detected, the locating module 192 locates a default control; if movement is detected, the locating module 192 locates a control displayed on the screen according to the movement track of the operating object. After the second detection module 191 detects a confirmation operation of the operating object in the sensing region, a touch event is generated and sent to the located control, or to the application program containing the located control; the touch event carries the identification information of the located control, and this information instructs the receiving end of the touch event to trigger the function of the corresponding control.
In the present embodiment, the locating module 192 locates the control along the moving direction of the movement track, taking the currently located control as the reference; if this movement is the first after the second detection module 191 detected the operating object, the currently located control is the default control. Alternatively, the locating module 192 locates the control nearest the end position of the movement track. The default control in the present embodiment may be one of the following: the most frequently used control on the display screen, the most recently used control, the control associated with the current operation, a preset control, or the control nearest the operating object.
In the present embodiment, after the second receiving module 194 receives a user-triggered operation for locating controls with the operating object, the second detection module 191 detects the state of the operating object; or, after the receiving module 194 detects preset configuration information for locating controls with the operating object, the second detection module 191 detects the state of the operating object.
In the present embodiment, after the second detection module detects the operating object in the sensing region, the locating module locates a control displayed on the screen according to the positioning operation of the user; after the detection module detects a confirmation operation of the operating object in the sensing region, the trigger module triggers the function of the located control. The user can therefore operate every control on the display screen from within the sensing region alone, which ensures that the user can conveniently locate and trigger each control while holding the terminal device with one hand, and improves the flexibility and enjoyment of operating the controls on the display screen.
Those skilled in the art will appreciate that the present invention may involve devices for performing one or more of the operations described in this application. Such a device may be specially designed and manufactured for the required purposes, or may comprise known devices in a general-purpose computer that is selectively activated or reconfigured by a program stored within it. Such a computer program may be stored in a computer-readable storage medium of a device (for example, a computer), or in any type of medium suitable for storing electronic instructions and coupled to a bus, the computer-readable medium including, but not limited to, any type of disk (including floppy disks, hard disks, optical discs, CD-ROMs and magneto-optical disks), random access memory (RAM), read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, magnetic cards or optical cards. A computer-readable storage medium includes any mechanism for storing or transmitting information in a form readable by a device (for example, a computer). For example, computer-readable media include random access memory (RAM), read-only memory (ROM), magnetic disk storage media, optical storage media, flash memory devices, and signals propagated in electrical, optical, acoustic or other form (such as carrier waves, infrared signals and digital signals).
Those skilled in the art will appreciate that each block in these structural diagrams and/or block diagrams and/or flow diagrams, and combinations of blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer or other programmable data processing apparatus to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing apparatus create means for implementing the functions specified in one or more blocks of the structural diagrams and/or block diagrams and/or flow diagrams.
Those skilled in the art will appreciate that the various operations, methods, steps in flows, measures and schemes discussed in the present invention may be alternated, changed, combined or deleted. Further, other operations, methods, steps in flows, measures and schemes discussed in the present invention may also be alternated, changed, rearranged, decomposed, combined or deleted. Further, steps, measures and schemes in the prior art corresponding to the various operations, methods and flows disclosed in the present invention may also be alternated, changed, rearranged, decomposed, combined or deleted.
The above are only some embodiments of the present invention. It should be pointed out that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and these improvements and modifications shall also fall within the protection scope of the present invention.

Claims (30)

1. A control display method in a graphical user interface, comprising the following steps:
receiving movement information of a user for operating a control;
moving the control to a target location in the graphical user interface according to the movement information; and
displaying the control at the target location in the graphical user interface.
2. The control display method in a graphical user interface according to claim 1, wherein the target location comprises any one of the following positions:
a position preset by the user, and a position determined according to the position of an operating article.
3. The control display method in a graphical user interface according to claim 2, wherein:
when the initial position at which the control is displayed is the same as the target location, the control is displayed at the initial position in the graphical user interface according to a first preset display mode; and
when the initial position at which the control is displayed is not the same as the target location, the control is moved from the initial position to the target location and displayed at the target location according to a second preset display mode.
4. The control display method in a graphical user interface according to claim 3, wherein, during the process of moving the control from the initial position to the target location, the control in the moving state is displayed according to a third preset display mode.
5. The control display method in a graphical user interface according to any one of claims 1-4, further comprising, after the control is moved from the initial position to the target location:
detecting an operation of the user; and
when a cancellation operation of the user is detected, or when no operation of the user is detected within a preset time, cancelling the display of the control at the target location, whereupon the control returns to the initial position.
6. the control in graphic user interface, is characterized in that, comprising: receiver module, mobile module and display module,
Described receiver module, for receiving the mobile message of control;
Described mobile module, for moving to the target location in graphic user interface by described control according to described mobile message;
Described display module, shows described control for the target location in described graphic user interface.
7. the control in graphic user interface according to claim 6, is characterized in that, described target location comprises any one position following:
The position that user presets and the position determined according to the position of operating article.
8. the control in graphic user interface according to claim 7, is characterized in that,
When the initial position showing described control is identical with described target location, described display module is preset display mode according to first and is shown control at initial position;
When the initial position showing described control is not identical with described target location, described control is moved to target location from initial position by described mobile module, and described display module presets display mode at described target location display control according to second.
9. the control in graphic user interface according to claim 8, it is characterized in that, move to the process of target location by described control from initial position at described mobile module, described display module is preset display mode according to the 3rd and is shown the control be in mobile status.
10. according to the control in one of any described graphic user interface of claim 6-9, it is characterized in that, also comprise: detection module and cancel module,
Described detection module is for detecting the operation of user;
When described detection module detects the cancellation operation of user, or when described detection module does not detect the operation of user within the time of presetting, described cancellation module cancels the display of described control in target location, and described control will turn back to initial position.
11. A method of operating a control, comprising:
detecting an operating article in a sensing region of a display screen;
after the operating article is detected, locating a control on the display screen according to a positioning operation of a user; and
after a confirmation operation of the user in the sensing region of the display screen is detected, triggering the located control.
12. The method of operating a control according to claim 11, further comprising, after locating the control on the display screen according to the positioning operation of the user:
moving the located control from an initial position to a target location;
wherein the target location comprises any one of the following positions: a position preset by the user, and a position determined according to the position of the operating article.
13. The method of operating a control according to claim 12, wherein:
if the initial position is the same as the target location, the located control is displayed according to a first preset display mode; and
if the initial position is not the same as the target location, the located control is displayed according to a second preset display mode.
14. The method of operating a control according to claim 12, wherein moving the located control to the target location comprises:
moving the located control in a rectilinear or curvilinear manner; and
displaying the control according to a third preset display mode during the moving process.
15. The method of operating a control according to claim 11, wherein detecting the operating article comprises:
detecting that the operating article touches the display screen; or
detecting that the operating article hovers in a hover-sensing region of the display screen; or
detecting that the operating article touches the display screen and moves on the display screen in a set manner; or
detecting that the operating article hovers in the hover-sensing region of the display screen and moves in the hover-sensing region in a set manner.
16. The method of operating a control according to claim 11, wherein the positioning operation of the user comprises:
the user locating a control with the operating article; or
detecting an eyeball of the user, and locating the control at the position of the focus of the user's eyeball on the display screen.
17. The method of operating a control according to claim 16, further comprising, before detecting the eyeball of the user:
receiving a positioning operation signal, triggered by the user, for locating a control by eyeball; or
receiving positioning operation information, preset by the user, for locating a control by eyeball.
18. The method of operating a control according to claim 11, wherein locating the control on the display screen according to the positioning operation of the user comprises:
if it is detected that the operating article does not move in the sensing region of the display screen, locating the default control; or
if it is detected that the operating article moves in the sensing region of the display screen, locating the control on the display screen according to the movement track.
19. The method of operating a control according to claim 18, wherein, according to the moving direction of the movement track and taking the currently located control as a reference, the control in the moving direction is located;
if this movement is the first movement after the operating article is detected, the currently located control is the default control; or
the control nearest to the end position of the movement track is located according to the end position of the movement track.
20. The method of operating a control according to claim 19, wherein the default control comprises:
the most frequently used control among the controls displayed on the display screen; or
the most recently used control among the controls displayed on the display screen; or
the control, among the controls displayed on the display screen, associated with the current operation; or
a control set in advance among the controls displayed on the display screen; or
the control, among the controls displayed on the display screen, nearest to the operating article.
21. The method of operating a control according to claim 18 or 19, further comprising, before locating the control on the display screen according to the positioning operation of the user:
receiving a positioning operation signal, triggered by the user, for locating a control with the operating article; or
receiving positioning operation information, preset by the user, for locating a control with the operating article.
22. The method of operating a control according to claim 11, wherein detecting the confirmation operation of the user in the sensing region of the display screen specifically comprises:
the operating article clicking on the display screen; or
the dwell time of the operating article in the sensing region of the display screen reaching a first set duration.
23. The method of operating a control according to claim 12, further comprising, after moving the located control from the initial position to the target location:
when a cancellation operation of the operating article is detected, if the initial position is not the same as the target location, returning the control to the initial position and restoring it to its state before being located; and if the initial position is the same as the target location, restoring the cancelled control to its state before being located.
24. The method of operating a control according to claim 23, wherein detecting the cancellation operation of the operating article comprises:
detecting that the operating article leaves the sensing region of the display screen; or
detecting no confirmation operation of the operating article within a second set duration; or
detecting that the operating article has moved.
25. The method of operating a control according to claim 11, wherein the control on the display screen comprises:
a control within a set region range on the display screen;
wherein the set region range comprises: the region beyond a set range around the position on the display screen corresponding to the operating article.
26. The method of operating a control according to claim 11, wherein triggering the located control comprises:
generating a touch event and sending the touch event to the located control, or to the application program comprising the located control, wherein the touch event comprises identification information of the located control, and the identification information instructs the receiving end of the touch event to trigger the control corresponding to the identification information.
27. A device for operating a control, comprising: a second detection module, a locating module and a trigger module, wherein:
the second detection module is configured to detect an operating article in the sensing region of a display screen;
the locating module is configured to, after the operating article is detected, locate a control on the display screen according to a positioning operation of a user; and
the trigger module is configured to, after a confirmation operation of the user in the sensing region of the display screen is detected, trigger the located control.
28. The device for operating a control according to claim 27, further comprising:
a second display module, configured to display the located control according to a first preset display mode.
29. The device for operating a control according to claim 28, further comprising: a second moving module, configured to move the located control from an initial position to a target location;
wherein the target location comprises any one of the following positions: a position determined according to the position of the operating article, and a position preset by the user.
30. The device for operating a control according to claim 27, further comprising: a second cancellation module, configured to, when a cancellation operation of the operating article is detected: if the initial position is not the same as the target location, return the control to the initial position and restore it to its state before being located; and if the initial position is the same as the target location, restore the cancelled control to its state before being located.
CN201310409569.9A 2013-09-10 2013-09-10 Control in graphical user interface, display method as well as method and device for operating control Pending CN104423870A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310409569.9A CN104423870A (en) 2013-09-10 2013-09-10 Control in graphical user interface, display method as well as method and device for operating control

Publications (1)

Publication Number Publication Date
CN104423870A 2015-03-18

Family

ID=52972998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310409569.9A Pending CN104423870A (en) 2013-09-10 2013-09-10 Control in graphical user interface, display method as well as method and device for operating control

Country Status (1)

Country Link
CN (1) CN104423870A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060209013A1 (en) * 2005-03-17 2006-09-21 Mr. Dirk Fengels Method of controlling a machine connected to a display by line of vision
CN102479039A (en) * 2010-11-30 2012-05-30 汉王科技股份有限公司 Control method of touch device
CN102479027A (en) * 2010-11-24 2012-05-30 中兴通讯股份有限公司 Control method and device of application icons on touch screen
JP2013073466A (en) * 2011-09-28 2013-04-22 Kyocera Corp Device, method, and program
CN103186240A (en) * 2013-03-25 2013-07-03 成都西可科技有限公司 High-pixel camera-based method for detecting eye movement
CN103257818A (en) * 2012-02-20 2013-08-21 联想(北京)有限公司 Method and device for one-handed operation of icon on touch screen
CN104346085A (en) * 2013-07-25 2015-02-11 北京三星通信技术研究有限公司 Control object operation method and device and terminal device

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866189A (en) * 2015-04-27 2015-08-26 深圳市金立通信设备有限公司 Terminal
CN104834521A (en) * 2015-04-27 2015-08-12 深圳市金立通信设备有限公司 Application interface regulating method
CN106155562A (en) * 2015-04-28 2016-11-23 天脉聚源(北京)科技有限公司 A kind of intelligent terminal's interface control method and device
CN104899021B (en) * 2015-05-11 2018-06-19 广东美晨通讯有限公司 The processing method of icon in display terminal and its interface
CN104899021A (en) * 2015-05-11 2015-09-09 广东美晨通讯有限公司 Display terminal and processing method for icon on interface of same
CN104915099A (en) * 2015-06-16 2015-09-16 努比亚技术有限公司 Icon sorting method and terminal equipment
CN105677498B (en) * 2015-12-29 2018-05-25 山东大学 The optimization method that View controls state preserves on a kind of Android system
CN105677498A (en) * 2015-12-29 2016-06-15 山东大学 Optimization method of View control state saving on Android system
CN105824508A (en) * 2016-04-01 2016-08-03 广东欧珀移动通信有限公司 Display method of terminal interface and terminal equipment
CN105824508B (en) * 2016-04-01 2019-03-26 Oppo广东移动通信有限公司 A kind of display methods and terminal device of terminal interface
CN107368230A (en) * 2016-05-13 2017-11-21 中兴通讯股份有限公司 A kind of method and apparatus of interface element movement
CN107422947A (en) * 2016-05-24 2017-12-01 中兴通讯股份有限公司 The moving method and device of control
CN106383642A (en) * 2016-09-09 2017-02-08 北京金山安全软件有限公司 Display method and related device for control of media playing interface
CN107193542A (en) * 2017-03-30 2017-09-22 腾讯科技(深圳)有限公司 Method for information display and device
CN111035221A (en) * 2018-10-11 2020-04-21 夏普株式会社 Operation display device and cooking device
CN111035221B (en) * 2018-10-11 2022-02-11 夏普株式会社 Operation display device and cooking device
CN109521932A (en) * 2018-11-06 2019-03-26 斑马网络技术有限公司 Voice control display processing method, device, vehicle, storage medium and equipment
CN109947546A (en) * 2019-03-13 2019-06-28 北京乐我无限科技有限责任公司 A kind of task executing method, device, electronic equipment and storage medium
CN109947546B (en) * 2019-03-13 2021-08-20 北京乐我无限科技有限责任公司 Task execution method and device, electronic equipment and storage medium
CN110162358A (en) * 2019-04-09 2019-08-23 广州小鹏汽车科技有限公司 A kind of method and system for the animation effect indicating component
CN111290812A (en) * 2020-01-20 2020-06-16 北京无限光场科技有限公司 Application control display method and device, terminal and storage medium
CN111290812B (en) * 2020-01-20 2024-02-06 北京有竹居网络技术有限公司 Display method, device, terminal and storage medium of application control
WO2022022443A1 (en) * 2020-07-28 2022-02-03 华为技术有限公司 Method for moving control and electronic device
CN112347273A (en) * 2020-11-05 2021-02-09 北京字节跳动网络技术有限公司 Audio playing method and device, electronic equipment and storage medium
CN113325986A (en) * 2021-05-28 2021-08-31 维沃移动通信(杭州)有限公司 Program control method, program control device, electronic device and readable storage medium
CN113325986B (en) * 2021-05-28 2022-08-26 维沃移动通信(杭州)有限公司 Program control method, program control device, electronic device and readable storage medium
CN113750509A (en) * 2021-08-19 2021-12-07 珠海强源体育用品有限公司 Gun shot ranking dynamic display method, storage medium and display device

Similar Documents

Publication Publication Date Title
CN104423870A (en) Control in graphical user interface, display method as well as method and device for operating control
US8443302B2 (en) Systems and methods of touchless interaction
KR102131829B1 (en) Mobile terminal and method for controlling thereof
CA2959683C (en) Inactive region for touch surface based on contextual information
CN104346085A (en) Control object operation method and device and terminal device
US7411575B2 (en) Gesture recognition method and touch system incorporating the same
US20130241832A1 (en) Method and device for controlling the behavior of virtual objects on a display
US9015584B2 (en) Mobile device and method for controlling the same
CN106605202A (en) Handedness detection from touch input
CN109791468A (en) User interface for both hands control
CN108710469A (en) The startup method and mobile terminal and medium product of a kind of application program
KR20120126255A (en) Method and apparatus for controlling display of item
CN103116453A (en) Operation management method and operation management device of graphic object
CN102855068A (en) Interface operation control method and device and electronic equipment
CN104704454A (en) Terminal and method for processing multi-point input
US11221729B1 (en) Tracking and restoring pointer positions among applications
CN103777788A (en) Control method and electronic devices
Rehman et al. An architecture for interactive context-aware applications
CN102566930A (en) Method and device for accessing of application platform
EP2932365A1 (en) Touch screen device for handling lists
CN106033301A (en) An application program desktop management method and a touch screen terminal
CN109814794A (en) A kind of interface display method and terminal device
KR101894581B1 (en) Mobile terminal and method for controlling of the same
KR101371524B1 (en) Mouse Device For Controlling Remote Access

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20190531