CN103150108A - Equipment screen component moving method and device, and electronic equipment


Info

Publication number: CN103150108A
Application number: CN201310046294.7 (filed by Huawei Technologies Co Ltd)
Granted publication: CN103150108B
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 匡俊, 唐东, 杨柳
Assignee: Huawei Technologies Co Ltd
Legal status: granted; currently active

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention applies to the field of communication technology and provides a device screen component moving method and apparatus, and an electronic device. The method comprises the following steps: acquiring a first gesture operation of a touch object on the device screen, the first gesture operation being used for moving components from a source display area of the device screen to a target display area; parsing the first gesture operation to determine the to-be-moved components corresponding to the first gesture operation and to calculate the center position of the gesture region corresponding to the first gesture operation; requesting a target display area on the device screen, centered on the center position of the gesture region corresponding to the first gesture operation; and displaying the to-be-moved components in the target display area. According to the invention, components such as menu icons or buttons on the device screen can be arranged according to the user's personal preference, making it convenient for the user to move application icons or buttons to a position that is easy to operate.

Description

Device screen component moving method, apparatus, and electronic device
Technical field
The present invention belongs to the field of communication technology, and in particular relates to a device screen component moving method, an apparatus, and an electronic device.
Background art
With the development of terminal technology, touch-screen terminal devices (such as mobile phones, tablets, and digital photo frames) have gradually become widespread, terminal screens have grown larger and larger, and ease of operation has become increasingly important.
In the prior art, the ever larger screens of touch-screen terminal devices give users a better visual experience. However, the positions of components such as icons or buttons on the touch screen are all fixed and cannot be modified by the user. Therefore, on an ever larger terminal screen, a user who needs to find an application or perform an operation often has to page through the screen repeatedly, tapping and searching, which is very inconvenient.
In summary, the positions of components such as menu icons or buttons on a prior-art device screen are fixed, and the user cannot modify them.
Summary of the invention
An object of the embodiments of the present invention is to provide a device screen component moving method, intended to solve, at least to some extent, the prior-art problem that the positions of components such as menu icons or buttons on a device screen are fixed and cannot be arranged according to the user's individual needs.
The embodiments of the present invention provide the following technical solutions:
A first aspect of the present invention provides a device screen component moving method. The method is applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component. The method comprises:
acquiring a first gesture operation of a touch object on the device screen, the first gesture operation being used for moving components from a source display area of the device screen to a target display area;
parsing the first gesture operation, determining the to-be-moved components corresponding to the first gesture operation, and calculating the center of the gesture region corresponding to the first gesture operation;
requesting a target display area on the device screen, centered on the center of the gesture region corresponding to the first gesture operation; and
displaying the to-be-moved components in the target display area.
In a first possible implementation of the first aspect, parsing the first gesture operation and determining the to-be-moved components corresponding to the first gesture operation comprises: parsing the first gesture operation and, according to a preset correspondence between gesture operations and component counts, determining that the to-be-moved components corresponding to the first gesture operation are all or part of the components on the device screen; or parsing the first gesture operation and, according to a preset correspondence between gesture operations and component types, determining that the components of the component type corresponding to the first gesture operation are the to-be-moved components; or parsing the first gesture operation and, according to a preset correspondence between gesture operations and component functions, determining that the components of the component function corresponding to the first gesture operation are the to-be-moved components.
With reference to the first possible implementation of the first aspect, in a second possible implementation, after acquiring the first gesture operation of the touch object on the device screen, the method further comprises: judging whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area, and if so, performing the step of parsing the first gesture operation.
With reference to the first and second possible implementations of the first aspect, in a third possible implementation, after the step of requesting the target display area on the device screen centered on the center of the gesture region corresponding to the first gesture operation, the method further comprises: when the requested target display area extends beyond the display area of the device screen, translating the target display area until the target display area is displayed entirely within the display area of the device screen.
With reference to the first aspect or any of the foregoing possible implementations of the first aspect, in a fourth possible implementation, displaying the to-be-moved components in the target display area is specifically: scaling and arranging the to-be-moved components according to the size of the requested target display area, and then displaying them in the target display area.
With reference to the first aspect or any of the foregoing possible implementations of the first aspect, in a fifth possible implementation, after the step of displaying the to-be-moved components in the target display area, the method further comprises: acquiring a second gesture operation of the touch object on the device screen, the second gesture operation being used for moving components from the target display area of the device screen back to the source display area; and
displaying the to-be-moved components in the source display area according to the second gesture operation.
With reference to the first aspect or any of the foregoing possible implementations of the first aspect, in a sixth possible implementation, after the step of acquiring the second gesture operation of the touch object on the device screen, the method further comprises: judging whether the second gesture operation is a gesture operation for moving components from the target display area of the device screen to the source display area, and if so, performing the step of displaying the to-be-moved components in the source display area according to the second gesture operation.
A second aspect of the present invention provides a device screen component moving apparatus. The apparatus is applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component. The apparatus comprises:
an acquiring unit, configured to acquire a first gesture operation of a touch object on the device screen, the first gesture operation being used for moving components from a source display area of the device screen to a target display area;
a parsing unit, configured to parse the first gesture operation, determine the to-be-moved components corresponding to the first gesture operation, and calculate the center of the gesture region corresponding to the first gesture operation;
a requesting unit, configured to request a target display area on the device screen, centered on the center of the gesture region corresponding to the first gesture operation; and
a display unit, configured to display the to-be-moved components in the target display area.
In a first possible implementation of the second aspect, the parsing unit is specifically configured to parse the first gesture operation and, according to a preset correspondence between gesture operations and component counts, determine that the to-be-moved components corresponding to the first gesture operation are all or part of the components on the device screen; or
parse the first gesture operation and, according to a preset correspondence between gesture operations and component types, determine that the components of the component type corresponding to the first gesture operation are the to-be-moved components; or
parse the first gesture operation and, according to a preset correspondence between gesture operations and component functions, determine that the components of the component function corresponding to the first gesture operation are the to-be-moved components.
With reference to the first possible implementation of the second aspect, in a second possible implementation, the apparatus further comprises a first gesture judging unit, configured to judge whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area, and if so, to trigger the parsing unit.
With reference to the first and second possible implementations of the second aspect, in a third possible implementation, the apparatus further comprises a translation unit, configured to translate the target display area when the requested target display area extends beyond the display area of the device screen, until the target display area is displayed entirely within the display area of the device screen.
With reference to the second aspect or any of the foregoing possible implementations of the second aspect, in a fourth possible implementation, the display unit is specifically configured to scale and arrange the to-be-moved components according to the size of the requested target display area, and then display them in the target display area.
With reference to the second aspect or any of the foregoing possible implementations of the second aspect, in a fifth possible implementation, the acquiring unit is further configured to acquire a second gesture operation of the touch object on the device screen, the second gesture operation being used for moving components from the target display area of the device screen back to the source display area; the display unit is further configured to display the to-be-moved components in the source display area according to the second gesture operation.
With reference to the second aspect or any of the foregoing possible implementations of the second aspect, in a sixth possible implementation, the apparatus further comprises a second gesture judging unit, configured to judge whether the second gesture operation is a gesture operation for moving components from the target display area of the device screen back to the source display area, and if so, to trigger the display unit.
A third aspect of the present invention provides an electronic device comprising a device screen and a processing module, the processing module being connected to the device screen, wherein:
the device screen is configured to display a screen state in which the components are in the source display area, a screen state in which the components are moving from the source display area to the target display area, or a screen state in which the components are in the target display area; the device screen is further configured to sense the sliding of a touch point of a touch object on the device screen and to send a response signal to the processing module;
the processing module is configured to receive the response signal sent by the device screen; acquire, according to the response signal, a first gesture operation of the touch object on the device screen, the first gesture operation being used for moving components from the source display area of the device screen to the target display area; parse the first gesture operation, determine the to-be-moved components corresponding to the first gesture operation, and calculate the center of the gesture region corresponding to the first gesture operation; request a target display area on the device screen, centered on the center of the gesture region corresponding to the first gesture operation; and display the to-be-moved components in the target display area.
In a first possible implementation of the third aspect, the processing module is specifically configured to receive the response signal sent by the device screen; acquire, according to the response signal, a first gesture operation of the touch object on the device screen, the first gesture operation being used for moving components from the source display area of the device screen to the target display area; parse the first gesture operation and, according to a preset correspondence between gesture operations and component counts, determine that the to-be-moved components corresponding to the first gesture operation are all or part of the components on the device screen, or parse the first gesture operation and, according to a preset correspondence between gesture operations and component types, determine that the components of the component type corresponding to the first gesture operation are the to-be-moved components, or parse the first gesture operation and, according to a preset correspondence between gesture operations and component functions, determine that the components of the component function corresponding to the first gesture operation are the to-be-moved components; calculate the center of the gesture region corresponding to the first gesture operation; request a target display area on the device screen centered on that center; and display the to-be-moved components in the target display area.
With reference to the first and second possible implementations of the third aspect, in a third possible implementation, the processing module is further configured to judge whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area, and if so, to parse the first gesture operation.
With reference to the third aspect or any of the foregoing possible implementations of the third aspect, in a fourth possible implementation, the processing module is further configured to translate the target display area when the requested target display area extends beyond the display area of the device screen, until the target display area is displayed entirely within the display area of the device screen.
Compared with the prior art, the embodiments of the present invention have the following beneficial effect: the terminal device acquires a first gesture operation of a touch object on the device screen, parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, calculates the center of the gesture region corresponding to the first gesture operation, requests a target display area on the device screen centered on that center, and displays the to-be-moved components in the target display area. Components such as menu icons or buttons on the device screen can thus be arranged according to the user's personal preference, so that the user can move application icons or buttons to a position that is convenient to operate, which makes the device easier to use. In addition, the user only needs to input a single gesture action to move the components, which is simple and convenient.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
Fig. 1 is a flowchart of the device screen component moving method provided by Embodiment one of the present invention;
Fig. 2 is a flowchart of the device screen component moving method provided by Embodiment two of the present invention;
Fig. 3a is a schematic diagram of moving components provided by Embodiment two of the present invention;
Fig. 3b is a schematic diagram of restoring components provided by Embodiment two of the present invention;
Fig. 4 is a structural diagram of the device screen component moving apparatus provided by virtual apparatus embodiment one of the present invention;
Fig. 5 is a structural diagram of the device screen component moving apparatus provided by virtual apparatus embodiment two of the present invention;
Fig. 6 is a schematic structural diagram of electronic device embodiment one of the present invention;
Fig. 7 is a schematic structural diagram of electronic device embodiment two of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and are not intended to limit the present invention.
In the embodiments of the present invention, a terminal device acquires a first gesture operation of a touch object on the device screen, parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, requests a target display area on the device screen according to the parsing result, and displays the to-be-moved components in the target display area, so that components such as menu icons or buttons on the device screen can be arranged according to the user's personal preference.
The implementation of the present invention is described in detail below with reference to specific embodiments.
Embodiment one
Fig. 1 shows a flowchart of the device screen component moving method provided by Embodiment one of the present invention. The method is applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component. The method is described in detail as follows.
In S101, a first gesture operation of a touch object on the device screen is acquired, the first gesture operation being used for moving components from a source display area of the device screen to a target display area.
In this embodiment, the device screen may be a touch screen.
In this embodiment, the touch object may be a user's finger, a stylus, or any other object that can slide on the device screen.
In this embodiment, the components may be menus, icons, buttons, application lists, or the like on the device screen.
In this embodiment, the first gesture operation is distinct from the other touch gesture operations of the terminal device; it may be a double-tap operation, drawing a circle, drawing a rectangle, or the like, and the present invention imposes no limitation in this regard.
In this embodiment, the source display area is the area where the screen components are originally located on the device screen, and the target display area is smaller than the display area of the device screen. Preferably, the target display area may be located in the lower right part of the device screen for ease of use.
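Purely for illustration, the following Kotlin sketch shows one minimal data model for a screen whose display area holds movable components; all type and field names are hypothetical and are not taken from the patent.

```kotlin
// Illustrative sketch only; names are hypothetical and not part of the patent.
data class Rect(val left: Int, val top: Int, val width: Int, val height: Int) {
    val right get() = left + width
    val bottom get() = top + height
}

data class Component(val id: String, var bounds: Rect)      // e.g. a menu icon or a button

data class Screen(val displayArea: Rect, val components: MutableList<Component>)

fun main() {
    val screen = Screen(
        displayArea = Rect(0, 0, 1080, 1920),
        components = mutableListOf(
            Component("dialer", Rect(40, 40, 144, 144)),
            Component("camera", Rect(240, 40, 144, 144))
        )
    )
    println("display area holds ${screen.components.size} components")
}
```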
In S102, the first gesture operation is parsed, the to-be-moved components corresponding to the first gesture operation are determined, and the center of the gesture region corresponding to the first gesture operation is calculated.
In this embodiment, the gesture region may be the region enclosed by the gesture motion track, and the center of the gesture region may be the geometric center of that region. For example, when the first gesture operation draws a rectangle, the gesture region is the region occupied by the rectangle and its center is the intersection of the rectangle's diagonals; when the first gesture operation draws a circle, the gesture region is the region occupied by the circle and its center is the center of the circle.
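As one possible reading of this step, assuming the gesture track is delivered as a list of sampled touch points, the center of the gesture region can be approximated by the center of the track's bounding box, which coincides with the diagonal intersection of a drawn rectangle and with the center of a drawn circle. The Kotlin sketch below illustrates the idea; the names are hypothetical.

```kotlin
// Sketch: approximate the center of the gesture region from a sampled gesture track.
data class Point(val x: Float, val y: Float)

fun gestureCenter(track: List<Point>): Point {
    require(track.isNotEmpty()) { "gesture track must contain at least one sampled point" }
    // Center of the track's bounding box: matches the diagonal intersection of a drawn
    // rectangle and the center of a drawn circle.
    val minX = track.minOf { it.x }
    val maxX = track.maxOf { it.x }
    val minY = track.minOf { it.y }
    val maxY = track.maxOf { it.y }
    return Point((minX + maxX) / 2f, (minY + maxY) / 2f)
}

fun main() {
    val roughCircle = listOf(Point(100f, 150f), Point(150f, 100f), Point(200f, 150f), Point(150f, 200f))
    println(gestureCenter(roughCircle))   // Point(x=150.0, y=150.0)
}
```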
In S103, a target display area is requested on the device screen, centered on the center of the gesture region corresponding to the first gesture operation.
In this embodiment, the target display area may be circular, square, rectangular, or the like, and may be set according to the user's needs.
In S104, the to-be-moved components are displayed in the target display area.
In this embodiment, in order to adapt to device screens of different sizes, the to-be-moved components may be scaled and arranged according to the size of the requested target display area before being displayed in the target display area, and the scale may be adapted to the size of the device screen.
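One possible way to realize this scaling is sketched below, under the assumption that a single uniform scale factor is derived from the ratio between the requested target area and the components' original bounding box; all names are hypothetical.

```kotlin
// Sketch: scale the to-be-moved components so they fit inside the requested target area.
data class Rect(val left: Int, val top: Int, val width: Int, val height: Int)

fun fitScale(sourceBounds: Rect, targetArea: Rect): Float {
    val sx = targetArea.width.toFloat() / sourceBounds.width
    val sy = targetArea.height.toFloat() / sourceBounds.height
    return minOf(sx, sy, 1f)              // uniform scale, never enlarging the components
}

fun scaleInto(componentBounds: Rect, sourceBounds: Rect, targetArea: Rect): Rect {
    val s = fitScale(sourceBounds, targetArea)
    // Map the component's offset inside the source area into the target area.
    val left = targetArea.left + ((componentBounds.left - sourceBounds.left) * s).toInt()
    val top = targetArea.top + ((componentBounds.top - sourceBounds.top) * s).toInt()
    return Rect(left, top, (componentBounds.width * s).toInt(), (componentBounds.height * s).toInt())
}

fun main() {
    val source = Rect(0, 0, 1080, 1920)     // original layout occupies the whole screen
    val target = Rect(540, 960, 400, 600)   // requested target display area
    println(scaleInto(Rect(40, 40, 144, 144), source, target))
}
```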
In this embodiment, the terminal device acquires a first gesture operation of a touch object on the device screen, parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, calculates the center of the gesture region corresponding to the first gesture operation, requests a target display area on the device screen centered on that center, and displays the to-be-moved components in the target display area. Components such as menu icons or buttons on the device screen can thus be arranged according to the user's personal preference, so that the user can move application icons or buttons to a position that is convenient to operate, which makes the device easier to use. In addition, the user only needs to input a single gesture action to move the components, which is simple and convenient.
Embodiment two
Fig. 2 shows a flowchart of the device screen component moving method provided by Embodiment two of the present invention. The method is applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component. The method is described in detail as follows.
In S201, a first gesture operation of a touch object on the device screen is acquired, the first gesture operation being used for moving components from a source display area of the device screen to a target display area.
In S202, it is judged whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area; if so, S203 is performed; if not, the process ends.
In this embodiment, after the first gesture operation of the touch object on the device screen is acquired, judging whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area prevents icons on the device screen from being moved by mistake, which would affect the user's use.
In S203, the first gesture operation is parsed, the to-be-moved components corresponding to the first gesture operation are determined, and the center of the gesture region corresponding to the first gesture operation is calculated.
Optionally, parsing the first gesture operation in S203 and determining the to-be-moved components corresponding to the first gesture operation may be realized as follows: the first gesture operation is parsed, and according to a preset correspondence between gesture operations and component counts, the to-be-moved components corresponding to the first gesture operation are determined to be all or part of the components on the device screen.
In this embodiment, a correspondence between gesture operations and component counts may be established, so that all or part of the components can be moved according to the gesture operation. For example, gesture operation a may correspond to all components, and gesture operation b may correspond to part of the components, for example the components in the first row of the screen.
Optionally, parsing the first gesture operation in S203 and determining the to-be-moved components corresponding to the first gesture operation may also be realized as follows: the first gesture operation is parsed, and according to a preset correspondence between gesture operations and component types, the components of the component type corresponding to the first gesture operation are determined to be the to-be-moved components.
In this embodiment, a correspondence between gesture operations and different component types may be established, so that components of different types can be moved according to the gesture operation. For example, gesture operation a may correspond to icon-type components, and gesture operation b may correspond to button-type components.
Optionally, parsing the first gesture operation in S203 and determining the to-be-moved components corresponding to the first gesture operation may further be realized as follows: the first gesture operation is parsed, and according to a preset correspondence between gesture operations and component functions, the components of the component function corresponding to the first gesture operation are determined to be the to-be-moved components.
In this embodiment, a correspondence between gesture operations and different component functions may be established. For example, gesture operation a may correspond to game components, gesture operation b may correspond to communication components, and gesture operation c may correspond to entertainment components.
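The three preset correspondences described above (gesture to component count, gesture to component type, gesture to component function) could be represented, purely as an illustrative sketch with hypothetical gesture and category names, by a table that maps each recognized gesture to a predicate over components:

```kotlin
// Sketch: preset correspondences between recognized gestures and the components they select.
enum class Gesture { DOUBLE_TAP, DRAW_CIRCLE, DRAW_RECTANGLE }            // hypothetical gesture set
enum class ComponentType { ICON, BUTTON }
enum class ComponentFunction { GAME, COMMUNICATION, ENTERTAINMENT }

data class Component(val id: String, val type: ComponentType, val function: ComponentFunction)

// Each preset gesture maps to a predicate deciding which components are "to be moved".
val selectionTable: Map<Gesture, (Component) -> Boolean> = mapOf(
    Gesture.DOUBLE_TAP to { _: Component -> true },                                     // count: all components
    Gesture.DRAW_CIRCLE to { c: Component -> c.type == ComponentType.ICON },            // by component type
    Gesture.DRAW_RECTANGLE to { c: Component -> c.function == ComponentFunction.COMMUNICATION } // by function
)

fun componentsToMove(gesture: Gesture, all: List<Component>): List<Component> {
    val selector = selectionTable[gesture] ?: return emptyList()
    return all.filter(selector)
}

fun main() {
    val all = listOf(
        Component("dialer", ComponentType.ICON, ComponentFunction.COMMUNICATION),
        Component("game1", ComponentType.ICON, ComponentFunction.GAME)
    )
    println(componentsToMove(Gesture.DRAW_RECTANGLE, all).map { it.id })   // [dialer]
}
```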
In S204, a target display area is requested on the device screen, centered on the center of the gesture region corresponding to the first gesture operation.
Optionally, in order to make full use of the space of the device screen, the size of the requested target display area may be determined adaptively according to the number of to-be-moved components: when there are more components, a larger target display area may be requested, and when there are fewer components, a smaller target display area may be requested.
In this embodiment, a target display area of fixed size may also be requested.
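A minimal sketch of the adaptive sizing mentioned above, assuming square icon cells of a known size arranged on a roughly square grid; a fixed-size target area would simply skip this computation. The names and the cell size are hypothetical.

```kotlin
// Sketch: pick the size of the requested target display area from the number of components.
import kotlin.math.ceil
import kotlin.math.sqrt

data class Size(val width: Int, val height: Int)

fun targetAreaSize(componentCount: Int, cellPx: Int = 160): Size {
    // Arrange the components on a roughly square grid of cellPx-sized cells,
    // so more components lead to a larger requested area.
    val columns = ceil(sqrt(componentCount.toDouble())).toInt().coerceAtLeast(1)
    val rows = ceil(componentCount / columns.toDouble()).toInt().coerceAtLeast(1)
    return Size(columns * cellPx, rows * cellPx)
}

fun main() {
    println(targetAreaSize(5))    // Size(width=480, height=320): 3 columns x 2 rows
    println(targetAreaSize(16))   // Size(width=640, height=640): 4 columns x 4 rows
}
```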
Preferably, if the center of the gesture region is close to an edge of the device screen, the target display area for the components may not be fully displayable on the device screen. In that case the target display area is translated so that it can be fully displayed on the device screen. Specifically, when the target display area extends beyond the display area of the device screen, the target display area is translated until it is displayed entirely within the display area of the device screen. For example, an edge of the target display area that extends beyond the display area of the device screen may be moved until it coincides with the edge of the device screen, which ensures that the requested target display area can be fully displayed while making full use of the available space of the device screen.
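The translation described here can be sketched as shifting the requested rectangle until it lies entirely inside the screen, assuming the target area is no larger than the screen; any edge that overshoots is moved back onto the corresponding screen edge. The names are hypothetical.

```kotlin
// Sketch: translate a requested target area so it is displayed entirely on the screen.
data class Rect(val left: Int, val top: Int, val width: Int, val height: Int) {
    val right get() = left + width
    val bottom get() = top + height
}

fun translateIntoScreen(target: Rect, screen: Rect): Rect {
    // Assumes the target area is no larger than the screen in either dimension.
    var left = target.left
    var top = target.top
    if (left < screen.left) left = screen.left                                     // overshoots the left edge
    if (top < screen.top) top = screen.top                                         // overshoots the top edge
    if (left + target.width > screen.right) left = screen.right - target.width     // overshoots the right edge
    if (top + target.height > screen.bottom) top = screen.bottom - target.height   // overshoots the bottom edge
    return Rect(left, top, target.width, target.height)
}

fun main() {
    val screen = Rect(0, 0, 1080, 1920)
    // A gesture near the lower-right corner pushes the area off-screen; it is shifted back.
    println(translateIntoScreen(Rect(900, 1700, 400, 600), screen))   // Rect(left=680, top=1320, ...)
}
```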
In S205, the to-be-moved components are displayed in the target display area.
In this embodiment, the to-be-moved components may be displayed directly in the target display area. Preferably, the to-be-moved components may also be scaled and arranged according to the size of the requested target display area before being displayed in the target display area, so as to make fuller use of the space of the target display area.
In S206, a second gesture operation of the touch object on the device screen is acquired, the second gesture operation being used for moving components from the target display area of the device screen back to the source display area.
In this embodiment, the second gesture operation is distinct from the other touch gesture operations of the terminal device; it may be, for example, a double tap, drawing a circle, or drawing a rectangle in a blank area of the device screen, and these examples are not intended to limit the present invention.
In S207, it is judged whether the second gesture operation is a gesture operation for moving components from the target display area of the device screen back to the source display area; if so, S208 is performed; if not, the process ends.
In this embodiment, after the second gesture operation of the touch object on the device screen is acquired, judging whether the second gesture operation is a gesture operation for moving components from the target display area of the device screen back to the source display area prevents components on the device screen from being restored by mistake, which would affect the user's use.
In S208, the to-be-moved components are displayed in the source display area according to the second gesture operation.
In this embodiment, when the to-be-moved components are displayed in the target display area, or while the components are being moved from the source display area to the target display area, the positions and coordinates of the to-be-moved components in the source area are recorded. S208 is specifically: according to the second gesture operation and the recorded positions and coordinates of the to-be-moved components in the source area, the to-be-moved components are displayed in the source display area.
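A sketch of this record-and-restore behaviour follows; the patent only requires that the source positions and coordinates be recorded, so the session structure and names below are hypothetical.

```kotlin
// Sketch: record each component's source-area bounds when it is moved to the target area,
// and restore them when the second gesture operation is received.
data class Rect(val left: Int, val top: Int, val width: Int, val height: Int)
data class Component(val id: String, var bounds: Rect)

class MoveSession {
    private val sourceBounds = mutableMapOf<String, Rect>()

    fun moveToTarget(component: Component, targetBounds: Rect) {
        sourceBounds[component.id] = component.bounds      // record position in the source area
        component.bounds = targetBounds                     // display it in the target area
    }

    fun restoreToSource(component: Component) {
        // Second gesture: put the component back where it was recorded.
        sourceBounds.remove(component.id)?.let { component.bounds = it }
    }
}

fun main() {
    val icon = Component("camera", Rect(240, 40, 144, 144))
    val session = MoveSession()
    session.moveToTarget(icon, Rect(700, 1500, 72, 72))
    session.restoreToSource(icon)
    println(icon.bounds)   // Rect(left=240, top=40, width=144, height=144)
}
```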
For ease of understanding, the component moving process and restoring process of the embodiment of the present invention are described below with an implementation example, although the invention is not limited to this example. Fig. 3a is a schematic diagram of moving components. Specifically, the user wants to operate the menu components at the circled position of the phone, and therefore makes the first gesture operation in the region of the circle (such as a double tap or drawing a circle, but not limited to these operations). At this point, the menu icons on the device screen are displayed in a target display area requested with the center of the circle as its center; that is, the menu icons on the device screen are moved from the source display area to the target display area for display. Fig. 3b is a schematic diagram of moving the components from the target display area back to the source display area. Specifically, when the user wants the menu icons to be displayed in the source display area again, the user can input the second gesture operation on the device screen, so that the menu icons are moved from the target display area back to the source display area for display.
In this embodiment, the terminal device acquires a first gesture operation of a touch object on the device screen; when it judges that the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area, it parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, calculates the center of the gesture region corresponding to the first gesture operation, requests a target display area on the device screen centered on that center, and displays the to-be-moved components in the target display area. The positions of components such as menu icons or buttons on the device screen can thus be modified, so that the user can move application icons or buttons to a position that is convenient to operate, which makes the device easier to use. In addition, a second gesture operation of the touch object on the device screen can be acquired, and the to-be-moved components are displayed in the source display area according to the second gesture operation, which makes it convenient for the user to arrange the touch-screen components according to personal preference as needed. In both the moving process and the restoring process, the user only needs to input a single gesture action to move the components, which is simple and convenient.
Embodiment three
Fig. 4 shows a structural diagram of the device screen component moving apparatus provided by virtual apparatus embodiment one of the present invention. The apparatus can be applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component. For ease of description, only the parts related to the embodiment of the present invention are shown.
The device screen component moving apparatus comprises an acquiring unit 41, a parsing unit 42, a requesting unit 43, and a display unit 44. Specifically:
the acquiring unit 41 is configured to acquire a first gesture operation of a touch object on the device screen, the first gesture operation being used for moving components from a source display area of the device screen to a target display area;
the parsing unit 42 is configured to parse the first gesture operation, determine the to-be-moved components corresponding to the first gesture operation, and calculate the center of the gesture region corresponding to the first gesture operation;
the requesting unit 43 is configured to request a target display area on the device screen, centered on the center of the gesture region corresponding to the first gesture operation;
the display unit 44 is configured to display the to-be-moved components in the target display area.
The device screen component moving apparatus provided by this embodiment of the present invention can be used in the corresponding method embodiment one described above; for details, refer to the description of embodiment one above, which is not repeated here.
In this embodiment, the acquiring unit of the terminal device acquires a first gesture operation of a touch object on the device screen; the parsing unit parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, and calculates the center of the gesture region corresponding to the first gesture operation; the requesting unit requests a target display area on the device screen centered on the center of the gesture region corresponding to the first gesture operation; and the display unit displays the to-be-moved components in the target display area. Components such as menu icons or buttons on the device screen can thus be arranged according to the user's personal preference, so that the user can move application icons or buttons to a position that is convenient to operate, which makes the device easier to use. In addition, the user only needs to input a single gesture action to move the components, which is simple and convenient.
Embodiment four
Fig. 5 shows a structural diagram of the device screen component moving apparatus provided by virtual apparatus embodiment two of the present invention. The apparatus can be applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component. For ease of description, only the parts related to the embodiment of the present invention are shown.
The device screen component moving apparatus comprises an acquiring unit 51, a first gesture judging unit 52, a second gesture judging unit 53, a parsing unit 54, a requesting unit 55, a translation unit 56, and a display unit 57. Specifically:
this embodiment of the present invention differs from embodiment three as follows.
Optionally, the first gesture judging unit 52 is configured to judge whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area, and if so, to trigger the parsing unit 54.
Optionally, the parsing unit 54 is specifically configured to parse the first gesture operation and, according to a preset correspondence between gesture operations and component counts, determine that the to-be-moved components corresponding to the first gesture operation are all or part of the components on the device screen.
Optionally, the parsing unit 54 is specifically configured to parse the first gesture operation and, according to a preset correspondence between gesture operations and component types, determine that the components of the component type corresponding to the first gesture operation are the to-be-moved components.
Optionally, the parsing unit 54 is specifically configured to parse the first gesture operation and, according to a preset correspondence between gesture operations and component functions, determine that the components of the component function corresponding to the first gesture operation are the to-be-moved components.
Optionally, the translation unit 56 is configured to translate the target display area when the target display area extends beyond the display area of the device screen, until the target display area is displayed entirely within the display area of the device screen.
Optionally, the display unit 57 is specifically configured to scale and arrange the to-be-moved components according to the size of the requested target display area, and then display them in the target display area.
Optionally, the acquiring unit 51 is further configured to acquire a second gesture operation of the touch object on the device screen, the second gesture operation being used for moving components from the target display area of the device screen back to the source display area; the display unit 57 is further configured to display the to-be-moved components in the source display area according to the second gesture operation.
Optionally, the second gesture judging unit 53 is configured to judge whether the second gesture operation is a gesture operation for moving components from the target display area of the device screen back to the source display area, and if so, to trigger the display unit 57.
The device screen component moving apparatus provided by this embodiment of the present invention can be used in the corresponding method embodiment two described above; for details, refer to the description of embodiment two above, which is not repeated here.
In this embodiment, the terminal device acquires a first gesture operation of a touch object on the device screen, parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, calculates the center of the gesture region corresponding to the first gesture operation, requests a target display area on the device screen centered on that center, and displays the to-be-moved components in the target display area. The positions of components such as menu icons or buttons on the device screen can thus be modified, so that the user can move application icons or buttons to a position that is convenient to operate, which makes the device easier to use. In addition, a second gesture operation of the touch object on the device screen can be acquired, and the to-be-moved components are displayed in the source display area according to the second gesture operation, which makes it convenient for the user to arrange the touch-screen components according to personal preference as needed. In both the moving process and the restoring process, the user only needs to input a single gesture action to move the components, which is simple and convenient.
Embodiment five
Fig. 6 is a schematic structural diagram of electronic device embodiment one of the present invention. As shown in Fig. 6, this embodiment further provides an electronic device. The electronic device may specifically be a portable, pocket-sized, handheld, computer-built-in, or vehicle-mounted device having a device screen, for example a mobile phone, a tablet computer, a notebook computer, a personal digital assistant (Personal Digital Assistant, hereinafter referred to as PDA), a GPRS navigator, or the like. It should be understood that the electronic device of this embodiment of the present invention comprises device screen components.
Specifically, the electronic device in this embodiment comprises a device screen 601 and a processing module 602, and the processing module 602 is connected to the device screen 601, wherein:
the device screen 601 is configured to display a screen state in which the components are in the source display area, a screen state in which the components are moving from the source display area to the target display area, or a screen state in which the components are in the target display area; the device screen is further configured to sense the sliding of a touch point of a touch object on the device screen and to send a response signal to the processing module.
The processing module 602 is configured to receive the response signal sent by the device screen; acquire, according to the response signal, a first gesture operation of the touch object on the device screen, the first gesture operation being used for moving components from the source display area of the device screen to the target display area; parse the first gesture operation, determine the to-be-moved components corresponding to the first gesture operation, and calculate the center of the gesture region corresponding to the first gesture operation; request a target display area on the device screen centered on that center; and display the to-be-moved components in the target display area.
Optionally, the processing module 602 is specifically configured to receive the response signal sent by the device screen; acquire, according to the response signal, a first gesture operation of the touch object on the device screen, the first gesture operation being used for moving components from the source display area of the device screen to the target display area; parse the first gesture operation and, according to a preset correspondence between gesture operations and component counts, determine that the to-be-moved components corresponding to the first gesture operation are all or part of the components on the device screen, or parse the first gesture operation and, according to a preset correspondence between gesture operations and component types, determine that the components of the component type corresponding to the first gesture operation are the to-be-moved components, or parse the first gesture operation and, according to a preset correspondence between gesture operations and component functions, determine that the components of the component function corresponding to the first gesture operation are the to-be-moved components; calculate the center of the gesture region corresponding to the first gesture operation; request a target display area on the device screen centered on that center; and display the to-be-moved components in the target display area.
Optionally, the processing module 602 is further configured to judge whether the first gesture operation is a gesture operation for moving components from the source display area of the device screen to the target display area, and if so, to parse the first gesture operation.
Optionally, the processing module 602 is further configured to translate the target display area when the requested target display area extends beyond the display area of the device screen, until the target display area is displayed entirely within the display area of the device screen.
In this embodiment, the processing module of the terminal device acquires a first gesture operation of a touch object on the device screen, parses the first gesture operation, determines the to-be-moved components corresponding to the first gesture operation, calculates the center of the gesture region corresponding to the first gesture operation, requests a target display area on the device screen centered on that center, and displays the to-be-moved components in the target display area on the device screen. Components such as menu icons or buttons on the device screen can thus be arranged according to the user's personal preference, so that the user can move application icons or buttons to a position that is convenient to operate, which makes the device easier to use. In addition, the user only needs to input a single gesture action to move the components, which is simple and convenient.
Embodiment six
Fig. 7 is a schematic structural diagram of electronic device embodiment two of the present invention. As shown in Fig. 7, this figure shows a specific embodiment of an electronic device. In this embodiment, the electronic device 70 comprises a transmitting circuit 702, a receiving circuit 703, a power controller 704, a processor 706, a memory 707, and an antenna 701. The processor 706 controls the operation of the electronic device 70. The memory 707 may comprise a read-only memory and a random access memory, and provides instructions and data to the processor 706. A part of the memory 707 may further comprise a non-volatile random access memory (NVRAM). In a specific application, the electronic device 70 may be embedded in, or may itself be, a wireless communication device such as a mobile phone, and may further comprise a carrier that holds the transmitting circuit 702 and the receiving circuit 703, so as to allow data transmission and reception between the electronic device 70 and a remote location. The transmitting circuit 702 and the receiving circuit 703 may be coupled to the antenna 701. The components of the electronic device 70 are coupled together by a bus system 3100, where the bus system 3100 comprises, in addition to a data bus, a power bus, a control bus, and a status signal bus. For clarity of description, however, the various buses are all labeled as the bus system 3100 in the figure. The electronic device 70 may further comprise a decoding processor 705.
The methods disclosed in the above embodiments of the present invention may be applied in, or implemented by, the processor 706. In other words, the electronic device in the above embodiments of the present invention may specifically be implemented by the electronic device shown in Fig. 7, and the processing module in the above electronic device may be understood as the processor 706 in the electronic device of Fig. 7. The processor 706 may be an integrated circuit chip with the capability to execute instructions and data and to process signals. In the implementation process, the steps of the above methods may be completed by integrated logic circuits of hardware in the processor 706 or by instructions in the form of software. The above processor may be a general-purpose processor (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device (PLD), a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed with reference to the embodiments of the present invention may be performed directly by a hardware processor, or performed by a combination of hardware in the processor and software modules. The software modules may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 707, and the processor reads the information in the memory 707 and completes the steps of the above methods in combination with its hardware.
It should be noted that the units included in the above embodiments are merely divided according to functional logic, but the division is not limited thereto as long as the corresponding functions can be realized. In addition, the specific names of the functional units are merely for ease of mutual distinction and are not intended to limit the protection scope of the present invention.
In addition, a person of ordinary skill in the art can understand that all or part of the steps for implementing the methods of the above embodiments may be completed by a program instructing the relevant hardware. The corresponding program may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc.
The foregoing descriptions are merely preferred embodiments of the present invention, and are not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (18)

1. A device screen component moving method, characterized in that the method is applied to a terminal device having a device screen, the device screen has a display area, and the display area comprises at least one component, and the method comprises:
acquiring a first gesture operation of a touch object on the device screen, the first gesture operation being used for moving components from a source display area of the device screen to a target display area;
parsing the first gesture operation, determining the to-be-moved components corresponding to the first gesture operation, and calculating the center of the gesture region corresponding to the first gesture operation;
requesting a target display area on the device screen, centered on the center of the gesture region corresponding to the first gesture operation; and
displaying the to-be-moved components in the target display area.
2. the method for claim 1, is characterized in that, described the first gesture operation of described parsing is determined to comprise the assembly to be moved of described the first gesture operational correspondence:
Resolve described the first gesture operation, according to the corresponding relation of default gesture operation and component count, the assembly to be moved of determining described the first gesture operational correspondence be on device screen whole/the part assembly; Perhaps
Resolve described the first gesture operation, according to the corresponding relation of default gesture operation and component type, determine that the assembly relevant to the component type of described the first gesture operational correspondence is assembly to be moved; Perhaps
Resolve described the first gesture operation, according to the corresponding relation of default gesture operation and assembly function, determine that the assembly relevant to the assembly function of described the first gesture operational correspondence is assembly to be moved.
3. method as claimed in claim 1 or 2, is characterized in that, described obtain first gesture operation of touch control object on device screen after, described method also comprises:
Judge that whether described the first gesture operation is for assembly being moved to the gesture operation of target viewing area from the viewing area, source of device screen, if carry out the step of described the first gesture operation of described parsing.
4. method as described in the claims 1 to 3 any one, is characterized in that, centered by the center in described gesture zone by described the first gesture operational correspondence, after the step of application target viewing area, described method also comprises on device screen:
When the target viewing area of described application exceeded the viewing area of device screen, the described target of translation viewing area was until described target viewing area all is presented in the viewing area of described device screen.
5. The method according to any one of claims 1 to 4, wherein the displaying the to-be-moved component in the target display area is specifically:
scaling and arranging the to-be-moved component based on the size of the requested target display area, and then displaying the component in the target display area.
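Scaling the to-be-moved components to the size of the requested area, as in claim 5, can be sketched as a proportional mapping from the source display area into the target display area; the tuple layout is the same assumption as in the sketches above.

```python
def scale_into_target(component_rects, source_area, target_area):
    # Map each component rectangle (left, top, width, height) from the
    # source display area into the target display area, scaling positions
    # and sizes so that the relative arrangement is preserved.
    s_left, s_top, s_w, s_h = source_area
    t_left, t_top, t_w, t_h = target_area
    sx, sy = t_w / s_w, t_h / s_h
    return [(t_left + (left - s_left) * sx,
             t_top + (top - s_top) * sy,
             w * sx, h * sy)
            for left, top, w, h in component_rects]

# Example: two icons laid out over a 1080x1920 source area are rearranged
# into a half-size 540x960 target area in the lower-left part of the screen.
icons = [(100, 100, 200, 200), (700, 1500, 200, 200)]
print(scale_into_target(icons, (0, 0, 1080, 1920), (0, 960, 540, 960)))
```
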
6. The method according to any one of claims 1 to 5, wherein after the step of displaying the to-be-moved component in the target display area, the method further comprises:
obtaining a second gesture operation of the touch control object on the device screen, wherein the second gesture operation is used to move a component from the target display area of the device screen to the source display area; and
displaying the to-be-moved component in the source display area according to the second gesture operation.
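Claims 6 and 7 describe the reverse move. A simple way to support it is to remember the components' original layout in the source display area before the first move; the class below is an assumed sketch of that bookkeeping, not an implementation prescribed by the patent.

```python
class ComponentMover:
    # Minimal bookkeeping for the move / move-back behaviour of claims 6 and 7.
    def __init__(self):
        self.saved_layout = None   # component layout in the source display area

    def move_to_target(self, source_layout, target_area):
        # First gesture: remember the source layout, then report the target
        # area in which the (possibly rescaled) components are now displayed.
        self.saved_layout = source_layout
        return target_area

    def move_back_to_source(self):
        # Second gesture: restore the remembered layout in the source display area.
        if self.saved_layout is None:
            raise RuntimeError("no earlier move to undo")
        layout, self.saved_layout = self.saved_layout, None
        return layout
```
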
7. The method according to any one of claims 1 to 6, wherein after the step of obtaining a second gesture operation of the touch control object on the device screen, the method further comprises:
judging whether the second gesture operation is a gesture operation for moving a component from the target display area of the device screen to the source display area, and if so, performing the step of displaying the to-be-moved component in the source display area according to the second gesture operation.
8. A device screen component moving apparatus, wherein the apparatus is applied to a terminal device having a device screen, the device screen has a display area, the display area comprises at least one component, and the apparatus comprises:
an obtaining unit, configured to obtain a first gesture operation of a touch control object on the device screen, wherein the first gesture operation is used to move a component from a source display area of the device screen to a target display area;
a parsing unit, configured to parse the first gesture operation and determine the to-be-moved component corresponding to the first gesture operation;
a calculating unit, configured to calculate the center position of the gesture area corresponding to the first gesture operation;
a requesting unit, configured to request the target display area on the device screen, centered on the center position of the gesture area corresponding to the first gesture operation; and
a display unit, configured to display the to-be-moved component in the target display area.
9. The apparatus according to claim 8, wherein the parsing unit is specifically configured to: parse the first gesture operation, and determine, according to a preset correspondence between gesture operations and component quantity, that the to-be-moved components corresponding to the first gesture operation are all or some of the components on the device screen; or
parse the first gesture operation, and determine, according to a preset correspondence between gesture operations and component types, that the components related to the component type corresponding to the first gesture operation are the to-be-moved components; or
parse the first gesture operation, and determine, according to a preset correspondence between gesture operations and component functions, that the components related to the component function corresponding to the first gesture operation are the to-be-moved components.
10. The apparatus according to claim 8 or 9, wherein the apparatus further comprises a first gesture judging unit, configured to judge whether the first gesture operation is a gesture operation for moving a component from the source display area of the device screen to the target display area, and if so, to trigger the parsing unit to perform its processing.
11. The apparatus according to any one of claims 8 to 10, wherein the apparatus further comprises a translating unit, configured to translate the target display area, when the requested target display area exceeds the display area of the device screen, until the target display area is displayed entirely within the display area of the device screen.
12. The apparatus according to any one of claims 8 to 11, wherein the display unit is specifically configured to scale and arrange the to-be-moved component based on the size of the requested target display area, and then display the component in the target display area.
13. The apparatus according to any one of claims 8 to 12, wherein the obtaining unit is further configured to obtain a second gesture operation of the touch control object on the device screen, the second gesture operation being used to move a component from the target display area of the device screen to the source display area; and
the display unit is further configured to display the to-be-moved component in the source display area according to the second gesture operation.
14. The apparatus according to any one of claims 8 to 13, wherein the apparatus further comprises a second gesture judging unit, configured to judge whether the second gesture operation is a gesture operation for moving a component from the target display area of the device screen to the source display area, and if so, to trigger the display unit to perform its processing.
15. An electronic device, comprising a device screen and a processing module, wherein the processing module is connected to the device screen, and wherein:
the device screen is used to display the screen state of a component in the source display area, the screen state of a component moving from the source display area to the target display area, or the screen state of a component in the target display area; the device screen is further used to sense the sliding of a touch point of a touch control object on the device screen and to send a response signal to the processing module; and
the processing module is used to receive the response signal sent by the device screen; obtain, according to the response signal, a first gesture operation of the touch control object on the device screen, wherein the first gesture operation is used to move a component from a source display area of the device screen to a target display area; parse the first gesture operation, determine the to-be-moved component corresponding to the first gesture operation, and calculate the center position of the gesture area corresponding to the first gesture operation; request the target display area on the device screen, centered on the center position of the gesture area corresponding to the first gesture operation; and display the to-be-moved component in the target display area.
16. The electronic device according to claim 15, wherein the processing module is specifically used to: receive the response signal sent by the device screen; obtain, according to the response signal, a first gesture operation of the touch control object on the device screen, wherein the first gesture operation is used to move a component from a source display area of the device screen to a target display area; parse the first gesture operation and determine, according to a preset correspondence between gesture operations and component quantity, that the to-be-moved components corresponding to the first gesture operation are all or some of the components on the device screen, or parse the first gesture operation and determine, according to a preset correspondence between gesture operations and component types, that the components related to the component type corresponding to the first gesture operation are the to-be-moved components, or parse the first gesture operation and determine, according to a preset correspondence between gesture operations and component functions, that the components related to the component function corresponding to the first gesture operation are the to-be-moved components; calculate the center position of the gesture area corresponding to the first gesture operation; request the target display area on the device screen, centered on the center position of the gesture area corresponding to the first gesture operation; and display the to-be-moved component in the target display area.
17. The electronic device according to claim 15 or 16, wherein the processing module is further used to judge whether the first gesture operation is a gesture operation for moving a component from the source display area of the device screen to the target display area, and if so, to parse the first gesture operation.
18. The electronic device according to any one of claims 15 to 17, wherein the processing module is further used to translate the target display area, when the requested target display area exceeds the display area of the device screen, until the target display area is displayed entirely within the display area of the device screen.
CN201310046294.7A 2013-02-05 2013-02-05 Equipment screen component moving method and device, and electronic equipment Active CN103150108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310046294.7A CN103150108B (en) 2013-02-05 2013-02-05 Equipment screen component moving method and device, and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310046294.7A CN103150108B (en) 2013-02-05 2013-02-05 Equipment screen component moving method and device, and electronic equipment

Publications (2)

Publication Number Publication Date
CN103150108A true CN103150108A (en) 2013-06-12
CN103150108B CN103150108B (en) 2017-04-19

Family

ID=48548218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310046294.7A Active CN103150108B (en) 2013-02-05 2013-02-05 Equipment screen component moving method and device, and electronic equipment

Country Status (1)

Country Link
CN (1) CN103150108B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467336A (en) * 2010-11-19 2012-05-23 联想(北京)有限公司 Electronic equipment and object selection method thereof
CN102722280A (en) * 2012-05-21 2012-10-10 华为技术有限公司 Method and device for controlling screen movement, and terminal
CN102883066A (en) * 2012-09-29 2013-01-16 惠州Tcl移动通信有限公司 Method for realizing file operation based on gesture recognition and cellphone

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103501412A (en) * 2013-10-12 2014-01-08 深圳市中兴移动通信有限公司 Shooting method, shooting interface setting method and shooting equipment
CN104750352A (en) * 2013-12-31 2015-07-01 环达电脑(上海)有限公司 Restoration control method of menu window
CN104866183A (en) * 2014-02-24 2015-08-26 联想(北京)有限公司 Data display method and electronic equipment
CN103885691A (en) * 2014-03-20 2014-06-25 小米科技有限责任公司 Method and device for executing backspacing operation
CN105306826A (en) * 2015-11-20 2016-02-03 小米科技有限责任公司 Camera function setting method and device
WO2017088309A1 (en) * 2015-11-27 2017-06-01 西安中兴新软件有限责任公司 Method and apparatus for moving icon, and computer storage medium
CN106814948A (en) * 2015-11-27 2017-06-09 西安中兴新软件有限责任公司 A kind of method and apparatus of moving icon
CN106952630A (en) * 2017-03-23 2017-07-14 深圳市茁壮网络股份有限公司 Pixel region processing method, device and pixel region switching method and apparatus
WO2018214986A1 (en) * 2017-05-24 2018-11-29 上海星佑网络科技有限公司 Display object operation method and apparatus
CN107315514A (en) * 2017-06-14 2017-11-03 深圳传音通讯有限公司 The startup control method and device of application program
CN108363600A (en) * 2018-01-17 2018-08-03 五八有限公司 Component display methods, device and the electronic equipment of application program

Also Published As

Publication number Publication date
CN103150108B (en) 2017-04-19

Similar Documents

Publication Publication Date Title
CN103150108A (en) Equipment screen component moving method and device, and electronic equipment
KR102020345B1 (en) The method for constructing a home screen in the terminal having touchscreen and device thereof
RU2687037C1 (en) Method, device for fast screen separation, electronic device, ui display and storage medium
CN111149086B (en) Method for editing main screen, graphical user interface and electronic equipment
US9350841B2 (en) Handheld device with reconfiguring touch controls
US9804898B2 (en) Method and apparatus for processing applications of mobile terminal
US20200319773A1 (en) Method and apparatus for component display processing
WO2014196760A1 (en) Electronic device and method for controlling applications in the electronic device
EP3564802B1 (en) Method and device for displaying application, and electronic terminal
US20100315439A1 (en) Using motion detection to process pan and zoom functions on mobile computing devices
KR20100013539A (en) User interface apparatus and method for using pattern recognition in handy terminal
CN115220838A (en) Widget processing method and related device
CN105122176A (en) Systems and methods for managing displayed content on electronic devices
CN102880414A (en) Terminal equipment and method for starting program rapidly
CN103677630A (en) Apparatus and method for processing split view in portable device
JP2015005173A (en) Portable information terminal including touch screen, and input method
CN107704157B (en) Multi-screen interface operation method and device and storage medium
US20150169216A1 (en) Method of controlling screen of portable electronic device
EP2790096A2 (en) Object display method and apparatus of portable electronic device
WO2014112029A1 (en) Information processing device, information processing method, and program
US9417724B2 (en) Electronic apparatus
WO2017022031A1 (en) Information terminal device
CN110023894A (en) A kind of mobile application figure calibration method and terminal
CN103383630A (en) Method for inputting touch and touch display apparatus
CN106020471A (en) Operation method of mobile terminal and mobile terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant