CN109857292B - Object display method and terminal equipment - Google Patents
- Publication number
- CN109857292B (application CN201811613598.6A)
- Authority
- CN
- China
- Prior art keywords
- screen
- input
- sub
- target
- terminal device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the invention provides an object display method and a terminal device, applied to the technical field of communication, and aims to solve the problem that moving an application icon on a terminal device is inconvenient. The method is applied to a terminal device including a first screen and a second screen, and includes: receiving a first input of a user in a case where a first object in a to-be-moved state is displayed on the first screen, where the first input is an input for controlling the terminal device to rotate; in response to the first input, determining a first position on the second screen in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and controlling the first object to move from the first screen to the first position for display. The method and the device are particularly applicable to the process in which the terminal device moves an object, such as an icon, across screens.
Description
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an object display method and terminal equipment.
Background
With the development of communication technology, terminal devices such as mobile phones and tablet computers have become increasingly intelligent to meet various requirements of users. For example, users place increasingly high demands on how conveniently application icons can be moved on a terminal device.
A dual-screen terminal device can display content, such as application icons, on its two screens at the same time. A user may therefore need to move an application icon displayed on one screen of the dual-screen terminal device so that it is displayed on the other screen. However, because the joint between the two screens of a current dual-screen terminal device is a non-display area, such as the bezels of the two screens, the user cannot successfully drag an application icon from one screen onto the other, and the terminal device therefore cannot move the icon from one screen to the other for display. As a result, moving application icons on such a terminal device is inconvenient.
Disclosure of Invention
The embodiment of the invention provides an object display method and a terminal device, and aims to solve the problem that moving application icons on a terminal device is inconvenient.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides an object display method, which is applied to a terminal device including a first screen and a second screen, and includes: receiving a first input of a user in a case where a first object in a to-be-moved state is displayed on the first screen, where the first input is an input for controlling the terminal device to rotate; in response to the first input, determining a first position on the second screen in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and controlling the first object to move from the first screen to the first position for display.
In a second aspect, an embodiment of the present invention further provides a terminal device, where the terminal device includes a first screen and a second screen, and further includes a receiving module, a determining module, and a control module. The receiving module is configured to receive a first input of a user in a case where a first object in a to-be-moved state is displayed on the first screen, where the first input is an input for controlling the terminal device to rotate; the determining module is configured to determine, in response to the first input received by the receiving module, a first position on the second screen in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and the control module is configured to control the first object to move from the first screen to the first position determined by the determining module for display.
In a third aspect, an embodiment of the present invention provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the object display method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps of the object display method according to the first aspect.
In an embodiment of the present invention, a terminal device includes a first screen and a second screen. In a case where a first object in a to-be-moved state is displayed on the first screen, a first input of a user is received, where the first input is an input for controlling the terminal device to rotate; in response to the first input, a first position on the second screen is determined in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and the first object is controlled to move from the first screen to the first position for display. Based on this scheme, triggered by the user's rotation input, for example under the action of the gravity sensing of the terminal device, the terminal device can move the first object in the to-be-moved state on the first screen to the first position on the second screen for display. In other words, the terminal device moves the first object from the first screen to the second screen for display, that is, moves the first object across screens, by receiving the user's rotation input to the terminal device instead of an input of manually dragging the first object. Therefore, the convenience of moving objects (such as icons) across screens of the terminal device is improved.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating an object display method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of content displayed by a terminal device according to an embodiment of the present invention;
fig. 4 is a second schematic flowchart of an object display method according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a possible terminal device according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or", for example, A/B may mean A or B; "and/or" herein is merely an association describing an associated object, and means that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. "plurality" means two or more than two.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design scheme described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or design schemes. Rather, use of the words "exemplary" or "for example" is intended to present the related concepts in a concrete fashion.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first screen and the second screen, etc. are for distinguishing different screens, not for describing a particular order of the screens.
The terminal device provided by the embodiment of the invention includes a first screen and a second screen. In a case where a first object in a to-be-moved state is displayed on the first screen, a first input of a user is received, where the first input is an input for controlling the terminal device to rotate; in response to the first input, a first position on the second screen is determined in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and the first object is controlled to move from the first screen to the first position for display. Based on this scheme, triggered by the user's rotation input, for example under the action of the gravity sensing of the terminal device, the terminal device can move the first object in the to-be-moved state on the first screen to the first position on the second screen for display. In other words, the terminal device moves the first object from the first screen to the second screen for display, that is, moves the first object across screens, by receiving the user's rotation input to the terminal device instead of an input of manually dragging the first object. Therefore, the convenience of moving objects (such as icons) across screens of the terminal device is improved.
It should be noted that, for the object display method provided in the embodiment of the present invention, the execution subject may be a terminal device, a central processing unit (CPU) of the terminal device, or a control module in the terminal device for executing the object display method. In the embodiment of the present invention, the object display method is described by taking a terminal device executing the method as an example.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the object display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is the framework of applications, and a developer may develop applications based on the application framework layer while complying with the development principles of that framework, for example, system applications such as a system settings application, a system chat application, and a system camera application, as well as third-party applications such as a third-party settings application, a third-party camera application, and a third-party chat application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the object display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the object display method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the object display method provided by the embodiment of the invention by running the software program in the android operating system.
The following describes in detail an object display method provided by an embodiment of the present invention with reference to the flowchart of the object display method shown in fig. 2. Although a logical order is illustrated in the method flowchart, in some cases the steps shown or described may be performed in an order different from that described here. For example, the object display method illustrated in fig. 2 may include S201-S203:
S201, in a case where a first object in a to-be-moved state is displayed on the first screen, the terminal device receives a first input of a user, where the first input is an input for controlling the terminal device to rotate.
In the embodiment of the present invention, the terminal device includes a first screen and a second screen, that is, the terminal device is a dual-screen terminal device.
Optionally, the terminal device having at least two screens (e.g., the first screen and the second screen) provided in the embodiment of the present invention may be a folding screen type terminal device or a non-folding screen type terminal device. At least two screens in the folding screen type terminal equipment can be folded, and the folding angle between two adjacent screens in the at least two screens can be an angle between 0 degree and 360 degrees. At least two screens in the non-folding screen type terminal device may be arranged on different surfaces in the terminal device, for example, when the non-folding screen type terminal device is a mobile phone, the at least two screens (such as a first screen and a second screen) may be arranged on the front surface and the back surface of the mobile phone, respectively. For example, the angle of folding between the first screen and the second screen in the terminal device provided by the embodiment of the present invention may be 180 degrees, that is, the first screen and the second screen are on the same plane.
Optionally, the object (e.g., the first object) provided in the embodiment of the present invention may be an icon of an application program on the desktop of the terminal device, a shortcut icon of an application program, a shortcut icon of a file (e.g., a document or a picture) stored in the terminal device, and the like, which is not specifically limited in the embodiment of the present invention.
In addition, in the embodiment of the invention, an object in the to-be-moved state is an object that needs to be moved across screens rather than moved on its current screen. Specifically, a first object in the to-be-moved state on the first screen is to be moved across screens to the second screen for display, rather than moved within the first screen.
It should be noted that, currently, a user may perform selection input on objects displayed on one screen (e.g., the first screen) of the terminal device, so that the objects are in a state to be edited, so as to control the terminal device to edit the objects. The terminal equipment edits an object, namely the terminal equipment deletes the object or moves the object on a screen where the object is located currently.
It can be understood that when a current terminal device moves an object in the to-be-edited state, the object is moved within the screen on which it is displayed. In the embodiment of the invention, when the terminal device moves an object in the to-be-moved state, the object is moved across screens, that is, from the screen on which it is displayed to another screen of the terminal device.
Optionally, in the embodiment of the present invention, the display effect of the first object on the first screen differs depending on whether it is in the to-be-edited state or the to-be-moved state. For example, when the first object is in the to-be-edited state, the first object may swing around its center as a fulcrum to prompt the user that it is currently in the to-be-edited state, and a delete button may be displayed at the upper right corner of the first object so that the user can perform an input on the delete button to trigger the terminal device to delete the first object. When the first object is in the to-be-moved state, a display parameter such as the transparency of the first object may be changed to prompt the user that the first object is currently in the to-be-moved state, that is, that it can be moved across screens; moreover, the first object in the to-be-moved state generally does not support a deletion input from the user.
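For illustration only, the following Kotlin sketch shows one possible way such display effects could be realized on Android; the state names, the wobble angle, and the 0.5 alpha value are assumptions and are not prescribed by the embodiment.

```kotlin
import android.animation.ObjectAnimator
import android.animation.ValueAnimator
import android.view.View

// Hypothetical visual states for an icon; the names are illustrative only.
enum class IconState { NORMAL, TO_BE_EDITED, TO_BE_MOVED }

fun applyIconState(icon: View, deleteBadge: View, state: IconState) {
    when (state) {
        IconState.TO_BE_EDITED -> {
            deleteBadge.visibility = View.VISIBLE      // deletion input is supported in this state
            icon.alpha = 1f
            // Swing around the icon's own centre to signal that it can be edited.
            // (A full implementation would keep a reference to this animator and cancel it
            // when the state changes.)
            ObjectAnimator.ofFloat(icon, View.ROTATION, -3f, 3f).apply {
                duration = 150
                repeatMode = ValueAnimator.REVERSE
                repeatCount = ValueAnimator.INFINITE
                start()
            }
        }
        IconState.TO_BE_MOVED -> {
            deleteBadge.visibility = View.GONE         // deletion input is not supported here
            icon.rotation = 0f
            icon.alpha = 0.5f                          // changed transparency marks the icon as movable across screens
        }
        IconState.NORMAL -> {
            deleteBadge.visibility = View.GONE
            icon.rotation = 0f
            icon.alpha = 1f
        }
    }
}
```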
It is to be understood that the input for controlling the rotation of the terminal device by the user may be a rotation input for controlling the first screen and the second screen in the terminal device, such as a clockwise rotation input or a counterclockwise rotation input.
Optionally, in the embodiment of the present invention, the first input is a rotation input in a first direction from a first target screen to a second target screen, the first target screen is a first screen, and the second target screen is a second screen; or the first input is a rotation input in a second direction from the second target screen to the first target screen, the second target screen is the first screen, and the first target screen is the second screen.
For example, the first direction rotation input may specifically be a clockwise rotation input from the first target screen to the second target screen of the terminal device, that is, a clockwise rotation input from the first screen to the second screen; the second direction rotation input may specifically be a counterclockwise rotation input from the first target screen to the second target screen of the terminal device, that is, a counterclockwise rotation input from the second screen to the first screen.
It is understood that, in the embodiment of the present invention, a gravity sensor and/or an acceleration sensor is installed in the terminal device, and these sensors may be used to detect the rotation angle of the terminal device. The rotation angle of the terminal device can indicate the tilt angle and tilt direction of the terminal device relative to the ground plane. Specifically, since the positions of the first screen and the second screen in the terminal device are fixed, the rotation angle of the terminal device is also the rotation angle of the first screen and the second screen.
That is to say, the first input of the user may be a clockwise rotation input that rotates the terminal device from the first screen toward the second screen; in other words, under the action of gravity sensing, the user instructs the terminal device to move the first object on the first screen to the second screen for display along the gravity-sensing direction.
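As one possible realization of the sensor-based detection described above, the following Kotlin sketch uses the Android rotation-vector sensor to report the signed rotation angle relative to a reference orientation; the class name, the default 30-degree threshold, and the use of the roll axis are assumptions, since the embodiment only requires that a gravity and/or acceleration sensor detect the rotation angle.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

// Hypothetical helper: tracks how far the device has been rotated since the
// first object entered the to-be-moved state, and fires once past the threshold.
class RotationDetector(
    private val sensorManager: SensorManager,
    private val firstAngleThresholdDeg: Float = 30f,            // assumed value of the first angle threshold
    private val onThresholdReached: (rotationDeg: Float) -> Unit
) : SensorEventListener {

    private var referenceRollDeg: Float? = null                 // roll when tracking started
    private var notified = false

    fun start() {
        val sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR) ?: return
        sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val rotationMatrix = FloatArray(9)
        val orientation = FloatArray(3)                          // azimuth, pitch, roll (radians)
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        SensorManager.getOrientation(rotationMatrix, orientation)
        val rollDeg = Math.toDegrees(orientation[2].toDouble()).toFloat()

        val reference = referenceRollDeg ?: run { referenceRollDeg = rollDeg; return }
        val rotationDeg = rollDeg - reference                    // signed: the sign encodes the rotation direction
        if (!notified && abs(rotationDeg) >= firstAngleThresholdDeg) {
            notified = true
            onThresholdReached(rotationDeg)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```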
S202, in response to the first input, under the condition that the rotation angle of the terminal device is larger than or equal to the first angle threshold value, the terminal device determines a first position on the second screen.
It can be understood that, in the embodiment of the present invention, a rotation angle of the terminal device that is greater than or equal to the first angle threshold indicates that the user's rotation input on the terminal device is intended to trigger the terminal device to move the object in the to-be-moved state displayed on the first screen to the second screen; in this case, the rotation input is usually not an accidentally triggered input.
Illustratively, if the first screen and the second screen of the terminal device are in the same plane and the first screen is located on the left side of the second screen, the user's rotation input to the terminal device is a clockwise rotation input from the first screen toward the second screen.
For example, as shown in fig. 3, a schematic diagram of content displayed by a terminal device according to an embodiment of the present invention is provided. The terminal device shown in fig. 3 (a) includes a screen 31 and a screen 32, where the screen 31 is a first screen and the screen 32 is a second screen. Among them, an object 311, an object 312, an object 313, and an object 314 are displayed in the screen 31, and the object 311 and the object 312 are objects in a state to be moved. An object 321 and an object 322 are displayed in the screen 32.
Subsequently, as shown in (b) of fig. 3, the terminal device receives the user's clockwise rotation input from the screen 31 toward the screen 32, and then displays the interface shown in (c) of fig. 3: the object 313 and the object 314 are displayed on the screen 31, and the object 321, the object 322, the object 311, and the object 312 are displayed on the screen 32. That is, triggered by the user, the terminal device moves the object 311 and the object 312 from the screen 31 to the screen 32 across screens.
It can be understood that, in the embodiment of the present invention, in a case where there is an idle position on the second screen of the terminal device, the terminal device may determine the first position on the second screen. Specifically, the first position may be an idle position currently on the second screen; alternatively, the first positions may be positions at which objects are currently displayed on the second screen, provided that the number of such first positions is smaller than or equal to the number of idle positions currently on the second screen.
S203, the terminal device controls the first object to move from the first screen to the first position to be displayed.
Specifically, in the embodiment of the present invention, the terminal device may control the first object to move from the first screen to the first position for display when the rotation angle of the terminal device is greater than or equal to the first angle threshold.
When the terminal device determines an idle position on the second screen as the first position, the terminal device moves the first object from the first screen to that position on the second screen for display, and the positions of the objects already displayed on the second screen before the move are not affected.
In addition, when the terminal device determines a position on the second screen at which an object is already displayed as the first position, the terminal device moves the first object from the first screen to that position for display, and moves the object originally displayed at that position to another position on the second screen, for example, to an idle position on the second screen.
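Putting S201-S203 together, a minimal Kotlin sketch of the cross-screen move for a single first object might look as follows; the slot-list data model, the 30-degree default threshold, and the rule of simply taking the first idle position are illustrative assumptions rather than the embodiment's required implementation.

```kotlin
import kotlin.math.abs

// Illustrative data model: each screen is a list of slots; null marks an idle position.
data class Screen(val slots: MutableList<String?>)

class CrossScreenMover(
    private val firstScreen: Screen,
    private val secondScreen: Screen,
    private val firstAngleThresholdDeg: Float = 30f             // assumed first angle threshold
) {
    /** Called with the signed rotation angle reported by the sensor layer (S201/S202/S203). */
    fun onRotationInput(rotationDeg: Float, firstObject: String) {
        // S202 precondition: ignore rotations below the first angle threshold (likely accidental).
        if (abs(rotationDeg) < firstAngleThresholdDeg) return

        // S202: determine the first position. Here it is simply the first idle position;
        // the embodiment also allows choosing an occupied position and relocating its
        // current object to an idle position afterwards.
        val firstPosition = secondScreen.slots.indexOfFirst { it == null }
        if (firstPosition < 0) return                            // no idle position on the second screen

        // S203: move the first object from the first screen to the first position.
        val from = firstScreen.slots.indexOf(firstObject)
        if (from < 0) return                                     // the object is not on the first screen
        firstScreen.slots[from] = null
        secondScreen.slots[firstPosition] = firstObject
    }
}
```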
It should be noted that the object display method provided by the embodiment of the present invention is applied to a terminal device including a first screen and a second screen. In a case where a first object in a to-be-moved state is displayed on the first screen, a first input of a user is received, where the first input is an input for controlling the terminal device to rotate; in response to the first input, a first position on the second screen is determined in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and the first object is controlled to move from the first screen to the first position for display. Based on this scheme, triggered by the user's rotation input, for example under the action of the gravity sensing of the terminal device, the terminal device can move the first object in the to-be-moved state on the first screen to the first position on the second screen for display. In other words, the terminal device moves the first object from the first screen to the second screen for display, that is, moves the first object across screens, by receiving the user's rotation input to the terminal device instead of an input of manually dragging the first object. Therefore, the convenience of moving objects (such as icons) across screens of the terminal device is improved.
In a possible implementation manner, in the object display method provided in the embodiment of the present invention, a user may trigger a terminal device to select objects to be moved, and control the objects to be in a state to be moved. For example, as shown in fig. 4, the object display method according to the embodiment of the present invention may further include, before the foregoing S201, S204 and S205:
and S204, the terminal equipment receives a second input of the user, wherein the second input is an input for selecting the first object from the objects displayed on the first screen.
It should be noted that the screen (e.g., the first screen) of the terminal device provided in the embodiment of the present invention may be a touch screen, and the touch screen may be configured to receive an input from a user and display a content corresponding to the input to the user in response to the input. The second input may be a touch screen input, a fingerprint input, a gravity input, a key input, or the like. The touch screen input is input such as press input, long press input, slide input, click input, and hover input (input by a user near the touch screen) of a touch screen of the terminal device by the user. The fingerprint input is input by a user to a sliding fingerprint, a long-press fingerprint, a single-click fingerprint, a double-click fingerprint and the like of a fingerprint identifier of the terminal equipment. The gravity input is input such as shaking of the terminal equipment in a specific direction, shaking of the terminal equipment for a specific number of times and the like. The key input corresponds to a single-click input, a double-click input, a long-press input, a combination key input, and the like of the user for a key such as a power key, a volume key, a Home key, and the like of the terminal device. Specifically, the embodiment of the present invention does not specifically limit the manner of the second input, and may be any realizable manner.
Optionally, different selection inputs performed by the user on the first object displayed on the first screen are respectively used to trigger the first object to be in the to-be-edited state or the to-be-moved state.
Illustratively, a long-press input on the first object with a first pressure value is used to trigger the first object to be in the to-be-edited state, and a long-press input on the first object with a second pressure value is used to trigger the first object to be in the to-be-moved state. The first pressure value is different from the second pressure value; for example, the first pressure value is smaller than the second pressure value. That is, the user triggers the terminal device to select the first object as an object to be moved through a specific input, such as a long press with a larger pressure value, on the first object.
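A minimal Kotlin sketch of such a pressure-dependent long press on Android could look as follows; the 0.6 pressure threshold and the callback names are assumptions, and real touch hardware reports pressure with varying fidelity.

```kotlin
import android.os.Handler
import android.os.Looper
import android.view.MotionEvent
import android.view.View
import android.view.ViewConfiguration

// Hypothetical long-press classifier: a lighter long press selects the to-be-edited
// state, a firmer long press selects the to-be-moved state.
class PressureLongPressListener(
    private val onEnterEditState: () -> Unit,                   // long press with the smaller (first) pressure value
    private val onEnterMoveState: () -> Unit,                   // long press with the larger (second) pressure value
    private val pressureThreshold: Float = 0.6f                 // assumed boundary between the two pressure values
) : View.OnTouchListener {

    private val handler = Handler(Looper.getMainLooper())
    private var lastPressure = 0f

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                lastPressure = event.pressure
                handler.postDelayed(
                    { if (lastPressure >= pressureThreshold) onEnterMoveState() else onEnterEditState() },
                    ViewConfiguration.getLongPressTimeout().toLong()
                )
            }
            MotionEvent.ACTION_MOVE -> lastPressure = maxOf(lastPressure, event.pressure)
            MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> handler.removeCallbacksAndMessages(null)
        }
        return true
    }
}
```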
And S205, responding to the second input, and controlling the first object to be in a state of waiting to move by the terminal equipment.
It is understood that the display effect before the first object is in the state to be moved is different from the display effect when the first object is in the state to be moved. In this manner, the user may be prompted that the first object has been selected and may be moved from the first screen to the second screen display.
It should be noted that, with the object display method provided in the embodiment of the present invention, the terminal device allows the user to select, through a specific input, the first object that needs to be moved from the first screen to the second screen for display, for example by controlling the first object to be in the to-be-moved state through the second input; the terminal device can therefore subsequently move the first object selected by the user across screens to the second screen for display in a convenient manner.
In a possible implementation manner, in the object display method provided in the embodiment of the present invention, before the terminal device moves the first object in the to-be-moved state to the second screen for display, the terminal device may first determine at which position on the second screen the first object can be displayed. The first object may include one or more sub-objects; if the first object includes a plurality of sub-objects, the terminal device may determine a first position on the second screen that includes a plurality of sub-positions. Of course, when the first object is a single object, the first position determined by the terminal device on the second screen is a single position.
Specifically, in the embodiment of the present invention, the first object includes M sub-objects, where M is a positive integer. In this case, S206 is further included before S202, for example after S201 and before S202, and S202 may then be implemented by S202a or S202b:
S206, the terminal device obtains a target number, where the target number is the number of positions on the second screen at which no object is displayed.
A position at which no object is displayed on the second screen is an idle position on the second screen before the terminal device moves the first object.
S202a, under the condition that the target number is larger than or equal to M, the terminal device determines M sub-positions on the second screen.
Wherein each sub-position is used to display a different sub-object, and at this time, the first position includes M sub-positions.
Specifically, a target number greater than or equal to M indicates that there are enough idle positions on the second screen of the terminal device to display the M sub-objects.
Optionally, before the terminal device moves the first object to the second screen for display, the M sub-positions determined by the terminal device may all be idle positions on the second screen, or the M sub-positions may all be positions on the second screen where the object is displayed, or one part of the M sub-positions is an idle position and the other part of the M sub-positions is a position where the object is already displayed.
Specifically, before the terminal device moves the first object to the second screen for display, the objects displayed on the second screen may be closely arranged; in this case, each idle position on the second screen follows the position of the last object in the current display order, that is, the idle positions on the second screen are also closely arranged. Alternatively, the objects displayed on the second screen may be arranged dispersedly, and the idle positions on the second screen are then also dispersed; for example, an object is displayed at one position, the next position is idle, and an object is displayed again at the position after that.
Optionally, in this embodiment of the present invention, an arrangement order of the M sub-objects on the second screen is the same as an arrangement order of the M sub-objects on the first screen. At this time, the idle positions on the second screen may be closely arranged before the terminal device moves the first object to be displayed on the second screen.
In this way, since the arrangement order of the M sub-objects on the second screen after the move is the same as their arrangement order on the first screen, and the arrangement order on the first screen is usually the order the user is accustomed to, the arrangement order on the second screen after the move is also the order the user is accustomed to. The user can therefore use the M sub-objects displayed on the second screen according to his or her usage habits, which improves user experience.
It can be understood that, in the embodiment of the present invention, after the terminal device displays the M sub-objects on the second screen, the user may input a sub-object in the M sub-objects to control the terminal device to execute a function corresponding to the sub-object, or display an interface corresponding to the sub-object.
In addition, it is understood that, before the terminal device moves the first object to the second screen for display, if the idle positions on the second screen are arranged dispersedly, the terminal device may insert the first object between the objects currently displayed on the second screen so as to move the first object to the second screen for display.
Optionally, before the terminal device moves the first object to the second screen for display, if the number of idle positions on the second screen is greater than M, the M sub-positions on the second screen may be any M of the idle positions, or may be specific M idle positions, such as the first M idle positions in the arrangement order.
S202b, under the condition that the target number is smaller than M, the terminal device determines the first position on the second screen.
The first position is used for displaying a target object, and the target object is used for indicating M sub-objects.
It will be appreciated that a target number smaller than M indicates that there are not enough idle positions on the second screen to display the M sub-objects in the first object.
Specifically, in the embodiment of the present invention, the target object provided by the terminal device may be an icon of a folder, where the folder includes the M sub-objects. Optionally, the arrangement order of the M sub-objects included in the folder may be the same as the arrangement order of the M sub-objects on the first screen.
It is understood that the user's input on the target object displayed on the second screen may trigger the terminal device to expand the target object, that is, to display on the second screen, in expanded form, the M sub-objects indicated by the target object. At this time, the user may perform an input on one of the expanded M sub-objects, so as to control the terminal device to execute a function corresponding to that sub-object or to display an interface corresponding to that sub-object.
Optionally, before the terminal device moves the first object to the second screen for display, if the second screen includes a plurality of idle positions, the first position used by the terminal device to display the target object may be any one of the idle positions, or may be a specific idle position, such as the idle position that is first in the arrangement order.
It should be noted that, in the object display method provided in the embodiment of the present invention, when there are enough idle positions on the second screen of the terminal device, that is, when the target number is greater than or equal to M, the terminal device can display the M sub-objects at M sub-positions on the second screen respectively; and even if there are not enough idle positions on the second screen, that is, the target number is smaller than M, the terminal device can display, at the first position on the second screen, a target object indicating the M sub-objects. In this way, the terminal device can successfully move the first object in the to-be-moved state, that is, the M sub-objects, from the first screen to the second screen for display.
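The branch between S202a and S202b can be summarized in the following Kotlin sketch; the sealed-class result type and the slot-index representation are assumptions used only for illustration.

```kotlin
// Hypothetical result of S206/S202a/S202b: either M individual sub-positions, or a single
// position for a folder-style target object that stands in for the M sub-objects.
sealed class FirstPosition {
    data class SubPositions(val slotIndices: List<Int>) : FirstPosition()     // S202a
    data class FolderPosition(val slotIndex: Int) : FirstPosition()           // S202b
}

fun determineFirstPosition(secondScreenSlots: List<String?>, subObjects: List<String>): FirstPosition? {
    val idleSlots = secondScreenSlots.indices.filter { secondScreenSlots[it] == null }
    val targetNumber = idleSlots.size                           // S206: positions with no object displayed
    val m = subObjects.size

    return when {
        targetNumber == 0 -> null                               // no idle position: nothing can be placed
        targetNumber >= m ->
            // S202a: take the first M idle positions, preserving the sub-objects' original order.
            FirstPosition.SubPositions(idleSlots.take(m))
        else ->
            // S202b: not enough idle positions; use one idle position for a folder-style target
            // object that indicates all M sub-objects, keeping their order inside the folder.
            FirstPosition.FolderPosition(idleSlots.first())
    }
}
```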
In a possible implementation manner, in the object display method provided in the embodiment of the present invention, S202 in the above embodiment may be implemented by S202c. Specifically, S202a in the above embodiment may be implemented by S202c:
S202c, for each of the M sub-objects, the terminal device determines, according to the information of the target sub-object, a sub-position corresponding to the target sub-object on the second screen, so as to determine M sub-positions on the second screen.
The target sub-object is any one of the M sub-objects, and the information of the target sub-object includes an application type of the application indicated by the target sub-object or a usage frequency of the application indicated by the target sub-object.
For example, in the embodiment of the present invention, each object, such as each of the M sub-objects, may be an icon of an application program installed on the terminal device.
It can be understood that, for one sub-object (denoted as object 1) of the M sub-objects, the terminal device may determine the application type indicated by each object currently displayed on the second screen, and determine, among those objects, an object (denoted as object 2) whose indicated application type is the same as that indicated by object 1. The sub-position (denoted as position 1) that the terminal device determines for object 1 on the second screen is then near the position (denoted as position 2) of object 2 on the second screen; for example, position 1 is the position after position 2 on the second screen, or position 1 is the position before position 2 on the second screen.
Optionally, for one sub-object (denoted as object 1) of the M sub-objects, the terminal device may determine the usage frequency of the application indicated by each object currently displayed on the second screen, and determine an object (denoted as object 3) indicating an application whose usage frequency is close to that of the application indicated by object 1. At this time, the sub-position determined by the terminal device for object 1 may be near the position (denoted as position 3) of object 3 on the second screen; for example, position 1 is the position after position 3 on the second screen, or position 1 is the position before position 3 on the second screen.
Further, for one sub-object (such as object 1) of the M sub-objects, the terminal device may first determine, according to the application type of the application indicated by the sub-object, whether an object of an application of the same application type exists on the second screen. If such an object (such as object 2) exists, a position near the position of that object on the second screen is determined so as to display the moved sub-object. If no object of an application of the same application type exists, the terminal device determines an object (such as object 3) of an application whose usage frequency is close to that of the application indicated by the sub-object. If such an object exists, the terminal device determines a position near the position of that object on the second screen to display the moved sub-object. If there is no object of an application whose usage frequency is close to that of the application indicated by the sub-object, the terminal device displays the moved sub-object at any idle position on the second screen.
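The placement rule described above (same application type first, then close usage frequency, then any idle position) could be sketched in Kotlin as follows; the IconInfo fields and the interpretation of "close" as the smallest absolute frequency difference are assumptions for illustration only.

```kotlin
import kotlin.math.abs

// Hypothetical per-icon metadata used to pick a sub-position for a moved sub-object.
data class IconInfo(val name: String, val appType: String, val useFrequency: Double)

/**
 * Returns the preferred slot index for [target] on the second screen: next to an icon of the
 * same application type if one exists, otherwise next to the icon whose usage frequency is
 * closest, otherwise the first idle slot (or null if there is none).
 */
fun chooseSubPosition(secondScreen: List<IconInfo?>, target: IconInfo): Int? {
    // Prefer the idle slot just after the anchor, then the one just before it.
    fun slotNear(anchorIndex: Int): Int? {
        val after = anchorIndex + 1
        val before = anchorIndex - 1
        return when {
            after < secondScreen.size && secondScreen[after] == null -> after
            before >= 0 && secondScreen[before] == null -> before
            else -> null
        }
    }

    val sameType = secondScreen.indexOfFirst { it != null && it.appType == target.appType }
    if (sameType >= 0) slotNear(sameType)?.let { return it }

    val closestByFrequency = secondScreen.withIndex()
        .filter { it.value != null }
        .minByOrNull { abs(it.value!!.useFrequency - target.useFrequency) }
    if (closestByFrequency != null) slotNear(closestByFrequency.index)?.let { return it }

    return secondScreen.indexOfFirst { it == null }.takeIf { it >= 0 }       // any idle position
}
```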
It should be noted that, with the object display method provided in the embodiment of the present invention, since the application type and the usage frequency included in the information of a sub-object reflect the user's habit of using the corresponding application, the terminal device determines the sub-position corresponding to each target sub-object on the second screen according to the information of that sub-object, so as to determine the M sub-positions on the second screen; the M sub-objects displayed at these M sub-positions therefore better conform to the user's usage habits.
Fig. 5 is a schematic structural diagram of a possible terminal device according to an embodiment of the present invention. The terminal device 50 shown in fig. 5 includes a first screen and a second screen, and further includes: a receiving module 501, a determining module 502 and a control module 503; a receiving module 501, configured to receive a first input of a user when a first object in a to-be-moved state is included on a first screen, where the first input is an input for controlling terminal device 50 to rotate; a determining module 502, configured to determine, in response to the first input received by the receiving module 501, a first position on the second screen in a case where it is determined that the rotation angle of the terminal device 50 is greater than or equal to the first angle threshold; and a control module 503, configured to control the first object to move from the first screen to the first position determined by the determining module 502 for displaying.
Optionally, the receiving module 501 is further configured to receive a second input from the user before receiving the first input from the user, where the second input is an input for selecting the first object from the objects displayed on the first screen; the control module 503 is further configured to control the first object to be in the state to be moved in response to the second input received by the receiving module 501.
Optionally, the first object includes M sub-objects, where M is a positive integer; the terminal device 50 further includes an obtaining module, configured to obtain a target number before the determining module 502 determines the first position on the second screen, where the target number is the number of positions on the second screen at which no object is displayed. The determining module 502 is specifically configured to determine M sub-positions on the second screen in a case where the target number obtained by the obtaining module is greater than or equal to M, where each sub-position is used to display a different sub-object and the first position includes the M sub-positions; and to determine a first position on the second screen in a case where the target number is smaller than M, where the first position is used to display a target object, and the target object is used to indicate the M sub-objects.
Optionally, an arrangement order of the M sub-objects on the second screen is the same as an arrangement order of the M sub-objects on the first screen.
Optionally, the determining module 502 is specifically configured to, for each sub-object in the M sub-objects, determine, according to the information of the target sub-object, a sub-position corresponding to the target sub-object on the second screen, so as to determine M sub-positions on the second screen; the target sub-object is any one of the M sub-objects, and the information of the target sub-object includes an application type of the application indicated by the target sub-object or a usage frequency of the application indicated by the target sub-object.
Optionally, a clockwise rotation input from the first target screen toward the second target screen is used to trigger an object in the to-be-moved state on the first target screen to be moved to the second target screen; a counterclockwise rotation input from the second target screen toward the first target screen is used to trigger an object in the to-be-moved state on the second target screen to be moved to the first target screen; the first target screen is the first screen, the second target screen is the second screen, and the first input is a clockwise rotation input from the first screen toward the second screen.
The terminal device 50 provided in the embodiment of the present invention can implement each process implemented by the terminal device in the foregoing method embodiments, and for avoiding repetition, details are not described here again.
It should be noted that the terminal device provided in the embodiment of the present invention includes a first screen and a second screen. In a case where a first object in a to-be-moved state is displayed on the first screen, a first input of a user is received, where the first input is an input for controlling the terminal device to rotate; in response to the first input, a first position on the second screen is determined in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and the first object is controlled to move from the first screen to the first position for display. Based on this scheme, triggered by the user's rotation input, for example under the action of the gravity sensing of the terminal device, the terminal device can move the first object in the to-be-moved state on the first screen to the first position on the second screen for display. In other words, the terminal device moves the first object from the first screen to the second screen for display, that is, moves the first object across screens, by receiving the user's rotation input to the terminal device instead of an input of manually dragging the first object. Therefore, the convenience of moving objects (such as icons) across screens of the terminal device is improved.
Fig. 6 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention, where the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 6 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The terminal device includes a first screen and a second screen, and the user input unit 107 is configured to receive a first input of a user in a case where a first object in the to-be-moved state is displayed on the first screen, where the first input is an input for controlling the terminal device to rotate; the processor 110 is configured to determine, in response to the first input received by the user input unit 107, a first position on the second screen in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold, and to control the first object to move from the first screen to the first position for display.
It should be noted that the terminal device provided in the embodiment of the present invention includes a first screen and a second screen. In a case where a first object in a to-be-moved state is displayed on the first screen, a first input of a user is received, where the first input is an input for controlling the terminal device to rotate; in response to the first input, a first position on the second screen is determined in a case where it is determined that the rotation angle of the terminal device is greater than or equal to a first angle threshold; and the first object is controlled to move from the first screen to the first position for display. Based on this scheme, triggered by the user's rotation input, for example under the action of the gravity sensing of the terminal device, the terminal device can move the first object in the to-be-moved state on the first screen to the first position on the second screen for display. In other words, the terminal device moves the first object from the first screen to the second screen for display, that is, moves the first object across screens, by receiving the user's rotation input to the terminal device instead of an input of manually dragging the first object. Therefore, the convenience of moving objects (such as icons) across screens of the terminal device is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101, and then output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch panel 1071 transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to that type. Although in Fig. 6 the touch panel 1071 and the display panel 1061 are shown as two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device; this is not limited here.
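In the claimed flow, the second input that places an object into the to-be-moved state (see claim 2) would arrive through exactly this touch-panel-to-processor path. A minimal sketch follows, assuming a long press is used as the second input and that markAsToBeMoved is an application-level helper supplied by the caller; both choices are illustrative assumptions, since the patent does not prescribe a specific gesture.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent

// Hypothetical handler: a long press on an object shown on the first screen
// hands the touch-point coordinates to the application layer, which can then
// put that object into the to-be-moved state.
class IconTouchHandler(
    context: Context,
    private val markAsToBeMoved: (x: Float, y: Float) -> Unit
) {
    private val detector = GestureDetector(
        context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onLongPress(e: MotionEvent) {
                // Mirrors the touch controller -> processor flow described above:
                // the coordinates are forwarded and the event type (long press)
                // decides the response.
                markAsToBeMoved(e.x, e.y)
            }
        }
    )

    fun onTouchEvent(event: MotionEvent): Boolean = detector.onTouchEvent(event)
}
```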
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information or power) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and the external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the terminal device (such as audio data or a phonebook). Further, the memory 109 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device through various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, which are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, including a processor 110, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements each process of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, although in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the invention is not limited to those embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (12)
1. An object display method applied to a terminal device including a first screen and a second screen, comprising:
receiving a first input of a user under the condition that a first object in a to-be-moved state is included on the first screen, wherein the first input is an input for controlling the terminal device to rotate, the first object in the to-be-moved state does not support a deletion input of the user, and the first object in the to-be-moved state is an object that needs to be moved across screens;
in response to the first input, determining a first position on the second screen in a case that a rotation angle of the terminal device is determined to be greater than or equal to a first angle threshold, the rotation angle of the terminal device indicating a tilt angle and direction of the terminal device relative to a ground plane, the positions of the first screen and the second screen being fixed, and the terminal device being a foldable terminal device;
controlling the first object to move from the first screen to the first position along the gravity sensing direction for displaying;
the first object comprises M sub-objects, M being a positive integer;
the method further comprises, prior to determining the first location on the second screen:
acquiring a target number, wherein the target number is the number of positions on the second screen at which no object is displayed;
the determining a first location on the second screen comprises:
determining the first position on the second screen when the target number is less than M, wherein the first position is used for displaying a target object, and the target object is used for indicating the M sub-objects.
2. The method of claim 1, wherein prior to receiving the first input from the user, the method further comprises:
receiving a second input of the user, wherein the second input is an input of selecting the first object from the objects displayed on the first screen;
controlling the first object to be in the to-be-moved state in response to the second input.
3. The method according to claim 1, wherein an arrangement order of the M sub-objects on the second screen is the same as an arrangement order of the M sub-objects on the first screen.
4. The method of claim 1, wherein determining M sub-positions on the second screen comprises:
for each sub-object in the M sub-objects, determining a sub-position corresponding to a target sub-object on the second screen according to information of the target sub-object, so as to determine the M sub-positions on the second screen;
the target sub-object is any one of the M sub-objects, and the information of the target sub-object includes an application type of an application indicated by the target sub-object or a usage frequency of the application indicated by the target sub-object.
5. The method of claim 1,
the first input is a rotation input in a first direction from a first target screen to a second target screen, the first target screen is the first screen, and the second target screen is the second screen; or,
the first input is a rotation input in a second direction from the second target screen to the first target screen, the second target screen is the first screen, and the first target screen is the second screen.
6. An object display apparatus, comprising a first screen and a second screen, and further comprising:
a receiving module, a determining module, an obtaining module, and a control module, wherein:
the receiving module is configured to receive a first input of a user under the condition that the first screen includes a first object in a to-be-moved state, wherein the first input is an input for controlling a terminal device to rotate, the first object in the to-be-moved state does not support a deletion input of the user, and the first object in the to-be-moved state is an object that needs to be moved across screens;
the determining module is configured to determine a first position on the second screen in response to the first input received by the receiving module, in a case that a rotation angle of the terminal device is determined to be greater than or equal to a first angle threshold, the rotation angle of the terminal device indicating a tilt angle and direction of the terminal device relative to a ground plane, the positions of the first screen and the second screen being fixed, and the terminal device being a foldable terminal device;
the control module is configured to control the first object to move from the first screen to the first position determined by the determining module along the gravity sensing direction for display;
the first object comprises M sub-objects, M being a positive integer;
the obtaining module is configured to obtain a target number before the determining module determines the first position on the second screen, wherein the target number is the number of positions on the second screen at which no object is displayed;
the determining module is specifically configured to determine the first position on the second screen when the target number is less than M, wherein the first position is used to display a target object, and the target object is used to indicate the M sub-objects.
7. The apparatus of claim 6,
the receiving module is further configured to receive a second input of the user before receiving the first input of the user, where the second input is an input of selecting the first object from the objects displayed on the first screen;
the control module is further configured to control the first object to be in the to-be-moved state in response to the second input received by the receiving module.
8. The apparatus of claim 6, wherein an arrangement order of the M sub-objects on the second screen is the same as an arrangement order of the M sub-objects on the first screen.
9. The apparatus of claim 6,
the determining module is specifically configured to determine, for each of the M sub-objects, a sub-position corresponding to a target sub-object on the second screen according to information of the target sub-object, so as to determine M sub-positions on the second screen;
the target sub-object is any one of the M sub-objects, and the information of the target sub-object includes an application type of an application indicated by the target sub-object or a usage frequency of the application indicated by the target sub-object.
10. The apparatus of claim 6,
the first input is a rotation input in a first direction from a first target screen to a second target screen, the first target screen is the first screen, and the second target screen is the second screen; or,
the first input is a rotation input in a second direction from the second target screen to the first target screen, the second target screen is the first screen, and the first target screen is the second screen.
11. A terminal device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the object display method according to any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the object display method according to any one of claims 1 to 5.
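To make the interplay between claims 1 and 4 easier to follow, the sketch below models the placement decision on the second screen in plain Kotlin. It assumes, as one reading of the claims, that individual sub-positions are used when there are enough free positions and that a single folder-like target object is used when the target number is less than M; the names SubObject, Placement, and planPlacement are illustrative and do not come from the patent.

```kotlin
// Hypothetical model of an icon group being moved across screens.
data class SubObject(val name: String, val appType: String, val usageFrequency: Int)

sealed class Placement {
    // Claim 1: the target number is less than M, so one target object that
    // indicates all M sub-objects is shown at the first position.
    data class TargetObject(val atSlot: Int, val contents: List<SubObject>) : Placement()

    // Claim 4: one sub-position per sub-object, ordered here by usage
    // frequency (one of the two criteria the claim names).
    data class SubPositions(val slots: Map<Int, SubObject>) : Placement()
}

fun planPlacement(freeSlots: List<Int>, subObjects: List<SubObject>): Placement {
    require(freeSlots.isNotEmpty()) { "this sketch assumes at least one free position on the second screen" }
    return if (freeSlots.size < subObjects.size) {
        Placement.TargetObject(atSlot = freeSlots.first(), contents = subObjects)
    } else {
        val ordered = subObjects.sortedByDescending { it.usageFrequency }
        Placement.SubPositions(freeSlots.zip(ordered).toMap())
    }
}
```

For example, calling planPlacement(listOf(7), listOf(mail, camera, notes)) with hypothetical sub-objects would return a TargetObject at position 7 containing all three, which corresponds to the case where the target number is smaller than M.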
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811613598.6A CN109857292B (en) | 2018-12-27 | 2018-12-27 | Object display method and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109857292A CN109857292A (en) | 2019-06-07 |
CN109857292B true CN109857292B (en) | 2021-05-11 |
Family
ID=66892702
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811613598.6A Active CN109857292B (en) | 2018-12-27 | 2018-12-27 | Object display method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109857292B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112162667A (en) * | 2020-09-28 | 2021-01-01 | 维沃移动通信有限公司 | Object display control method and device and electronic equipment |
CN112558851B (en) * | 2020-12-22 | 2023-05-23 | 维沃移动通信有限公司 | Object processing method, device, equipment and readable storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105068745A (en) * | 2015-07-27 | 2015-11-18 | 惠州Tcl移动通信有限公司 | Method and system for operating display screen of mobile phone by pressure touch and rotation detection |
CN105549858A (en) * | 2015-11-30 | 2016-05-04 | 东莞酷派软件技术有限公司 | Display method and user terminal |
CN106155476A (en) * | 2016-06-21 | 2016-11-23 | 努比亚技术有限公司 | Mobile terminal and screen content changing method |
CN106933453A (en) * | 2017-03-20 | 2017-07-07 | 维沃移动通信有限公司 | The method for sorting and mobile terminal of a kind of desktop icons |
CN108228029A (en) * | 2018-01-09 | 2018-06-29 | 维沃移动通信有限公司 | The method for sorting and mobile terminal of a kind of icon |
CN108717343A (en) * | 2018-04-08 | 2018-10-30 | Oppo广东移动通信有限公司 | Application icon processing method, device and mobile terminal |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7996792B2 (en) * | 2006-09-06 | 2011-08-09 | Apple Inc. | Voicemail manager for portable multifunction device |
EP3734404A1 (en) * | 2011-02-10 | 2020-11-04 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
CN105426067A (en) * | 2014-09-17 | 2016-03-23 | 中兴通讯股份有限公司 | Desktop icon replacement method and apparatus |
CN106648284A (en) * | 2015-11-04 | 2017-05-10 | 中国移动通信集团公司 | Method, device and terminal for icon sequencing |
KR102524190B1 (en) * | 2016-06-08 | 2023-04-21 | 삼성전자 주식회사 | Portable apparatus having a plurality of touch screens and control method thereof |
CN108052247A (en) * | 2017-11-29 | 2018-05-18 | 努比亚技术有限公司 | Desktop icons method of adjustment, mobile terminal and computer readable storage medium |
CN109407932B (en) * | 2018-10-31 | 2020-10-30 | 维沃移动通信有限公司 | Icon moving method and mobile terminal |
2018-12-27: Application CN201811613598.6A filed in China; granted as CN109857292B (en), status Active.
Also Published As
Publication number | Publication date |
---|---|
CN109857292A (en) | 2019-06-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||