CN110703970B - Information processing method and electronic equipment - Google Patents
- Publication number
- CN110703970B (application CN201910906953.7A)
- Authority
- CN
- China
- Prior art keywords
- display
- interactive object
- focus area
- area
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses an information processing method and an electronic device, relates to the field of electronic information processing, and is designed to improve user satisfaction. The method is applied to an electronic device comprising a display interaction unit, where the display interaction unit is used to display an interactive object and to receive operations that an operation body applies to the interactive object. The method comprises: at a first moment, determining a focus area and forming a first instruction, where the focus area is all or part of the area corresponding to the display interaction unit and is an area the operation body can reach in the first-moment interaction state, and the first instruction is for causing a first interactive object to be displayed at least partially in the focus area; and displaying at least part of the first interactive object in the focus area according to the first instruction. Because the first interactive object is displayed in a focus area reachable by the operation body, operation by the operation body is made easier and user satisfaction is improved.
Description
Technical Field
The present invention relates to the field of information processing, and in particular, to an information processing method and an electronic device.
Background
With the development of electronic information technology, the interactive interfaces of handheld electronic devices (such as large-screen mobile phones, tablet computers, and e-book readers) have grown larger and larger. Information that needs to be confirmed by the user, or an area that receives user instructions, is often displayed in a region the user cannot reach with one hand. This makes operation inconvenient and degrades the user experience.
Disclosure of Invention
In view of this, embodiments of the present invention aim to provide an information processing method and an electronic device that improve the intelligence of the electronic device, so that an operation body, such as a user, can interact with the electronic device without changing its interaction state.
To this end, the technical solution of the invention is realized as follows:
the invention provides an information processing method, which is applied to electronic equipment comprising a display interaction unit; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the method comprises the following steps:
at a first moment, determining a focus area and forming a first instruction; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
and displaying at least part of the first interactive object in the focus area according to the first instruction.
Preferably, the determining the focus area comprises:
before a first moment, the display interaction unit receives a first operation input by an operation body;
determining a first position of the display interaction unit acted by the first operation;
and determining the area within a preset distance from the first position as a focus area.
Preferably,
the electronic device includes a sensing unit;
the determining the focus area includes:
the sensing unit detects that an operation body is in a reachable area corresponding to the display interaction unit in the first time interaction state;
determining the focal area in dependence on the reachable area.
Preferably,
the determining the focus area includes:
counting the positions of the historical operations of the operation body acting on the display interaction unit within the specified time to form a statistical result;
and determining the focus area according to the statistical result.
Preferably, after the displaying at least part of the first interactive object in the focus area according to the first instruction, the method further comprises:
displaying a second interactive object associated with the first interactive object at least partially in the focus area.
Preferably,
displaying the first interactive object and the second interactive object at least partially in the focus area comprises:
determining a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the first interactive object and the second interactive object according to the relative display position parameter.
Preferably, before displaying the first interactive object and/or the second interactive object in the focus area, the method further comprises:
receiving a second operation acting on a third interactive object;
responding to the second operation, and generating the first interactive object and/or the second interactive object;
the determining, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit includes:
determining a relative display position parameter of the first interactive object and the second interactive object according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object and the position of the focus area;
and adjusting and displaying the relative display position relation of the first interactive object and the second interactive object on the display interactive unit according to the relative display position parameter.
Preferably, the displaying at least part of the first interactive object in the focus area according to the first instruction comprises:
determining a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
Preferably,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
A second aspect of the present invention provides an electronic device, comprising a display interaction unit; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit;
the processing unit is used for determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
Further,
the display interaction unit is further used for receiving a first operation input by an operation body before a first moment;
the processing unit is specifically configured to determine a first position where the first operation acts on the display interaction unit; and determining the area within a preset distance from the first position as a focus area.
Further,
the electronic device includes a sensing unit;
the sensing unit detects that an operation body is in a reachable area corresponding to the display interaction unit in the first time interaction state;
the processing unit is specifically configured to determine the focal area based on the reachable region.
Further,
the processing unit is specifically used for counting the positions of historical operations of the operation body acting on the display interaction unit within the specified time to form a statistical result; and determining the focus area according to the statistical result.
Further,
the display interaction unit is further configured to display, after the at least part of the first interaction object is displayed in the focus area according to the first instruction, a second interaction object associated with the first interaction object at least partially in the focus area.
Further,
the processing unit is further configured to determine, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit;
the display interaction unit is specifically configured to display the first interaction object and the second interaction object according to the relative position parameter.
Further,
the display interaction unit is further used for receiving a second operation acting on a third interaction object before the first interaction object and/or the second interaction object are/is displayed in the focus area; responding to the second operation, and generating the first interactive object and/or the second interactive object;
the processing unit is specifically configured to determine, according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object, and the position of the focal region, a relative display position parameter of the first interactive object and the second interactive object;
and the display interaction unit is further used for adjusting and displaying the relative display position relationship of the first interaction object and the second interaction object on the display interaction unit according to the relative display position parameter.
Further,
the processing unit is further configured to determine a first display parameter of the first interactive object according to the first instruction and the size of the focal region;
the display interaction unit is further configured to display the first interaction object in the focus area according to the first display parameter.
Further,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
According to the information processing method and the electronic device, before a first interactive object is displayed, a focus area for displaying the first interactive object is determined, and at least part of the first interactive object is displayed in the focus area; the focus area is an area on the display interaction unit which can be touched by an operation body (such as a user), so that the operation body can conveniently touch or approach the first interaction object under the condition that the current (such as the first moment) interaction state is not changed, and input of a control instruction to the electronic equipment is completed.
Drawings
Fig. 1 is a schematic view illustrating a display effect of an electronic device according to an embodiment of the invention;
FIG. 2 is a schematic flowchart of an information processing method according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of determining a focus area according to an embodiment of the present invention;
FIG. 4 is a second flowchart illustrating an information processing method according to an embodiment of the invention;
FIG. 5 is a diagram illustrating an effect displayed by the information processing method according to the embodiment of the present invention;
FIG. 6 is a second schematic view illustrating the effect of the information processing method according to the embodiment of the invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is further described in detail with reference to the drawings and the specific embodiments of the specification.
Method embodiment one:
as shown in fig. 1, the present embodiment provides an information processing method, which is applied to an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body on the interaction object;
as shown in fig. 2, the method includes:
step S110: at a first moment, determining a focus area and forming a first instruction; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operation body in the first-moment interaction state; the first instruction is for causing a first interactive object to be displayed at least partially in the focus area; in fig. 1, the focus area is an area 111 on the display interaction unit 110 that the user can reach or approach when holding the electronic device from one side;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
The display interaction unit generally includes a display screen that can also receive the user's touch or proximity control operations. The focus area in step S110 may be the whole display screen or part of it. The first-moment interaction state is the state in which the operation body (such as a user) interacts with the electronic device at the first moment while keeping its operating posture and the device's placement posture unchanged; a typical example is the user holding the electronic device with one hand. In the current interaction state, part of the display interaction unit may be an area the operation body cannot reach.
The display interaction unit of such an electronic device has a large area. In the first-moment interaction state, i.e. holding the device with one hand, part of the display interaction unit is inaccessible to the user unless the user changes the holding hand or makes a large change in holding posture. In a specific application scenario, the electronic device is a large-screen mobile phone or a mini tablet computer and the user is on a bus at the first moment; while holding the handrail with one hand, the user can operate the device only with the other hand, and this interaction state between the operation body (such as the user) and the electronic device is the first-moment interaction state.
For another example, the user is a child with small hands and little strength, so holding the device with one hand risks the device slipping out; the user therefore has to hold the electronic device with both hands, and the current interaction state is two-handed holding. When the device is held with both hands, the area of the display interaction unit the user can reach or approach is limited, for example to the range the thumbs of both hands can cover on the display interaction unit. The current moment is the first moment, and the current interaction state is the first-moment interaction state.
In step S110, an instruction is formed after the focus area is determined; the instruction is used for adjusting the display position of a first interactive object to be interacted with an operation body, so that at least part of the first interactive object is displayed in the focus area.
Displaying the first interactive object at least partially in the focus area covers the following two cases:
The first case: the first interactive object is displayed entirely in the focus area. This generally applies when the display area of the first interactive object is not larger than the focus area.
The second case: only part of the first interactive object is displayed in the focus area. This typically occurs when the display area of the first interactive object is larger than the focus area, so the display interaction unit has to show part of it outside the focus area; it may also occur even when the display area of the first interactive object is not larger than the focus area. In either case, what matters is that the interaction items the user must operate, such as controls and selection options, are preferentially displayed in the focus area.
Specifically, when the first interactive object displays a lot of content and the user must input a confirmation before subsequent operations can proceed, the 'OK' control the user needs to click or approach is displayed in the focus area reachable by the user; the remaining content of the first interactive object may be displayed outside the focus area.
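As a minimal illustration of this layout rule (not the patent's claimed implementation), the sketch below places an assumed 'OK' control fully inside the focus area, while the rest of the interactive object may extend beyond it; the rectangle convention `(x, y, w, h)` and the function name are hypothetical.

```python
def place_confirm_control(control_size, focus_rect):
    """Place an 'OK' control of an oversized interactive object inside the
    focus area. Rectangles are (x, y, w, h); control_size is (w, h).
    The control is centred in the focus area and clamped to its bounds."""
    fx, fy, fw, fh = focus_rect
    cw, ch = control_size
    # Centre the control, then clamp so it stays fully inside the focus area.
    cx = min(max(fx + (fw - cw) / 2, fx), fx + fw - cw)
    cy = min(max(fy + (fh - ch) / 2, fy), fy + fh - ch)
    return (cx, cy, cw, ch)
```

For example, with a 540x720 focus area in the lower-left of a 1080x1920 screen, a 200x100 control would be placed entirely within that region regardless of where the rest of the object is drawn.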
In step S120, the first instruction may be an operation instruction carrying a display position parameter of the first interactive object; the instruction can be constructed in many ways, which are not expanded on here.
In summary, the information processing method of this embodiment determines the focus area of the display interaction unit according to the interaction state at the first moment and displays the first interactive object to be interacted with in the focus area as far as possible, facilitating the operation of the operation body. The operation body can thus complete the interaction with the electronic device easily, without changing its operating posture or the placement posture of the device, which improves the intelligence of the device and user satisfaction.
Method embodiment two:
as shown in fig. 1, the present embodiment provides an information processing method, which is applied to an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body on the interaction object;
as shown in fig. 2, the method includes:
step S110: at a first moment, determining a focus area and forming a first instruction; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
As shown in fig. 3, the determining the focus area in step S110 includes the following steps:
step S111: before a first moment, the display interaction unit receives a first operation input by an operation body;
step S112: determining a first position of the display interaction unit acted by the first operation;
step S113: determining an area within a preset distance from the first position as a focus area;
Usually the interaction state at the first moment continues from the interaction state shortly before it: the same interaction state is kept, or changes only slightly, so the area of the display interaction unit reachable by the operation body remains the same or nearly the same. This embodiment therefore determines the focus area from a first operation performed before the first moment. "Before the first moment" in step S111 is preferably the moment of the most recent operation input by the operation body; that is, the first operation is preferably the last operation of the operation body (e.g. the user) before the first moment.
The first operation can be a click, slide, or press operation, and the first position may be an interaction point or an interaction area. The operation body interacts with the electronic device by touch interaction or proximity interaction: in touch interaction, the operation body inputs operations by touching the display interaction unit; in proximity interaction, the operation body completes input with gestures within a preset range above the display interaction unit. The reachable area therefore comprises both the area of the display interaction unit the operation body can touch in the first-moment interaction state and the corresponding area it can approach closely enough to complete operation input.
The first operation may be an operation dedicated to determining the focus area, or an operation the operation body performs for another purpose before the first moment. For example, the user inputs an operation to wake the electronic device before the first moment; the operation body touches or approaches the display interaction unit while doing so, and that operation can serve as the first operation of this embodiment for determining the focus area at the first moment.
In step S113, the preset distance may be a pre-stored distance. It may be determined from historical data on the area the same operation body can reach around a given interaction position in the interaction state corresponding to the first moment; from statistical data on the areas different operation bodies can reach around a given interaction position in that interaction state; or by simulation or estimation that combines the user's own interaction-state data with statistics over many users. There are many ways to obtain the preset distance, which are not detailed further here.
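Steps S111 to S113 can be sketched as follows; the rectangle convention `(x, y, w, h)` and the function name are assumptions for illustration, not the patent's implementation. The focus area is taken as the square region within the preset distance of the last touch position, clipped to the screen.

```python
def focus_area_from_touch(pos, preset_distance, screen):
    """Focus area = region within `preset_distance` of the last touch
    position `pos`, clipped to the screen rectangle (x, y, w, h)."""
    px, py = pos
    sx, sy, sw, sh = screen
    left = max(px - preset_distance, sx)
    top = max(py - preset_distance, sy)
    right = min(px + preset_distance, sx + sw)
    bottom = min(py + preset_distance, sy + sh)
    return (left, top, right - left, bottom - top)
```

A touch near a screen edge yields a focus area truncated at that edge, so the result is always a valid on-screen region.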
Compared with the first method embodiment, the method of this embodiment provides a concrete way of determining the focus area from a first operation before the first moment, and has the advantage of being simple to implement.
Method embodiment three:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein the electronic device includes a sensing unit, and the determining the focus area in step S110 may include:
the sensing unit detects that an operation body is in a reachable region corresponding to the display interaction unit in the first time interaction state;
determining the focal area in dependence on the reachable area.
The sensing unit can determine the reachable area from changes in sensing signals, such as electric or magnetic signals, as the operation body moves above the display interaction unit. For example, when a user's finger moves over the display interaction unit, or stays in a certain area above it, the finger may cut magnetic induction lines; the sensing unit then determines the reachable area from the magnetic flux or its change.
For another example, when the user's finger moves or stays within a preset distance above different areas of the display interaction unit, the light flux over those areas differs, so the reachable area can be determined from the differences in light flux.
In practice, there are many ways to detect the operation body in the reachable area corresponding to the display interaction unit; they are not listed exhaustively here.
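One possible post-processing step, sketched under the assumption that the sensing unit yields a list of sensed hover positions, is to take their bounding box, expanded by a margin, as the reachable area; the function name, margin parameter, and rectangle convention are illustrative, not the patent's method.

```python
def reachable_area_from_samples(hover_points, margin, screen):
    """Estimate the reachable region as the bounding box of sensed hover
    positions (x, y), expanded by `margin` and clipped to the screen
    rectangle (x, y, w, h). Assumes at least one sample was sensed."""
    xs = [p[0] for p in hover_points]
    ys = [p[1] for p in hover_points]
    sx, sy, sw, sh = screen
    left = max(min(xs) - margin, sx)
    top = max(min(ys) - margin, sy)
    right = min(max(xs) + margin, sx + sw)
    bottom = min(max(ys) + margin, sy + sh)
    return (left, top, right - left, bottom - top)
```

The focus area can then be taken as this region, or as its intersection with a candidate display region for the first interactive object.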
On the basis of the first method embodiment, determining the focus area through the sensing unit offers high device intelligence and user satisfaction while remaining simple to implement.
Method embodiment four:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body on the interaction object;
as shown in fig. 2, the method includes:
step S110: at a first moment, determining a focus area and forming a first instruction; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein the determining the focus area in step S110 may include:
counting the positions of the historical operations of the operation body acting on the display interaction unit within the specified time to form a statistical result;
and determining the focus area according to the statistical result.
Before the first moment, the electronic device may have received several operations input by the operation body; in this embodiment the specified time preferably covers several moments before the first moment. From the operations performed within that time, the positions at which the operation body acted on the display interaction unit can be counted. This reveals which areas of the display interaction unit took part in interaction with the electronic device within the specified time, and in which areas operation input was most frequent; the focus area can then be determined according to a pre-stored selection strategy.
For example, all areas that received input operations from the operation body within the specified time may be included in the focus area; alternatively, the area that most frequently received input operations may be set as the focus area.
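The second strategy, choosing the area that most frequently received input, might be sketched as a simple grid histogram over logged touch positions; the grid-cell approach and all parameters are assumptions for illustration, not the patent's method.

```python
from collections import Counter

def focus_area_from_history(touch_log, cell, screen_w, screen_h):
    """Pick the grid cell that received the most operations in the window.

    touch_log: iterable of (x, y) positions within the specified time
    (assumed non-empty). cell: square cell size in pixels.
    Returns the winning cell as (x, y, w, h), clipped to the screen."""
    counts = Counter((int(x // cell), int(y // cell)) for x, y in touch_log)
    (cx, cy), _ = counts.most_common(1)[0]
    w = min(cell, screen_w - cx * cell)
    h = min(cell, screen_h - cy * cell)
    return (cx * cell, cy * cell, w, h)
```

A real implementation might also weight recent operations more heavily, consistent with the observation below that the interaction state is continuous over time.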
In addition, an operation body (such as a user) usually has a habitual interaction state, for example holding the electronic device with one hand while interacting with it. The first moment is very likely still in this habitual state, so the interaction state at the first moment can be inferred, and the focus area in that state can easily be derived from the history.
This embodiment further improves on the first method embodiment: exploiting the continuity of the operation body's interaction state over time and its habitual interaction state with the electronic device, it provides a way of determining the focus area from historical operation statistics, improving the intelligence of the electronic device and user satisfaction while remaining simple to implement.
Method embodiment five:
as shown in fig. 1, the present embodiment provides an information processing method, which is applied to an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, as shown in fig. 4, the method further comprises:
step S130: displaying a second interactive object associated with the first interactive object at least partially in the focus area.
This embodiment further improves on any of the above method embodiments: through the added step S130, at least part of a second interactive object associated with the first interactive object is also displayed in the focus area, which further facilitates the user's operation.
Specifically, for example, at a first time the electronic device displays in the focus area a first interactive object to be operated by the user, such as a video player. The operation body opens the video player through interaction, so the user is then highly likely to select and play a video file already stored on the electronic device; therefore, for convenience of operation, each video file, or the folder in which the video files are stored, can also be displayed in the focus area.
The video player is the first interactive object; each video file or the folder in which each video file is stored is the second interactive object associated with the first interactive object.
In a specific implementation process, the association between the first interactive object and the second interactive object may be established in advance, or may be determined by the electronic device according to the content carried by each of its interactive objects and the operational associations between them.
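A pre-established association could be as simple as a lookup table from a first interactive object to its candidate second interactive objects; the object identifiers and registry shape below are hypothetical:

```python
# Hypothetical pre-established association table; a real device might
# instead infer these entries from content and operational relations.
ASSOCIATIONS = {
    "video_player": ["video_folder", "recent_videos"],
    "music_player": ["song_a", "song_b"],
}

def associated_objects(first_object, registry=ASSOCIATIONS):
    """Return the second interactive objects associated with the given
    first interactive object; empty when no association exists."""
    return registry.get(first_object, [])
```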
For another example, the electronic device displays the first interactive object, song A, in the focus area at a first time; this indicates that the user may be listening to songs on the electronic device. Another song B is also stored in the electronic device, so song B can be displayed at least partially in the focus area as a second interactive object associated with song A.
Based on any of the above method embodiments, the information processing method in this embodiment not only displays at least part of the first interactive object in the focus area, but also displays at least part of the second interactive object associated with the first interactive object in the focus area, which again facilitates the operation of the user, and improves the intelligence of the electronic device and the usage satisfaction of the user.
Method embodiment six:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, as shown in fig. 4, the method further comprises:
step S130: displaying a second interactive object associated with the first interactive object at least partially in the focus area.
Specifically, the step S130 may include:
determining a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the first interactive object and the second interactive object according to the relative position parameter.
In this embodiment, the relative position parameter includes position parameter relations such as the distance between the first interactive object and the second interactive object, and the ordering of the first interactive object and the second interactive object.
The content attributes of the first interactive object and the second interactive object may determine the probability that each is operated by the user and the operation the electronic device performs when it is operated; the position of the focus area at least partially determines the display positions of the first and second interactive objects; and the size of the focus area determines their layout and display shapes.
In fig. 5, the content attribute of the "confirm" interactive object is confirming deletion of file A, and the content attribute of the "cancel" interactive object is canceling deletion of file A. The two interactive objects operate on the same file A and are therefore associated in operation. If "confirm" corresponds to the first interactive object, then "cancel" corresponds to the second interactive object.
As can be seen from the left diagram of fig. 5, the second interactive object "cancel" is not displayed in the focus area 111 of the display interaction unit 110. The distance between the first interactive object and the second interactive object is d1, and the distance between the first interactive object and the edge of the display interaction area is d3.
With the method of this embodiment, the second interactive object "cancel" first needs to be displayed at least partially in the focus area 111; to achieve this, with the information processing method of this embodiment, the first interactive object and the second interactive object are displayed as shown in the right diagram of fig. 5.
The distance between the first interactive object and the second interactive object changes from the original d1 to d2, and the distance between the first interactive object and the edge of the display interaction unit changes from the original d3 to d4.
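The adjustment of the relative display position parameter can be sketched in one dimension: given the widths of the two interactive objects and the extent of the focus area, new positions are computed so that both objects, with their separation (the counterpart of d2), fall inside the focus area. The function name and the left-alignment policy are assumptions for illustration only:

```python
def layout_in_focus(widths, focus_start, focus_end, gap):
    """Place interactive objects side by side inside the focus interval
    [focus_start, focus_end], separated by `gap`.
    Hypothetical 1-D sketch of the relative-position adjustment."""
    total = sum(widths) + gap * (len(widths) - 1)
    if total > focus_end - focus_start:
        raise ValueError("objects do not fit in the focus area")
    # left-align within the focus area; return (start, end) per object
    x = focus_start
    rects = []
    for w in widths:
        rects.append((x, x + w))
        x += w + gap
    return rects
```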
In summary, on the basis of the previous embodiment, this embodiment further describes how to display the first interactive object and its associated second interactive object in the focus area, which facilitates user operation and has the advantage of simple implementation.
Method embodiment seven:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, as shown in fig. 4, the method further comprises:
step S130: displaying a second interactive object associated with the first interactive object at least partially in the focus area.
Specifically, the step S130 may include:
determining a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the first interactive object and the second interactive object according to the relative position parameter.
Further, before displaying the first interactive object and/or the second interactive object in the focus area, the method further comprises:
receiving a second operation acting on a third interactive object;
responding to the second operation, and generating the first interactive object and/or the second interactive object;
the determining, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit includes:
determining a relative display position parameter of the first interactive object and the second interactive object according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object and the position of the focus area;
and adjusting and displaying the relative display position relation of the first interactive object and the second interactive object on the display interactive unit according to the relative display position parameter.
Specifically, the second operation deletes file A displayed on the display interaction unit; the interactive object corresponding to file A is the third interactive object. After the electronic device responds to the second operation, the first interactive object "confirm" and the second interactive object "cancel" are formed.
Specifically, as shown in the left image of fig. 6, the first interactive object "confirm" is only partially displayed in the focus area 111 of the display interaction unit 110, while the second interactive object "cancel" is displayed completely within the focus area 111. Since the second operation deletes the third interactive object, file A, the first interactive object is obviously more likely to be touched by the user than the second interactive object; yet at this moment the first interactive object is only partially displayed in the focus area, so the user clearly needs more effort to touch the first interactive object than the second.
In order to solve the above problem, in this embodiment the display parameters of the first and second interactive objects are determined from the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object, and the position of the focus area; in this way, the display of each interactive object is determined by considering not only the content attributes of the first and second interactive objects but also the third interactive object and the second operation.
After the information processing method in this embodiment is executed, the first interactive object and the second interactive object are adjusted from the display effect in the left image in fig. 6 to the display effect in the right image in fig. 6.
The first interactive object is adjusted from the left side to the right side of the second interactive object and is now completely located in the focus area. Meanwhile, the distance between the first interactive object and the second interactive object changes from D1 to D2; the distance between the second interactive object and the edge of the display interaction unit 110 is D3 in the left diagram of fig. 6, while the distance between the first interactive object and the edge of the display interaction unit 110 in the right diagram of fig. 6 is D4. Clearly, D4 is less than D3, and D2 is less than D1. By changing the relative display position parameter, the first interactive object, which is more likely to be touched by the user, is located completely in the focus area and is thus easier for the user to touch; meanwhile, most of the second interactive object is also located in the focus area, and the operation body (e.g. the user) can clearly touch the first interactive object more easily than in the left diagram of fig. 6.
For another example, the user enters the Nth-layer page of a website with a hierarchical structure, where N is greater than 2. When navigating between the Nth-layer page and the first-layer page, the page-selection operation can correspond to two interactive objects: one is "next page" and the other is "previous page". Suppose the user has operated "next page" several times in a row. If, at the first time, only one of these interactive objects can be displayed in the focus area of the display interaction unit, the display positions of "next page" and "previous page" are preferably exchanged according to the previous operations, so that "next page" is displayed in the focus area, which is convenient for the user to operate.
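The page-button example can be sketched as ordering the candidate interactive objects by how often each was operated recently, so that the most likely one lands in the focus area. The button identifiers and the tie-breaking rule are assumptions:

```python
from collections import Counter

def order_page_buttons(recent_ops, buttons=("prev_page", "next_page")):
    """Order the paging buttons so the most frequently used one comes
    first, i.e. is the one placed into a focus area that can hold only
    a single interactive object. Hypothetical sketch."""
    counts = Counter(recent_ops)
    # most common operation first; stable sort keeps the default order
    # for ties, so buttons are only swapped when history justifies it
    return sorted(buttons, key=lambda b: -counts[b])
```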
In summary, this embodiment provides a more preferable method for the relative display position parameters of the first interactive object and the second interactive object based on the above embodiment, and adjusts the relative position relationship between the first interactive object and the second interactive object according to the relative display position parameters, so as to improve the intelligence of the device again and facilitate the user operation.
Method embodiment eight:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: at a first time, determining a focus area and forming a first instruction; the focus area is the whole or a partial area corresponding to the display interaction unit, and is an area that the operation body can reach in the interaction state at the first time; the first instruction is used to cause a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, the step S120 may specifically include:
determining a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
In a specific implementation, the area of the first interactive object may be larger than the area of the focus area. In this embodiment, in order to enable the first interactive object to be completely displayed in the focus area, after the first instruction is received, the first display parameter is further determined according to the first instruction and the size of the focus area. The first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object. When the first interactive object is displayed according to the first display parameter, it can be displayed completely in the focus area, which facilitates the operation of the user.
When the first interactive object is displayed according to the first display parameter, its display area, display shape and display position may change. For example, when the focus area is a circular area, the first interactive object can be adjusted from its original rectangular shape to a circular one, so that it retains sufficient display area while using the area of the focus area as fully as possible.
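The fitting of a rectangular interactive object into a circular focus area can be sketched as deriving an area parameter by uniform scaling; scaling by the diagonal-to-diameter ratio is one possible heuristic for this, not the patented method:

```python
import math

def fit_to_focus(obj_w, obj_h, focus_radius):
    """Scale a rectangular interactive object so it fits inside a
    circular focus area of the given radius, preserving aspect ratio.
    Hypothetical sketch of deriving an area/shape display parameter."""
    # a rectangle fits inside the circle when its diagonal <= diameter
    diagonal = math.hypot(obj_w, obj_h)
    scale = min(1.0, (2 * focus_radius) / diagonal)
    return obj_w * scale, obj_h * scale
```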
In summary, the information processing method according to this embodiment is improved based on the previous method embodiment, so that the intelligence of the device is improved and the satisfaction of the user is improved.
Device embodiment one:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by an operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focal region according to the first instruction.
The specific structure of the display interaction unit 110 may include a screen; the screen can be a liquid crystal screen or an OLED screen; the screen is used for displaying information which needs to be displayed to an operation body (such as a user) of the electronic equipment, and meanwhile, various input operations of the operation body are received so as to control the operation of the electronic equipment.
The processing unit 120 may be a processor included in the electronic device; the processor is an electronic component with processing function, such as a central processing unit, a microprocessor, a digital signal processor, a programmable logic circuit and the like.
The electronic device can be a smart phone, a tablet computer, an electronic book, or the like, and is preferably a mobile electronic device.
The electronic device described in this embodiment provides hardware support for the information processing method described in the first embodiment of the method, and can be used to implement any of the technical solutions described in the first embodiment of the method.
Device embodiment two:
As shown in fig. 1 and fig. 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focal region according to the first instruction.
The display interaction unit 110 is further configured to receive, before a first time, a first operation input by an operation body;
the processing unit 120 is specifically configured to determine a first position where the first operation acts on the display interaction unit; and determining an area within a preset distance from the first position as a focus area.
The processing unit 120 may include a receiving interface and a calculator;
the receiving interface is used for receiving the first position on the display interaction unit 110 at which the first operation acts;
and the calculator is used for calculating the focus area from the first position received by the receiving interface and the preset distance.
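The calculator's job can be sketched as a point-in-disc test: a display position belongs to the focus area when it lies within the preset distance of the first position reported by the receiving interface. A minimal sketch; the names are assumptions:

```python
def in_focus(point, first_pos, preset_distance):
    """Whether a display position lies in the focus area, defined as
    the region within `preset_distance` of where the first operation
    acted on the display interaction unit. Hypothetical sketch."""
    dx = point[0] - first_pos[0]
    dy = point[1] - first_pos[1]
    # compare squared distances to avoid an unnecessary square root
    return dx * dx + dy * dy <= preset_distance ** 2
```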
The electronic device described in this embodiment provides specific hardware support for the information processing method described in method embodiment two, can implement any technical scheme of the information processing method described in method embodiment two, and has the same advantages of high intelligence and high user satisfaction.
Device embodiment three:
as shown in fig. 1 and fig. 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
Wherein the electronic device comprises a sensing unit;
the sensing unit is configured to detect the area that the operation body can reach on the display interaction unit in the interaction state at the first time;
the processing unit 120 is specifically configured to determine the focus area according to the reachable area.
The specific structure of the sensing unit varies with the manner in which the accessible region is acquired; specifically, the structure may include a light flux sensor, a magnetic flux sensor, and the like.
The electronic device described in this embodiment provides a specific hardware support for the information processing method described in method embodiment three, can implement any technical scheme of the information processing method described in method embodiment three, and has the same advantages of high intelligence and high user satisfaction.
Device embodiment four:
as shown in fig. 1 and fig. 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
The display interaction unit 110 is further configured to receive, before a first time, a first operation input by an operation body;
the processing unit 120 is specifically configured to count positions where historical operations of the operation body act on the display interaction unit within a specified time, and form a statistical result; and determining the focus area according to the statistical result.
The processing unit 120 may include a storage medium in addition to a processor; the storage medium records the positions of the historical operations acting on the display interaction unit. The processor is connected with the storage medium through a bus; the processing unit 120 may also include an external communication interface through which the processor may interact with peripherals.
The electronic device described in this embodiment provides specific hardware support for the information processing method described in the fourth embodiment of the method, can be used to implement the technical solution described in any of the fourth embodiment of the method, and has the advantages of high intelligence and user satisfaction, as well as simple structure and easy implementation.
Device embodiment five:
as shown in fig. 1 and fig. 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is further configured to, after the at least part of the first interaction object is displayed in the focus area according to the first instruction, display at least part of a second interaction object associated with the first interaction object in the focus area.
The display interaction unit 110 may include, in addition to the display screen, a display processor connected to the display screen; the display processor may be configured to control the display interaction unit 110 to display the first interactive object in the focus area according to the first instruction, and simultaneously display the second interactive object in the focus area. The display processor may be a processor separate from the processing unit 120, a logic device within the processing unit 120, or the same high-performance processor serving as the processing unit 120.
The structure of the display interaction unit 110 is various, and is not illustrated here.
The embodiment is further defined on the basis of any one of the above device embodiments; any technical scheme of the information processing method in the fifth embodiment of the method can be realized, and the method has the advantages of high intelligence and high user satisfaction.
Device embodiment six:
as shown in fig. 1 and fig. 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by an operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is further configured to, after the displaying of at least part of the first interaction object in the focus area according to the first instruction, display at least part of a second interaction object associated with the first interaction object in the focus area.
The processing unit 120 is further configured to determine, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focal region, a relative display position parameter of the first interactive object and the second interactive object on the display interaction unit;
the display interaction unit 110 is specifically configured to display the first interaction object and the second interaction object according to the relative position parameter.
The content attribute includes the display content itself, the meaning of the content itself, and display parameters such as the content display mode and the display size.
The processing unit 120 determines a relative display position parameter of the first interactive object and the second interactive object according to the first interactive object, the second interactive object and the focus area; the display interaction unit 110 then displays the first interactive object and the second interactive object according to the relative position parameter. Since the display parameters differ between the earlier and later times, this appears as an adjustment of the relative positional relationship of the first and second interactive objects on the display interaction unit.
The embodiment is further defined on the basis of any one of the above device embodiments; any technical scheme of the information processing method in the sixth embodiment of the method can be implemented, and the method has the advantages of high intelligence and high user satisfaction.
Device embodiment seven:
As shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by an operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is further configured to, after the displaying of at least part of the first interaction object in the focus area according to the first instruction, display at least part of a second interaction object associated with the first interaction object in the focus area.
The processing unit 120 is further configured to determine, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit;
the display interaction unit 110 is specifically configured to display the first interaction object and the second interaction object according to the relative position parameter.
The content attribute includes the display content itself, the meaning of the content itself, and display parameters such as the content display mode and the display size.
The display interaction unit 110 is further configured to receive a second operation acting on a third interaction object before the first interaction object and/or the second interaction object are displayed in the focus area; responding to the second operation, and generating the first interactive object and/or the second interactive object;
the processing unit 120 is specifically configured to determine a relative display position parameter of the first interactive object and the second interactive object according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object, and the position of the focus area;
the display interaction unit 110 is further configured to adjust and display a relative display position relationship between the first interaction object and the second interaction object on the display interaction unit according to the relative display position parameter.
In this embodiment, the display interaction unit 110 further receives a second operation before the first time; the second operation may be an operation input specifically for determining the focus position at the first time, or may be an operation when the operating body instructs the electronic device to perform another operation at the first time, and in this embodiment, the second operation instructing the electronic device to perform another operation is one of the bases for determining the focus area at the first time.
The electronic device described in this embodiment is a further improvement based on the previous embodiment, provides specific implementation hardware for the information processing method described in method embodiment seven, can be used to implement any of the technical solutions of method embodiment seven, and has the advantages of simple structure and high intelligence of the electronic device.
Device embodiment eight:
as shown in fig. 1 and fig. 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area that the operating body can reach in the interaction state at the first time; the first instruction is for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interactive object in the focus area according to the first instruction.
The processing unit 120 is further configured to determine a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
the display interaction unit 110 is further configured to display the first interaction object in the focus area according to the first display parameter.
In this embodiment, the processing unit 120 further determines a first display parameter of the first interactive object according to the size of the focus area and the first instruction; the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
For example, the first display parameter may be an area parameter of the first interactive object: when the default display area of the first interactive object is larger than the focus area, the area of the first interactive object may be reduced appropriately according to the area of the focus area, without affecting its clear display, and the re-determined display area forms the first display parameter. For another example, if the default display position of the first interactive object on the display interaction unit 110 is not in the focus area, a first display parameter that causes the first interactive object to be displayed in the focus area is formed by changing a display position parameter.
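Both examples above (shrinking an oversized object and moving a mispositioned one) can be combined in a single sketch. This is a minimal illustration under the assumption of rectangular objects and uniform scaling; the function and parameter names are the editor's, not the patent's.

```python
# Hypothetical sketch of forming the "first display parameter":
# scale the object down when its default size exceeds the focus area,
# then clamp its position so it lies inside the focus area.
def first_display_parameter(obj_w: float, obj_h: float,
                            obj_x: float, obj_y: float,
                            fx: float, fy: float,
                            fw: float, fh: float) -> tuple:
    """Return (x, y, w, h) placing the object inside the focus area."""
    # Reduce the area appropriately only when the object does not fit.
    scale = min(1.0, fw / obj_w, fh / obj_h)
    w, h = obj_w * scale, obj_h * scale
    # Change the display position parameter if the default position
    # falls outside the focus area.
    x = min(max(obj_x, fx), fx + fw - w)
    y = min(max(obj_y, fy), fy + fh - h)
    return (x, y, w, h)
```

In practice the patent also allows shape and content-layout parameters to change; the sketch covers only the area and position cases named in the examples.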
After the processing unit 120 forms the first display parameter, the display interaction unit 110 may display the first interactive object according to the first display parameter; when the first interactive object is presented to an operating body (such as a user) through the display interaction unit, at least part of the first interactive object is displayed in the focus area.
In summary, this embodiment is a further improvement on the basis of any one of the above device embodiments. It provides concrete hardware support for the information processing method described in method embodiment eight, can be used to implement the technical solution described in method embodiment eight, and offers the same advantages, such as high intelligence of the electronic device and high user satisfaction.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division into units is only a logical functional division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the couplings, direct couplings, or communication connections between the components shown or discussed may be through certain interfaces, and the indirect couplings or communication connections between devices or units may be electrical, mechanical, or take other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; that is, they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing module, or each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be completed by program instructions controlling the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto; any person skilled in the art could easily conceive of changes or substitutions within the technical scope disclosed by the present invention, and all such changes or substitutions shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
Claims (8)
1. An information processing method, applied to an electronic device comprising a display interaction unit, wherein the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object; the method comprises the following steps:
receiving a first operation input by an operation body;
determining a first position of the display interaction unit acted on by the first operation; determining an area within a preset distance from the first position as a focus area, and displaying at least part of a first interactive object in the focus area; wherein the first operation is an operation other than a pre-operation, the pre-operation being an operation for determining a focus area;
displaying a second interactive object associated with the first interactive object at least partially in the focus area;
wherein said displaying the first interactive object at least partially in the focus area comprises:
if the display area of the first interactive object is not larger than the area of the focus area, displaying the first interactive object in the focus area;
under the condition that the display area of the first interactive object is larger than the area of the focus area, displaying a part of the first interactive object needing the operation of the operation body in the focus area;
the displaying at least a portion of a second interactive object associated with the first interactive object in the focus area includes:
determining a display position parameter of the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the second interactive object according to the display position parameter.
2. The method of claim 1, wherein prior to displaying at least a portion of a second interactive object associated with the first interactive object in the focal region, the method further comprises:
receiving a second operation;
and generating the second interactive object associated with the first interactive object in response to the second operation.
3. The method according to claim 1 or 2, wherein the method comprises:
under the condition that the display area of the first interactive object is larger than the area of the focus area, determining a first display parameter of the first interactive object according to the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
4. The method of claim 3,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
5. An electronic device, comprising a display interaction unit and a processing unit, wherein the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the display interaction unit is used for receiving a first operation input by an operation body;
the processing unit is used for determining a first position of the display interaction unit acted on by the first operation, determining an area within a preset distance from the first position as a focus area, and displaying at least part of a first interactive object in the focus area; wherein the first operation is an operation other than a pre-operation, the pre-operation being an operation for determining a focus area;
the processing unit is further configured to display a second interactive object associated with the first interactive object at least partially in the focus area;
wherein said displaying the first interactive object at least partially in the focus area comprises:
if the display area of the first interactive object is not larger than the area of the focus area, displaying the first interactive object in the focus area;
under the condition that the display area of the first interactive object is larger than the area of the focus area, displaying, in the focus area, the part of the first interactive object that needs to be operated by the operation body;
the processing unit is configured to:
determining a display position parameter of the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the second interactive object according to the display position parameter.
6. The electronic device of claim 5,
the display interaction unit is also used for receiving a second operation;
the processing unit is further configured to:
and generating the second interactive object associated with the first interactive object in response to the second operation.
7. The electronic device of claim 5 or 6, wherein the processing unit is configured to:
under the condition that the display area of the first interactive object is larger than the area of the focus area, determining a first display parameter of the first interactive object according to the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
8. The electronic device of claim 7,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910906953.7A CN110703970B (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic equipment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410160970.8A CN103995650A (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic device |
CN201910906953.7A CN110703970B (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic equipment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410160970.8A Division CN103995650A (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110703970A CN110703970A (en) | 2020-01-17 |
CN110703970B true CN110703970B (en) | 2022-07-26 |
Family ID: 51309833
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910906953.7A Active CN110703970B (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic equipment |
CN201410160970.8A Pending CN103995650A (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410160970.8A Pending CN103995650A (en) | 2014-04-21 | 2014-04-21 | Information processing method and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN110703970B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104317490B (en) * | 2014-09-30 | 2018-11-09 | 联想(北京)有限公司 | A kind of reminding method and electronic equipment |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101144423B1 (en) * | 2006-11-16 | 2012-05-10 | LG Electronics Inc. | Mobile phone and display method of the same |
JP5526789B2 (en) * | 2010-01-08 | 2014-06-18 | Sony Corporation | Information processing apparatus and program |
US9959263B2 (en) * | 2010-12-07 | 2018-05-01 | Microsoft Technology Licensing, LLC | User interface form field expansion |
KR20130052751A (en) * | 2011-05-17 | 2013-05-23 | 삼성전자주식회사 | Terminal and method for arranging icon thereof |
CN103257818B (en) * | 2012-02-20 | 2017-11-28 | 联想(北京)有限公司 | The method and apparatus of one-handed performance icons of touch screen |
CN102819395B (en) * | 2012-07-27 | 2016-06-08 | 东莞宇龙通信科技有限公司 | terminal and icon display method |
CN103677611B (en) * | 2012-09-24 | 2017-12-01 | 联想(北京)有限公司 | A kind of information processing method and a kind of electronic equipment |
CN103019508B (en) * | 2012-11-20 | 2016-10-26 | 东莞宇龙通信科技有限公司 | Mobile terminal and icon arrangement display packing |
CN102981715B (en) * | 2012-12-11 | 2016-12-21 | 广州华多网络科技有限公司 | A kind of icon display method, microprocessor and mobile terminal |
CN103605450A (en) * | 2013-11-27 | 2014-02-26 | 广东欧珀移动通信有限公司 | Application icon display method and intelligent terminal |
Also Published As
Publication number | Publication date |
---|---|
CN103995650A (en) | 2014-08-20 |
CN110703970A (en) | 2020-01-17 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||