CN110703970A - Information processing method and electronic equipment


Info

Publication number: CN110703970A
Authority: CN (China)
Prior art keywords: display, interactive object, focus area, interaction, interactive
Legal status: Granted; currently Active
Application number: CN201910906953.7A
Other languages: Chinese (zh)
Other versions: CN110703970B (en)
Inventor: 席振新
Current Assignee: Lenovo Beijing Ltd
Original Assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN201910906953.7A
Publication of CN110703970A
Application granted
Publication of CN110703970B


Abstract

The invention discloses an information processing method and an electronic device, relates to the field of electronic information processing, and is designed to improve user satisfaction. The method is applied to an electronic device comprising a display interaction unit; the display interaction unit is used to display an interactive object and to receive operations of an operation body acting on the interactive object. The method comprises: determining a focus area and forming a first instruction at a first moment, where the focus area is all or part of the area corresponding to the display interaction unit and is the area the operation body can reach in the interaction state at the first moment, and the first instruction is used to cause a first interactive object to be displayed at least partially in the focus area; and displaying at least part of the first interactive object in the focus area according to the first instruction. Because the first interactive object is displayed in the focus area reachable by the operation body, the operation body can act on it conveniently, which improves user satisfaction.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to the field of information processing, and in particular, to an information processing method and an electronic device.
Background
With the development of electronic information technology, the interactive interfaces of some handheld electronic devices (such as large-screen mobile phones, tablet computers, and e-book readers) have grown larger and larger. Information that the user needs to confirm is often displayed in an area the user cannot reach with one hand, or the area that receives the user's instruction lies in such an unreachable area; both situations make operation inconvenient and give the user a poor experience.
Disclosure of Invention
In view of this, embodiments of the present invention aim to provide an information processing method and an electronic device that improve the intelligence of the electronic device, so that an operation body, such as a user, can interact with the electronic device without changing the current interaction state.
To achieve this purpose, the technical solution of the invention is realized as follows:
A first aspect of the invention provides an information processing method applied to an electronic device comprising a display interaction unit; the display interaction unit is used to display an interactive object and to receive an operation of an operation body acting on the interactive object;
the method comprises the following steps:
determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
and displaying at least part of the first interactive object in the focus area according to the first instruction.
Preferably, the determining the focus area comprises:
before a first moment, the display interaction unit receives a first operation input by an operation body;
determining a first position where the first operation acts on the display interaction unit;
and determining the area within a preset distance from the first position as a focus area.
Preferably,
the electronic device includes a sensing unit;
the determining the focus area includes:
the sensing unit detects that an operation body is in a reachable area corresponding to the display interaction unit in the first time interaction state;
determining the focus area in dependence on the reachable area.
Preferably,
the determining the focus area includes:
counting the positions of the historical operations of the operation body acting on the display interaction unit within the specified time to form a statistical result;
and determining the focus area according to the statistical result.
Preferably, after the displaying at least part of the first interactive object in the focus area according to the first instruction, the method further comprises:
displaying a second interactive object associated with the first interactive object at least partially in the focus area.
Preferably,
displaying the first interactive object and the second interactive object at least partially in the focus area comprises:
determining a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the first interactive object and the second interactive object according to the relative position parameter.
Preferably, before displaying the first interactive object and/or the second interactive object in the focus area, the method further comprises:
receiving a second operation acting on a third interactive object;
responding to the second operation, and generating the first interactive object and/or the second interactive object;
the determining, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit includes:
determining a relative display position parameter of the first interactive object and the second interactive object according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object and the position of the focus area;
and adjusting and displaying the relative display position relation of the first interactive object and the second interactive object on the display interactive unit according to the relative display position parameter.
Preferably, the displaying at least part of the first interactive object in the focus area according to the first instruction comprises:
determining a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
Preferably,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
A second aspect of the present invention provides an electronic device, comprising a display interaction unit; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit;
the processing unit is used for determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
Further,
the display interaction unit is further used for receiving a first operation input by an operation body before a first moment;
the processing unit is specifically configured to determine a first position where the first operation acts on the display interaction unit; and determining the area within a preset distance from the first position as a focus area.
Further,
the electronic device includes a sensing unit;
the sensing unit detects that an operation body is in a reachable area corresponding to the display interaction unit in the first time interaction state;
the processing unit is specifically configured to determine the focus area according to the reachable area.
Further,
the processing unit is specifically used for counting the positions of the historical operations of the operation body acting on the display interaction unit within the specified time to form a statistical result; and determining the focus area according to the statistical result.
Further,
the display interaction unit is further configured to display, after the at least part of the first interaction object is displayed in the focus area according to the first instruction, a second interaction object associated with the first interaction object at least partially in the focus area.
Further,
the processing unit is further configured to determine, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit;
the display interaction unit is specifically configured to display the first interaction object and the second interaction object according to the relative position parameter.
Further,
the display interaction unit is further used for receiving a second operation acting on a third interaction object before the first interaction object and/or the second interaction object are/is displayed in the focus area; responding to the second operation, and generating the first interactive object and/or the second interactive object;
the processing unit is specifically configured to determine, according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object, and the position of the focus area, a relative display position parameter of the first interactive object and the second interactive object;
and the display interaction unit is further used for adjusting and displaying the relative display position relationship of the first interaction object and the second interaction object on the display interaction unit according to the relative display position parameter.
Further,
the processing unit is further configured to determine a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
the display interaction unit is further configured to display the first interaction object in the focus area according to the first display parameter.
Further,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
According to the information processing method and the electronic device, before the first interactive object is displayed, a focus area for displaying the first interactive object is determined, and at least part of the first interactive object is displayed in the focus area; the focus area is an area on the display interaction unit which can be touched by an operation body (such as a user), so that the operation body can conveniently touch or approach the first interaction object under the condition that the current (such as the first moment) interaction state is not changed, and input of a control instruction to the electronic equipment is completed.
Drawings
FIG. 1 is a schematic diagram illustrating a display effect of an electronic device according to an embodiment of the invention;
FIG. 2 is a flowchart illustrating an information processing method according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of determining a focus area according to an embodiment of the present invention;
FIG. 4 is a second flowchart illustrating an information processing method according to an embodiment of the invention;
FIG. 5 is a diagram illustrating an effect displayed by the information processing method according to an embodiment of the invention;
FIG. 6 is a second schematic diagram illustrating an effect displayed by the information processing method according to an embodiment of the invention;
FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is further described in detail with reference to the drawings and the specific embodiments of the specification.
Method embodiment one:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area; in fig. 1, the focus area is an area 111 that is accessible or close to the display interaction unit 110 when the user holds the electronic device from one side;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
The display interaction unit generally includes a display screen that can also receive the user's touch operations or proximity control operations. The focus area in step S110 may be the whole display screen or only part of it. The first-moment interaction state is the state in which the operation body (such as a user) interacts with the electronic device at the first moment while its operation posture, and the placement posture of the electronic device, remain unchanged; a typical case is the operation body (such as a user) holding the electronic device with one hand. In the user's current interaction state with the electronic device, part of the display interaction unit may be unreachable for the operation body.
The display interaction unit of such an electronic device is relatively large. When the user is in the first-moment interaction state, that is, holding the device with one hand, part of the display interaction unit is unreachable unless the user changes the holding orientation or makes a large change to the posture of the handheld device. In a concrete scenario, the electronic device is a large-screen mobile phone or a small tablet computer, and at the first moment the user is on a bus; with one hand gripping the armrest, the user can only operate the device with the other hand, and this interaction state between the operation body (the user) and the electronic device is the first-moment interaction state.
For another example, the user is a child with small hands and limited strength, so holding the device with one hand risks dropping it; the child therefore holds the device with both hands during operation, and the current interaction state is holding the electronic device with both hands. When the device is held with both hands, the area of the display interaction unit that the user can reach or approach is limited, for example to the range over which the thumbs of both hands can move across the display interaction unit. Here the current moment is the first moment, and the current interaction state is the first-moment interaction state.
In step S110, an instruction is formed after the focus area is determined; the instruction is used for adjusting the display position of a first interactive object to be interacted with an operation body, so that at least part of the first interactive object is displayed in the focus area.
The first interactive object being at least partially displayed in the focus area comprises the following two situations:
the first method comprises the following steps: the first interactive objects are all displayed in the focus area; this situation generally applies to: when the display area of the first interactive object is not larger than the area of the focus area;
and the second method comprises the following steps: the first interactive object part is displayed in the focus area; typically this situation may include that the display interaction unit has to display a part of the first interaction object outside the focus area, because the display area of the first interaction object is larger than the area of the focus area; it may further comprise that the display area of the first interactive object is not larger than the area of the focus area; in any case, it is mainly ensured that the interaction items such as the control and the selection option that need to be operated by the user in the first interaction object are preferentially displayed in the focus area.
In particular, when the first interactive object displays a lot of content and the user must input a confirmation before subsequent operations are executed, the 'confirm' control that the user needs to click or approach is displayed in the user's reachable focus area, while the rest of the first interactive object's content may be displayed outside the focus area.
In step S120, the first instruction may be an operation instruction carrying a display position parameter of the first interactive object; the instruction can be constructed in many ways, which are not elaborated here.
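Purely as an illustration of one way such an instruction could be represented (the record and function names below are hypothetical and not taken from the patent), the first instruction can be a small record carrying the display position parameter, and step S120 simply draws the first interactive object at that position:

from dataclasses import dataclass

@dataclass
class Rect:
    x: float        # left edge, screen coordinates
    y: float        # top edge
    width: float
    height: float

@dataclass
class FirstInstruction:
    # Display position parameter carried by the instruction: where the first
    # interactive object should be drawn so that at least the part the user
    # must operate lies inside the focus area.
    target: Rect

def build_first_instruction(focus_area: Rect, obj_w: float, obj_h: float) -> FirstInstruction:
    # Anchor the object to the bottom of the focus area; if the object is
    # taller than the focus area, its lower part (assumed to hold the controls
    # the user must operate) still falls inside.
    x = focus_area.x
    y = focus_area.y + focus_area.height - obj_h
    return FirstInstruction(Rect(x, y, obj_w, obj_h))

def apply_first_instruction(instruction: FirstInstruction, draw) -> None:
    # Step S120: display at least part of the first interactive object in the
    # focus area according to the first instruction.
    draw(instruction.target)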
In summary, the information processing method according to this embodiment determines the focus area of the display interaction unit according to the interaction state at the first time, and displays the first interaction object to be interacted with the operation body in the focus area as much as possible, thereby facilitating the operation of the operation body; therefore, the operation body can easily and simply complete the interaction with the electronic equipment under the condition that the operation posture of the operation body or the placing posture of the electronic equipment is not changed, and the intelligence of the equipment and the use satisfaction of a user are improved.
Method embodiment two:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
As shown in fig. 3, the determining the focus area in step S110 includes the following steps:
step S111: before a first moment, the display interaction unit receives a first operation input by an operation body;
step S112: determining a first position where the first operation acts on the display interaction unit;
step S113: determining an area within a preset distance from the first position as a focus area;
Usually the interaction state at the first moment and the interaction state shortly before it are continuous: the same interaction state is maintained or changes only slightly, so the region of the display interaction unit reachable by the operation body stays the same or nearly the same. This embodiment therefore determines the focus area from a first operation performed before the first moment. 'Before the first moment' in step S111 is preferably the moment of the latest operation input by the operation body; in other words, the first operation is preferably the last operation of the operation body (e.g., the user) before the first moment.
The first operation can be a click, a slide, or a press, and the first position may be an interaction point or an interaction area. The interaction between the operation body and the electronic device includes touch interaction and proximity interaction: in touch interaction the operation body inputs an operation by touching the display interaction unit, while in proximity interaction the operation body completes the input with a gesture within a preset range above the display interaction unit. Accordingly, the reachable area includes both the area of the display interaction unit the operation body can touch in the first-moment interaction state and the corresponding area it can approach closely enough to complete an operation input.
The first operation may be a preceding operation dedicated to determining the focus area, or it may be an operation the operation body performs for some other purpose before the first moment. For example, the user inputs an operation to wake up the electronic device before the first moment; the operation body contacts or approaches the display interaction unit to do so, and that wake-up operation can serve as the first operation of this embodiment for determining the focus area at the first moment.
In step S113, the preset distance may be a pre-stored value. It can be determined from historical data on the area the same operation body can reach around a given interaction position in the interaction state corresponding to the first moment, from statistics of the area reachable around a given interaction position for different operation bodies in that interaction state, or by simulation or estimation that combines the user's own interaction-state data with statistics over many users. There are various ways to obtain the preset distance, which are not detailed here.
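A minimal sketch of steps S112 and S113 follows, under simple assumptions: touch coordinates in pixels, a screen-aligned rectangular focus area, and an illustrative pre-stored distance. The names and the concrete value are not prescribed by the embodiment.

PRESET_DISTANCE = 300.0   # assumed pre-stored reach distance, in pixels

def focus_area_from_first_operation(first_position, screen_w, screen_h,
                                    reach=PRESET_DISTANCE):
    # Step S112: first_position is where the first operation acted on the
    # display interaction unit. Step S113: keep everything within the preset
    # distance of that point, clipped to the screen.
    px, py = first_position
    left = max(0.0, px - reach)
    top = max(0.0, py - reach)
    right = min(float(screen_w), px + reach)
    bottom = min(float(screen_h), py + reach)
    return (left, top, right, bottom)

# Example: last touch near the lower-right corner of a 1080 x 2340 screen
# focus_area_from_first_operation((900, 2100), 1080, 2340)
# -> (600.0, 1800.0, 1080.0, 2340.0)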
Compared with the information processing method described in the first embodiment, the information processing method described in this embodiment provides a specific implementation method for determining the focus area through the first operation before the first time, and has the advantage of simple implementation.
Method embodiment three:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein the electronic device includes a sensing unit, and the determining the focus area in step S110 may include:
the sensing unit detects that an operation body is in a reachable area corresponding to the display interaction unit in the first time interaction state;
determining the focus area in dependence on the reachable area.
The sensing unit can determine the reachable area from changes in sensing signals, such as electric or magnetic signals, that occur when the operation body moves above the display interaction unit. Specifically, when the user's finger moves over the display interaction unit, or dwells over it or over some area above it, the finger crosses or interrupts magnetic field lines; the sensing unit then determines the reachable area from the magnetic flux or its change.
For another example, when the user's finger moves on the display interaction unit, or above it within a preset distance, its movement or its dwelling over different regions produces different luminous fluxes in different regions of the display interaction unit, so the reachable area is determined from these differences in luminous flux.
In particular, there are many ways to detect the operating body in the accessible region corresponding to the display interaction unit, and they are not listed here.
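One possible way, among the many left open above, to turn per-region sensor readings (such as changes in luminous or magnetic flux) into a reachable area is sketched below; the grid granularity and threshold are assumptions for illustration only.

def reachable_area_from_flux(flux_change, threshold=0.2):
    # flux_change: 2D grid of normalized flux change per screen region; larger
    # values mean the operation body moved over or hovered above that region.
    cells = {(r, c)
             for r, row in enumerate(flux_change)
             for c, value in enumerate(row)
             if value >= threshold}
    if not cells:
        return None
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    # Reachable area as the bounding cell rectangle; the focus area can then
    # be taken as (part of) this rectangle.
    return (min(rows), min(cols), max(rows), max(cols))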
On the basis of the first method embodiment, the focal region is determined through the sensing unit, and the method has the advantages of high equipment intelligence and high user satisfaction degree, and is simple and convenient to implement.
Method embodiment four:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein the determining the focus area in step S110 may include:
counting the positions of the historical operations of the operation body acting on the display interaction unit within the specified time to form a statistical result;
and determining the focus area according to the statistical result.
Before the first moment, the electronic device may have received many operations input by the operation body; in this embodiment the specified time preferably covers several moments before the first moment. The positions at which the operation body acted on the display interaction unit during those operations are counted, which reveals which areas of the display interaction unit were involved in interaction within the specified time and which areas received operation input most often. The focus area can then be determined according to a pre-stored selection strategy.
For example, the focus area may be taken as all of the areas that received input operations from the operation body within the specified time, or as the area in which input operations were received most frequently.
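A rough sketch of this statistical approach, assuming a log of (timestamp, x, y) touch positions and a grid of screen cells; the window length and the "most frequent cell plus a margin of neighbouring cells" rule are just one possible pre-stored selection strategy.

from collections import Counter

def focus_area_from_history(events, now, window=3600.0, cell=200, margin=1):
    # events: iterable of (timestamp, x, y) positions at which the operation
    # body acted on the display interaction unit.
    recent = [(x, y) for t, x, y in events if now - t <= window]
    if not recent:
        return None
    counts = Counter((int(x) // cell, int(y) // cell) for x, y in recent)
    (cx, cy), _ = counts.most_common(1)[0]     # cell that received most input
    # Focus area: that cell expanded by `margin` neighbouring cells, in pixels.
    return ((cx - margin) * cell, (cy - margin) * cell,
            (cx + margin + 1) * cell, (cy + margin + 1) * cell)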
In addition, an operation body (such as a user) usually has a habitual interaction state, for example holding the electronic device with one hand while interacting with it; the first moment is likely to find the user in this habitual state, so the interaction state at the first moment, and hence the focus area in that state, can readily be inferred from the history.
The information processing method described in this embodiment is further improved on the basis of the first method embodiment, and a method for determining a focus area based on historical operation statistics is provided according to the continuity of the interaction state of the operation body over time and the usual interaction state of the operation body and the electronic device, so that the intelligence and user satisfaction of the electronic device are improved, and the method has the advantage of simplicity and convenience in implementation.
Method embodiment five:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, as shown in fig. 4, the method further comprises:
step S130: displaying a second interactive object associated with the first interactive object at least partially in the focus area.
The present embodiment is further improved on the basis of any of the above method embodiments, and at least a part of the second interactive object associated with the first interactive object is also displayed in the focus area by the addition of step S130, so as to facilitate the operation of the user.
For example, at the first moment the electronic device displays in the focus area a first interactive object to be operated by the user, such as a video player; once the operation body has opened the video player through interaction, the user is very likely to next select a video file already stored in the electronic device to play. For convenience of operation, the folders in which the video files are stored, or the video files themselves, can therefore also be displayed in the focus area.
The video player is the first interactive object; each video file or the folder in which each video file is stored is the second interactive object associated with the first interactive object.
In a specific implementation process, the association between the first interactive object and the second interactive object may be established in advance, or may be determined by the electronic device according to content carried by each interactive object of the electronic device and the operational association.
For another example, the electronic device displays the first interactive object, song A, in the focus area at the first moment; the information processing method of this embodiment also brings up the associated song B and displays it in the focus area, so that the user can conveniently select and operate it.
On the basis of any one of the above method embodiments, the information processing method according to this embodiment not only displays at least part of the first interactive object in the focus area, but also displays at least part of the second interactive object associated with the first interactive object in the focus area, thereby again facilitating the operation of the user and improving the intelligence of the electronic device and the use satisfaction of the user.
Method embodiment six:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, as shown in fig. 4, the method further comprises:
step S130: displaying a second interactive object associated with the first interactive object at least partially in the focus area.
Specifically, the step S130 may include:
determining a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the first interactive object and the second interactive object according to the relative position parameter.
In this embodiment, the relative position parameter includes the distance between the first interactive object and the second interactive object, as well as positional relations such as the order in which the first and second interactive objects are arranged.
The content attributes of the first and second interactive objects determine how likely each one is to be operated by the user and which operation the electronic device performs in response; the position of the focus area at least partially determines where the first and second interactive objects are displayed; and the size of the focus area determines the layout and display shape of the first and second interactive objects.
In FIG. 5, the content attribute of the 'confirm' interactive object is confirming the deletion of file A, and the content attribute of the 'cancel' interactive object is cancelling the deletion of file A; the two interactive objects operate on the same file A and are therefore associated in operation. Suppose 'confirm' corresponds to the first interactive object and 'cancel' corresponds to the second interactive object.
As can be seen in the left diagram of FIG. 5, the second interactive object 'cancel' is not displayed in the focus area 111 of the display interaction unit 110; the distance between the first interactive object and the second interactive object is d1, and the first interactive object is at a distance d3 from the edge of the display interaction unit.
With the method of this embodiment, the second interactive object 'cancel' must first be displayed at least partially in the focus area 111; to achieve this, the information processing method of this embodiment arranges the first and second interactive objects as shown in the right diagram of FIG. 5.
In that arrangement, the distance between the first interactive object and the second interactive object changes from d1 to d2, and the distance between the first interactive object and the edge of the display interaction unit changes from d3 to d4.
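The adjustment in FIG. 5 can be sketched as follows, treating the two objects as side-by-side rectangles inside a rectangular focus area; the spacing and edge offsets stand in for d2 and d4, and the function and parameter names are illustrative rather than taken from the patent.

def layout_pair_in_focus(focus_left, focus_right, w_confirm, w_cancel,
                         gap=12, edge=12):
    # Place 'cancel' and 'confirm' side by side inside [focus_left, focus_right],
    # with a reduced spacing `gap` (playing the role of d2) between them and a
    # reduced offset `edge` (playing the role of d4) from the right border.
    if w_confirm + w_cancel + gap + edge > focus_right - focus_left:
        return None   # both cannot fit; resize instead (see method embodiment eight)
    x_confirm = focus_right - edge - w_confirm
    x_cancel = x_confirm - gap - w_cancel
    return x_cancel, x_confirm   # left x coordinates of 'cancel' and 'confirm'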
In summary, the present embodiment further provides how to display the first interactive object and the second interactive object having the association in the focus area on the basis of the previous embodiment, so as to facilitate the user operation, and have the advantage of simple implementation.
Method embodiment seven:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, as shown in fig. 4, the method further comprises:
step S130: displaying a second interactive object associated with the first interactive object at least partially in the focus area.
Specifically, the step S130 may include:
determining a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the first interactive object and the second interactive object according to the relative position parameter.
Further, before displaying the first interactive object and/or the second interactive object in the focus area, the method further comprises:
receiving a second operation acting on a third interactive object;
responding to the second operation, and generating the first interactive object and/or the second interactive object;
the determining, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit includes:
determining a relative display position parameter of the first interactive object and the second interactive object according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object and the position of the focus area;
and adjusting and displaying the relative display position relation of the first interactive object and the second interactive object on the display interactive unit according to the relative display position parameter.
Specifically, the second operation is deleting file A displayed on the display interaction unit; the interactive object corresponding to file A is the third interactive object; after the electronic device responds to the second operation, the first interactive object 'confirm' and the second interactive object 'cancel' are generated.
As shown in the left diagram of FIG. 6, the first interactive object 'confirm' is only partially displayed in the focus area 111 of the display interaction unit 110, while the second interactive object 'cancel' is displayed entirely within the focus area 111. Since the second operation deletes the third interactive object, file A, the first interactive object is obviously more likely to be touched by the user than the second interactive object; yet with the first interactive object only partially inside the focus area, the user clearly needs more effort to touch it than to touch the second interactive object.
To solve this problem, this embodiment determines the display parameters of the first and second interactive objects from the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object, and the position of the focus area; in this way the display of each interactive object takes into account not only the content attributes of the first and second interactive objects but also the third interactive object and the second operation.
After the information processing method in this embodiment is executed, the first interactive object and the second interactive object are adjusted from the display effect in the left image in fig. 6 to the display effect in the right image in fig. 6.
The first interactive object is moved from the left side to the right side of the second interactive object and now lies entirely within the focus area. Meanwhile, the distance between the first and second interactive objects changes from D1 to D2; the distance between the second interactive object and the edge of the display interaction unit 110 is D3 in the left diagram of FIG. 6, while the distance between the first interactive object and the edge of the display interaction unit 110 is D4 in the right diagram of FIG. 6, where D4 is smaller than D3 and D2 is smaller than D1. By changing the relative display position parameters in this way, the first interactive object, which the user is more likely to touch, lies entirely within the focus area and is easier to reach; at the same time most of the second interactive object also lies within the focus area, and it remains clearly easier to touch than the first interactive object was in the left diagram of FIG. 6.
For another example, the user has reached the Nth page of a hierarchically structured website, where N is greater than 2, and is navigating between pages. A page-selection operation corresponds to two interactive objects, 'next page' and 'previous page'. If the user has operated 'next page' several times in a row, and at the first moment the focus area of the display interaction unit can display only one of the two interactive objects, then based on the previous operations the display positions of the 'next page' and 'previous page' interactive objects are preferably exchanged, so that 'next page' is displayed in the focus area and the user's operation is easier.
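The 'next page' / 'previous page' example can be sketched like this: when the focus area has room for only one of the two objects, the one the user has operated most in the recent history takes that slot and the other is shown outside the focus area. Function and parameter names are illustrative assumptions.

def choose_focus_slot(candidates, recent_operations):
    # candidates: labels competing for the single slot in the focus area,
    #             e.g. ["next page", "previous page"].
    # recent_operations: labels of the user's recent operations, oldest first.
    scores = {c: 0 for c in candidates}
    for i, op in enumerate(recent_operations):
        if op in scores:
            scores[op] += i + 1              # more recent operations weigh more
    in_focus = max(candidates, key=lambda c: scores[c])
    outside = [c for c in candidates if c != in_focus]
    return in_focus, outside

# choose_focus_slot(["next page", "previous page"],
#                   ["next page", "next page", "next page"])
# -> ("next page", ["previous page"])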
In summary, this embodiment provides a more preferable method for the relative display position parameters of the first interactive object and the second interactive object based on the above embodiment, and adjusts the relative position relationship between the first interactive object and the second interactive object according to the relative display position parameters, so as to improve the intelligence of the device again and facilitate the user operation.
Method embodiment eight:
as shown in fig. 1, the present embodiment provides an information processing method applied in an electronic device including a display interaction unit 110; the display interaction unit 110 is configured to display an interaction object and receive an operation of an operation body acting on the interaction object;
as shown in fig. 2, the method includes:
step S110: determining a focus area and forming a first instruction at a first moment; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be reached by the operation body in the first moment interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
step S120: and displaying at least part of the first interactive object in the focus area according to the first instruction.
Wherein, the step S120 may specifically include:
determining a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
In a specific implementation, the area of the first interactive object may be larger than the area of the focus region; in this embodiment, in order to enable the first interactive object to be completely displayed in the focus area, after receiving the first instruction, the first display parameter is further determined according to the first instruction and the size of the focus area; the first display parameter at least comprises one of an area parameter, a shape parameter, an interactive information content layout parameter and a display position parameter of the first interactive object; when the first interactive object is displayed according to the first display parameter, the first interactive object can be completely displayed in the focus area, so that the operation of a user is facilitated.
Displaying the first interactive object according to the first display parameter may change its display area, display shape, and display position. For example, when the focus area is circular, the first interactive object can be changed from its original rectangular shape to a circular one, so that it keeps enough display area while using as much of the focus area as possible.
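A sketch of this embodiment's idea: derive a first display parameter (here a scale factor, an optional shape switch, and the resulting size) from the size of the focus area so that the whole first interactive object fits inside it. The circular-shape rule and the dictionary keys are assumptions for illustration only.

def first_display_parameter(obj_w, obj_h, focus_w, focus_h, focus_is_circular=False):
    # Scale the first interactive object down just enough to fit the focus area.
    scale = min(1.0, focus_w / obj_w, focus_h / obj_h)
    if focus_is_circular:
        # Reshape to a circle so the object can use as much of a circular
        # focus area as possible, as in the example above.
        return {"shape": "circle",
                "radius": min(focus_w, focus_h) / 2.0,
                "scale": scale}
    return {"shape": "rectangle",
            "width": obj_w * scale,
            "height": obj_h * scale,
            "scale": scale}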
In summary, the information processing method according to this embodiment is improved based on the previous method embodiment, so that the intelligence of the device is improved and the satisfaction of the user is improved.
Device embodiment one:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
The specific structure of the display interaction unit 110 may include a screen; the screen can be a liquid crystal screen or an OLED screen; the screen is used for displaying information required to be displayed by the electronic equipment to an operation body (such as a user), and meanwhile, various input operations of the operation body are received so as to control the operation of the electronic equipment.
The processing unit 120 may be a processor included in the electronic device; the processor is an electronic component with a processing function, such as a central processing unit, a microprocessor, a digital signal processor, a programmable logic circuit and the like.
The electronic device can be an electronic device such as a smart phone, a tablet computer and an electronic book, and is preferably a mobile electronic device.
The electronic device described in this embodiment provides hardware support for the information processing method described in the first embodiment of the method, and can be used to implement any of the technical solutions described in the first embodiment of the method.
Device embodiment two:
As shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
The display interaction unit 110 is further configured to receive, before a first time, a first operation input by an operation body;
the processing unit 120 is specifically configured to determine a first position where the first operation acts on the display interaction unit; and determining the area within a preset distance from the first position as a focus area.
The processing unit 120 may include a receiving interface and a calculator;
the receiving interface is configured to receive the first position, determined by the display interaction unit 110, at which the first operation acted;
and the calculator is used for calculating the focus area from the first position received by the receiving interface and the preset distance.
The electronic device described in this embodiment provides specific hardware support for the information processing method described in method embodiment two, can implement any technical scheme of the information processing method described in method embodiment two, and has the same advantages of high intelligence and high user satisfaction.
Device embodiment three:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
Wherein the electronic device comprises a sensing unit;
the sensing unit detects that an operation body is in a reachable area corresponding to the display interaction unit in the first time interaction state;
the processing unit 120 is specifically configured to determine the focus area according to the reachable area.
The specific structure of the sensing unit varies with the manner in which the accessible region is acquired; specifically, the structure may include a light flux sensor, a magnetic flux sensor, and the like.
The electronic device described in this embodiment provides a specific hardware support for the information processing method described in method embodiment three, can implement any technical scheme of the information processing method described in method embodiment three, and has the same advantages of high intelligence and high user satisfaction.
Device embodiment four:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
The display interaction unit 110 is further configured to receive, before a first time, a first operation input by an operation body;
the processing unit 120 is specifically configured to count positions where historical operations of the operation body act on the display interaction unit within a specified time, and form a statistical result; and determining the focus area according to the statistical result.
The processing unit 120 may include a storage medium in addition to a processor; the storage medium records locations where respective historical operations act on the display interaction unit. The processor is connected with the storage medium through a bus; the processing unit 120 may also include an external communication interface through which the processor may interact with peripherals.
The electronic device described in this embodiment provides specific hardware support for the information processing method described in method embodiment four, can be used to implement the technical solution described in any of method embodiment four, and has the advantages of high intelligence and high user satisfaction, as well as the advantages of simple structure and easy implementation.
Device embodiment five:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is further configured to, after the displaying of at least part of the first interaction object in the focus area according to the first instruction, display at least part of a second interaction object associated with the first interaction object in the focus area.
The display interaction unit 110 may include, in addition to a display screen, a display processor connected to the display screen; the display processor may be configured to control the display interaction unit 110 to display, according to the first instruction, the first interactive object in the focus area and also the second interactive object in the focus area. The display processor may be a processor separate from the processing unit 120, a logic device within the processing unit 120, or the processing unit 120 itself when that unit has sufficient processing capability.
The display interaction unit 110 can take various structures, which are not enumerated here.
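The following fragment sketches one way a display processor could keep the associated second interactive object inside the same focus area as the first; the box geometry, the below-then-above placement rule and the gap value are assumptions made for illustration and are not prescribed by this embodiment.

    from dataclasses import dataclass

    @dataclass
    class Box:
        x: float
        y: float
        w: float
        h: float

    def place_associated(first: Box, second_size, focus: Box, gap=16.0):
        # Try to place the second interactive object directly below the first;
        # if that would leave the focus area, place it above the first instead.
        w, h = second_size
        below = Box(first.x, first.y + first.h + gap, w, h)
        if below.y + below.h <= focus.y + focus.h:
            return below
        return Box(first.x, first.y - gap - h, w, h)

    if __name__ == "__main__":
        focus = Box(0, 1400, 900, 900)
        first = Box(100, 1500, 600, 300)                    # e.g. a confirmation dialog
        print(place_associated(first, (600, 120), focus))   # e.g. its associated button row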
This embodiment is a further refinement of any one of the above device embodiments; it can implement any technical solution of the information processing method described in method embodiment five, and has the advantages of high intelligence and high user satisfaction.
Device embodiment six:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is further configured to, after the displaying of at least part of the first interaction object in the focus area according to the first instruction, display at least part of a second interaction object associated with the first interaction object in the focus area.
The processing unit 120 is further configured to determine, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit;
the display interaction unit 110 is specifically configured to display the first interaction object and the second interaction object according to the relative position parameter.
The content attribute includes the display content itself, the meaning of the content itself, and display parameters such as the content display mode and the display size.
The processing unit 120 determines a relative display position parameter of the first interactive object and the second interactive object according to the first interactive object, the second interactive object and the focus area; the display interaction unit 110 displays the first interactive object and the second interactive object according to the relative position parameter. Because the display parameters differ between the earlier and later moments, the relative positional relationship of the first interactive object and the second interactive object on the display interaction unit appears to be adjusted.
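The following fragment sketches how a relative display position parameter could be derived from the two content attributes and the focus area; the attribute fields (preferred width and height) and the side-by-side versus stacked rule are illustrative assumptions, not the only layout policy this embodiment allows.

    def relative_position(first_attr, second_attr, focus_w, focus_h, gap=16):
        # Each content attribute is assumed to carry a preferred display size.
        fw, fh = first_attr["width"], first_attr["height"]
        sw, sh = second_attr["width"], second_attr["height"]
        # If both objects fit side by side within the focus area, lay them out
        # horizontally; otherwise stack them vertically.
        if fw + gap + sw <= focus_w and max(fh, sh) <= focus_h:
            return {"layout": "side_by_side", "second_offset": (fw + gap, 0)}
        return {"layout": "stacked", "second_offset": (0, fh + gap)}

    if __name__ == "__main__":
        first = {"width": 500, "height": 200}   # e.g. a prompt
        second = {"width": 300, "height": 200}  # e.g. its action buttons
        print(relative_position(first, second, focus_w=900, focus_h=900))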
This embodiment is a further refinement of any one of the above device embodiments; it can implement any technical solution of the information processing method described in method embodiment six, and has the advantages of high intelligence and high user satisfaction.
Device embodiment seven:
As shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is further configured to, after the displaying of at least part of the first interaction object in the focus area according to the first instruction, display at least part of a second interaction object associated with the first interaction object in the focus area.
The processing unit 120 is further configured to determine, according to the content attribute of the first interactive object, the content attribute of the second interactive object, and the focus area, a relative display position parameter of the first interactive object and the second interactive object on the display interactive unit;
the display interaction unit 110 is specifically configured to display the first interaction object and the second interaction object according to the relative position parameter.
The content attribute includes the display content itself, the meaning of the content itself, and display parameters such as the content display mode and the display size.
The display interaction unit 110 is further configured to receive a second operation acting on a third interactive object before the first interactive object and/or the second interactive object are displayed in the focus area, and to generate the first interactive object and/or the second interactive object in response to the second operation;
the processing unit 120 is specifically configured to determine a relative display position parameter of the first interactive object and the second interactive object according to the third interactive object, the second operation, the content attribute of the first interactive object, the content attribute of the second interactive object, and the position of the focus area;
the display interaction unit 110 is further configured to adjust and display a relative display position relationship between the first interaction object and the second interaction object on the display interaction unit according to the relative display position parameter.
In this embodiment, the display interaction unit 110 also receives a second operation before the first time. The second operation may be an input made specifically to determine the focus position at the first time, or it may be an operation by which the operation body instructs the electronic device to perform some other function; in the latter case, that second operation serves as one of the bases for determining the focus area at the first time.
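A minimal sketch, assuming the second operation is a touch whose coordinates on the third interactive object are known; anchoring the generated object near the touch point and clamping it into the focus area is only one possible policy consistent with this embodiment.

    def clamp(v, lo, hi):
        return max(lo, min(v, hi))

    def anchor_near_operation(tap_x, tap_y, obj_w, obj_h, focus):
        # focus: (left, top, width, height). Centre the generated interactive
        # object on the point where the second operation touched the third
        # object, then clamp it so it stays entirely within the focus area.
        left, top, fw, fh = focus
        x = clamp(tap_x - obj_w / 2, left, left + fw - obj_w)
        y = clamp(tap_y - obj_h / 2, top, top + fh - obj_h)
        return (x, y)

    if __name__ == "__main__":
        focus = (0, 1400, 900, 900)
        # A tap near the right edge still yields a position fully inside the focus area.
        print(anchor_near_operation(880, 1450, 400, 200, focus))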
The electronic device described in this embodiment is a further improvement based on the previous embodiment, provides specific implementation hardware for the information processing method described in method embodiment seven, can be used to implement any of the technical solutions described in method embodiment seven, and has the advantages of simple structure and high intelligence of the electronic device.
Device embodiment eight:
as shown in fig. 1 and 7, an electronic device of the present embodiment includes a display interaction unit 110; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the electronic device further comprises a processing unit 120;
the processing unit 120 is configured to determine a focus area and form a first instruction at a first time; the focus area is a whole or partial area corresponding to the display interaction unit and is an area which can be touched by the operating body in the current interaction state; the first instructions are for causing a first interactive object to be displayed at least partially in the focus area;
the display interaction unit 110 is specifically configured to display at least part of the first interaction object in the focus area according to the first instruction.
The processing unit 120 is further configured to determine a first display parameter of the first interactive object according to the first instruction and the size of the focus area;
the display interaction unit 110 is further configured to display the first interaction object in the focus area according to the first display parameter.
In this embodiment, the processing unit 120 further determines a first display parameter of the first interactive object according to the size of the focus area and the first instruction; the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter and a display position parameter of the first interactive object.
For example, the first display parameter may be an area parameter of the first interactive object: when the default display area of the first interactive object is larger than the focus area, the area of the first interactive object may be reduced appropriately according to the area of the focus area, without affecting its clear display, and the re-determined display area forms the first display parameter. For another example, if the default display position of the first interactive object on the display interaction unit 110 is not in the focus area, the first display parameter is formed by changing the display position so that the first interactive object is displayed in the focus area.
After the processing unit 120 forms the first display parameter, the display interaction unit 110 may display the first interactive object according to the first display parameter; when the first interactive object is presented to an operation body (such as a user) through the display interaction unit, at least part of the first interactive object is displayed in the focus area.
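A minimal sketch of forming the first display parameter from the focus area, covering the two examples above (shrinking an over-sized object and moving an object whose default position lies outside the focus area); the rectangle representation and the legibility floor are assumptions made for illustration.

    def first_display_parameter(default, focus, min_scale=0.6):
        # default, focus: dicts with x, y, w, h. Returns a display parameter
        # holding the adjusted size and position of the first interactive object.
        scale = min(1.0, focus["w"] / default["w"], focus["h"] / default["h"])
        scale = max(scale, min_scale)          # never shrink below a legible size
        w, h = default["w"] * scale, default["h"] * scale
        # Re-position so the (possibly scaled) object lies inside the focus area.
        x = min(max(default["x"], focus["x"]), focus["x"] + focus["w"] - w)
        y = min(max(default["y"], focus["y"]), focus["y"] + focus["h"] - h)
        return {"x": x, "y": y, "w": w, "h": h, "scale": scale}

    if __name__ == "__main__":
        default = {"x": 100, "y": 200, "w": 800, "h": 500}   # default dialog geometry
        focus = {"x": 0, "y": 1400, "w": 900, "h": 900}      # area reachable by the thumb
        print(first_display_parameter(default, focus))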
In summary, this embodiment is a further improvement on any one of the above device embodiments, provides specific hardware support for the information processing method described in method embodiment eight, can be used to implement the technical solution described in method embodiment eight, and likewise has the advantages of high intelligence and high user satisfaction.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. An information processing method is applied to an electronic device comprising a display interaction unit; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object; the method comprises the following steps:
receiving a first operation input by an operation body;
based on the first operation, determining a focus area and displaying a first interactive object in the focus area;
receiving a second operation acting on the first interactive object;
displaying, based on the second operation, a second interactive object associated with the first interactive object at least partially in the focus area.
2. The method of claim 1, the determining a focus area based on the first operation, comprising:
determining a first position where the first operation acts on the display interaction unit;
and determining the area within a preset distance from the first position as a focus area.
3. The method of claim 1, wherein displaying at least a portion of a second interactive object associated with the first interactive object in the focus area comprises:
determining a display position parameter of the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the second interactive object according to the display position parameter.
4. The method of claim 1, wherein prior to displaying at least a portion of a second interactive object associated with the first interactive object in the focal region, the method further comprises:
and generating the second interactive object associated with the first interactive object in response to the second operation.
5. The method according to any one of claims 1 to 4, wherein the determining a focus area and displaying a first interactive object in the focus area based on the first operation comprises:
determining a first display parameter of the first interactive object according to the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
6. The method of claim 5,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
7. An electronic device includes a display interaction unit and a processing unit; the display interaction unit is used for displaying an interaction object and receiving the operation of an operation body acting on the interaction object;
the display interaction unit is used for receiving a first operation input by an operation body;
the processing unit is used for determining a focus area and displaying a first interactive object in the focus area based on the first operation;
the display interaction unit is further used for receiving a second operation acting on the first interaction object;
the processing unit is further configured to display, based on the second operation, a second interactive object associated with the first interactive object at least partially in the focus area.
8. The electronic device of claim 7, the processing unit to:
determining a first position where the first operation acts on the display interaction unit;
and determining the area within a preset distance from the first position as a focus area.
9. The electronic device of claim 7, wherein the processing unit is to:
determining a display position parameter of the second interactive object on the display interactive unit according to the content attribute of the first interactive object, the content attribute of the second interactive object and the focus area;
and displaying the second interactive object according to the display position parameter.
10. The electronic device of claim 7, wherein the processing unit is further configured to:
and generating the second interactive object associated with the first interactive object in response to the second operation.
11. The electronic device of any of claims 7-10, wherein the processing unit is configured to:
determining a first display parameter of the first interactive object according to the size of the focus area;
and displaying the first interactive object in the focus area according to the first display parameter.
12. The electronic device of claim 11,
the first display parameter includes at least one of an area parameter, a shape parameter, an interactive information content layout parameter, and a display position parameter of the first interactive object.
CN201910906953.7A 2014-04-21 2014-04-21 Information processing method and electronic equipment Active CN110703970B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910906953.7A CN110703970B (en) 2014-04-21 2014-04-21 Information processing method and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910906953.7A CN110703970B (en) 2014-04-21 2014-04-21 Information processing method and electronic equipment
CN201410160970.8A CN103995650A (en) 2014-04-21 2014-04-21 Information processing method and electronic device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201410160970.8A Division CN103995650A (en) 2014-04-21 2014-04-21 Information processing method and electronic device

Publications (2)

Publication Number Publication Date
CN110703970A true CN110703970A (en) 2020-01-17
CN110703970B CN110703970B (en) 2022-07-26

Family

ID=51309833

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201410160970.8A Pending CN103995650A (en) 2014-04-21 2014-04-21 Information processing method and electronic device
CN201910906953.7A Active CN110703970B (en) 2014-04-21 2014-04-21 Information processing method and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201410160970.8A Pending CN103995650A (en) 2014-04-21 2014-04-21 Information processing method and electronic device

Country Status (1)

Country Link
CN (2) CN103995650A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104317490B (en) * 2014-09-30 2018-11-09 联想(北京)有限公司 A kind of reminding method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120144285A1 (en) * 2010-12-07 2012-06-07 Microsoft Corporation User interface form field expansion
CN102981715A (en) * 2012-12-11 2013-03-20 广州华多网络科技有限公司 Icon display method, microprocessor and mobile terminal
CN103257818A (en) * 2012-02-20 2013-08-21 联想(北京)有限公司 Method and device for one-handed operation of icon on touch screen
CN103677611A (en) * 2012-09-24 2014-03-26 联想(北京)有限公司 Information processing method and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101144423B1 (en) * 2006-11-16 2012-05-10 엘지전자 주식회사 Mobile phone and display method of the same
JP5526789B2 (en) * 2010-01-08 2014-06-18 ソニー株式会社 Information processing apparatus and program
KR20130052751A (en) * 2011-05-17 2013-05-23 삼성전자주식회사 Terminal and method for arranging icon thereof
CN102819395B (en) * 2012-07-27 2016-06-08 东莞宇龙通信科技有限公司 terminal and icon display method
CN103019508B (en) * 2012-11-20 2016-10-26 东莞宇龙通信科技有限公司 Mobile terminal and icon arrangement display packing
CN103605450A (en) * 2013-11-27 2014-02-26 广东欧珀移动通信有限公司 Application icon display method and intelligent terminal

Also Published As

Publication number Publication date
CN103995650A (en) 2014-08-20
CN110703970B (en) 2022-07-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant