WO2023040613A1 - Human-computer interaction method, computer-readable medium, and electronic device - Google Patents

Human-computer interaction method, computer-readable medium, and electronic device

Info

Publication number
WO2023040613A1
WO2023040613A1 (PCT/CN2022/114608; CN2022114608W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
display element
touch operation
elements
adjustment
Prior art date
Application number
PCT/CN2022/114608
Other languages
English (en)
French (fr)
Inventor
杨婉艺
陈锋
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP22868987.3A priority Critical patent/EP4372534A1/en
Publication of WO2023040613A1 publication Critical patent/WO2023040613A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on GUIs using icons
    • G06F 3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to the field of communication technologies, and in particular to a human-computer interaction method, a computer-readable medium, and an electronic device.
  • After an application program is installed, the corresponding icon is displayed on the display interface of the terminal device.
  • The performance of terminal devices is continuously improving, and their storage space is rapidly increasing.
  • When the terminal device is a mobile phone, users install more and more application programs on it, and various icons are displayed on the mobile phone's display interface.
  • In the prior art, the display mode of each icon is preset. Even when the user can perform related operations on two specific icons (such as a camera and a lens bag), as shown in Figure 1(a) and Figure 1(b), the user merges the lens icon 100' and the camera icon 200' through a touch operation to obtain a lens-camera icon 300'; however, the lens-camera icon 300' is actually an icon pre-generated from the lens icon 100' and the camera icon 200' and stored in the mobile phone.
  • That is, the display modes of icons in the display interface of a mobile phone are preset, and the human-computer interaction process based on the user's touch operation is essentially just the calling and display of pre-stored icons.
  • As a result, the mobile phone cannot flexibly and accurately adjust the icons in the display interface according to the user's touch operation, which leads to a relatively single icon display mode, poor playability of the icons, and a poor human-computer interaction experience.
  • the display model can be represented as a matrix or a variable.
  • the electronic device adjusts the corresponding display model according to the type of touch operation, so as to realize the change of the display effect of the display model.
  • The electronic device maps the display model with the changed display effect onto the canvas used for drawing icons, so that the icons on the display interface of the electronic device change in correspondence with the user's operations.
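As an illustration of the bullets above, a display model held as a matrix of points can have its display effect changed by a transform and then be mapped onto the icon-drawing canvas. This is a minimal sketch under assumptions of our own: the point-list layout, the 2x2 transform, and the function names are not taken from the filing.

```python
# Illustrative sketch only: the patent states a display model "can be
# represented as a matrix or a variable" but does not fix a layout.
# Here a 2D model is a list of (x, y) corner points, and an adjustment
# is a 2x2 linear transform (e.g. a scale) applied before drawing.

def adjust_model(points, transform):
    """Apply a 2x2 transform matrix to every point of the display model."""
    (a, b), (c, d) = transform
    return [(a * x + b * y, c * x + d * y) for x, y in points]

def map_to_canvas(points, origin):
    """Translate model coordinates onto the canvas used to draw the icon."""
    ox, oy = origin
    return [(ox + x, oy + y) for x, y in points]

# A unit-square icon model, scaled to 150% and drawn at canvas position (40, 40).
model = [(0, 0), (1, 0), (1, 1), (0, 1)]
scaled = adjust_model(model, ((1.5, 0.0), (0.0, 1.5)))
drawn = map_to_canvas(scaled, (40, 40))
```

Because the adjustment happens on the model rather than on a pre-rendered bitmap, any transform the gesture produces can be drawn, which is the flexibility the bullets above contrast with prior-art preset icons.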
  • In this way, this application can realize full, sensitive, and accurate interaction between the user and the display elements, reduce the stuttering feeling when the user interacts with the display elements, improve the fluency of that interaction, and improve the user experience.
  • The first aspect of the present application provides a human-computer interaction method applicable to electronic devices, including: acquiring a user's touch operation on at least one display element displayed on the electronic device; adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element; and changing, according to the display model after the display-effect adjustment, the display element corresponding to the display model and its display effect on the electronic device.
  • the electronic device may be a device with a display screen, such as a mobile phone or a tablet
  • The display element may be at least one of the elements displayed on a display interface of the electronic device, such as icons, cards, and components.
  • the display interface may be a display desktop of the electronic device, or may be an application interface corresponding to an application program.
  • The touch operation refers to the operation formed when the user touches the display interface.
  • the type of the touch operation refers to an adjustment manner of the display element corresponding to the user's touch operation.
  • For example, the type of touch operation includes adjusting the display direction of a single display element, adjusting the display directions of a batch of display elements, covering another display element with one display element, exchanging the display positions of two display elements, adjusting the sizes of two display elements, and so on.
  • the display effect includes the display position, display size, display direction, display color, display content and display details of display elements, etc.
  • the display model is a kind of data corresponding to the display elements, and the display model can be a two-dimensional model or a three-dimensional model.
  • When the electronic device obtains the user's touch operation on at least one display element, it acquires, through a touch sensor, the touch data generated by the user's touch operation on the display element and extracts the touch parameters from that data. The electronic device determines the type of the touch operation from the touch parameters based on preset rules, and then determines the adjustment parameters and adjustment variables of each sub-display element in the display element according to the touch parameters and the type of touch operation.
  • The electronic device adjusts the display effect of the corresponding display model according to the adjustment parameters and adjustment variables of the sub-display elements, thereby adjusting the sub-display elements in the display element, and obtains the adjusted display element from the adjusted sub-display elements. Finally, the electronic device hides the original display element and displays the updated display element.
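The adjust-and-redisplay step in the two bullets above (apply an adjustment variable to each sub-display element, then hide the original element and show the updated one) can be sketched as follows; the dict-based data shapes and function names here are illustrative assumptions, not taken from the filing.

```python
# Hedged sketch of the adjust-and-redisplay step described above; the
# data shapes and names are illustrative assumptions.

def adjust_sub_elements(sub_elements, scale):
    """Apply one adjustment variable (here, a uniform scale) to each sub-element."""
    return [{**sub, "size": sub["size"] * scale} for sub in sub_elements]

def update_display(screen, name, adjusted_subs):
    """Hide the original display element and show the updated one in its place."""
    screen[name] = {"visible": True, "subs": adjusted_subs}
    return screen

# A camera icon with foreground and background sub-elements, enlarged by 20%.
icon = [{"name": "foreground", "size": 1.0}, {"name": "background", "size": 1.0}]
screen = update_display({}, "camera", adjust_sub_elements(icon, 1.2))
```

Note that the original sub-element list is left untouched; the adjusted element is built from copies, matching the hide-then-replace flow described above.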
  • The above solution for adjusting display elements may be used to adjust the display effect of a single display element among the display elements displayed by the electronic device.
  • The above solution may also be used to simultaneously adjust the display effects of at least two of the display elements displayed by the electronic device.
  • the two display elements may be two display elements corresponding to the same application program, or may be two display elements corresponding to two application programs.
  • The adjustment of the display direction of a display element includes directly adjusting the display direction of one or more display elements individually or in batches according to a touch operation, and does not include a display element's continued use or inheritance of the display direction of other display elements.
  • This application also includes human-computer interaction schemes for two or more display elements: for example, one card covering another card corresponding to the same application, exchanging the positions of two display elements, or adjusting the display sizes of two adjacent display elements.
  • Among the display elements whose display direction has been adjusted, the element adjusted last has the highest priority, and the priority of the remaining display elements decreases in the reverse order of adjustment.
  • For example, if the last display-direction adjustment was an individual adjustment of the first display element, then when the first display element is moved to another display position, its display direction remains unchanged; if the last display-direction adjustment was a batch adjustment of the second display element together with other display elements, then when other elements are moved to the position of the second display element, they continue to follow the display direction of the second display element. Based on this, the human-computer interaction solutions for two or more display elements are described below.
  • In this application, the type of the user's touch operation on a display element is determined by receiving the touch operation, and the display effect of the display element is then flexibly adjusted according to that type and the specific touch parameters of the operation. Since the display elements are displayed according to the adjusted display model, and the display effect is adjusted in real time through flexibly adjusted variables, this application realizes full, sensitive, and accurate interaction between the user and the display elements, reduces the stuttering feeling during that interaction, improves fluency, and enhances the user experience.
  • In the above human-computer interaction method, the display effect includes at least one of the display position, display size, display direction, display color, display content, and display details of the display elements.
  • the display position refers to the arrangement position of each sub-display element in the display element. That is to say, the display position includes the layout position of the display element and the relative position of each sub-display element in the display element.
  • the display size refers to the size of each sub-display element in the display element, where the display size can be determined by the distance between each sub-display element and the border of the canvas.
  • The display direction refers to the orientation of the display element. For example, when the display element is parallel to the display screen, its display direction is forward; when the left side of the display element faces forward and the right side faces backward, its display direction is to the right.
  • In a possible implementation, the above-mentioned human-computer interaction method further includes: determining the type of touch operation according to a preset rule, where the preset rule is set based on the number of the at least one display element, whether the application programs corresponding to the at least one display element are the same, and the touch operation.
  • The preset rule refers to a rule pre-stored in the electronic device and used to determine the type of the touch operation according to the touch parameters corresponding to the touch operation.
  • Specifically, the preset rules include: when the initial position and end position of the user's touch operation are located on the same display element, the type of touch operation is adjusting the display direction of a single display element; when the initial position is in the area of one display element and the end position is on another display element corresponding to the same application program, the type is covering and replacing the other display element with the first one; when the initial position is in the area of one display element and the end position is on another, adjacent display element and crosses that element's center line, the type is exchanging the display positions of the two adjacent display elements; and when the initial position is in the middle area between two adjacent display elements, the type is adjusting the display sizes of the two adjacent display elements.
  • the type of touch operation can be judged according to the number of at least one display element, whether the application programs corresponding to at least one display element are the same, and the touch operation.
  • the judgment method is simple and the judgment cycle is short.
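The preset rules quoted above lend themselves to a small classifier. In the sketch below, display elements are axis-aligned boxes `(x0, y0, x1, y1)`, touches are start/end points, and only three of the rules are modeled; the geometry helpers, rule order, and return strings are assumptions of this sketch, not of the filing.

```python
# Sketch of the preset rules described above: classify a touch operation
# from its start/end positions plus context flags. Illustrative only.

def inside(pos, box):
    """True if a (x, y) position lies inside an axis-aligned box."""
    x0, y0, x1, y1 = box
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

def classify_touch(start, end, elem_a, elem_b, same_app, adjacent):
    """Return the touch-operation type per the preset rules (subset)."""
    if inside(start, elem_a) and inside(end, elem_a):
        return "adjust single display direction"
    if inside(start, elem_a) and inside(end, elem_b) and same_app:
        return "cover and replace"
    if inside(start, elem_a) and inside(end, elem_b) and adjacent:
        # The exchange rule requires crossing the other element's center line.
        center_x = (elem_b[0] + elem_b[2]) / 2
        if end[0] >= center_x:
            return "exchange display positions"
    return "unknown"
```

As the following bullet notes, a rule table like this keeps the judgment simple and the judgment cycle short, since only start/end coordinates and two context flags are consulted.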
  • In a possible implementation, obtaining the user's touch operation on at least one display element displayed on the electronic device includes: when the user selects to adjust the display effects of multiple display elements, the touch operation is used to adjust the display effects of the multiple selected display elements at the same time.
  • For example, after the display effects of multiple display elements are adjusted, a 3D stereoscopic effect of the display elements can be realized, thereby achieving a naked-eye 3D effect.
  • The user's touch operation may be used to make the electronic device adjust each display element to face a reference position, based on the display element's position in the display interface of the electronic device and the reference position set by the user.
  • That is, the user's touch operation is used to generate an automatic control instruction, and the automatic control instruction can adjust at least one display element without the user providing specific touch data.
  • In this way, the electronic device can adjust the display effects of multiple display elements simultaneously through a touch operation; the adjustment of the display elements is efficient and easy to operate.
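As one concrete reading of the reference-position adjustment above, each selected element can be given a rotation derived from its own position relative to the user-set reference point. The atan2-based angle below is an assumption for illustration; the patent does not specify how the facing direction is computed.

```python
# Illustrative sketch: one rotation per selected display element so that
# all elements face a user-chosen reference position. atan2 mapping is
# an assumption of this sketch, not taken from the filing.
import math

def angle_toward(element_pos, reference_pos):
    """Angle (degrees) from an element's centre toward the reference position."""
    dx = reference_pos[0] - element_pos[0]
    dy = reference_pos[1] - element_pos[1]
    return math.degrees(math.atan2(dy, dx))

def batch_face_reference(positions, reference_pos):
    """Batch adjustment: a per-element angle, all facing the same reference."""
    return [angle_toward(p, reference_pos) for p in positions]
```

Elements on opposite sides of the reference point receive opposite angles, which is what produces differing display directions from a single automatic control instruction.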
  • In a possible implementation, adjusting the display effect of at least one display model of the at least one display element according to the touch operation includes: according to the type of the touch operation, performing batch adjustments of different display effects on the display models of at least two display elements among the multiple display elements.
  • In this way, the electronic device can adjust different display effects of multiple display elements at the same time through a touch operation, which is easy to operate and further satisfies the user's adjustment requirements for display elements.
  • In a possible implementation, performing batch adjustments of different display effects on the display models of at least two display elements among the multiple display elements includes: determining the user's sub-touch operation on each display element according to the user's touch operation and the display position of each of the at least two display elements; determining the rotation axis, rotation direction, and rotation angle of each display element according to its sub-touch operation; and batch-adjusting the display direction of each display element with the rotation axis, rotation direction, and rotation angle determined for the touch operation.
  • In a possible implementation, adjusting the display effect of at least one display model of the at least one display element according to the touch operation includes: according to the type of the touch operation, performing a batch adjustment of the same display direction on multiple display models of multiple display elements.
  • In a possible implementation, the operation type includes covering the second display element with the first display element, where the first display element and the second display element correspond to the same application program. Covering the second display element with the first display element includes the first display element inheriting at least part of the display effect of the second display element and hiding the second display element, where the inherited display effect includes the second display position and the second display size of the second display element.
  • When the type of the latest touch operation on the second display element was a batch adjustment of the display direction, the inherited display effect further includes the second display direction of the second display element.
  • The applicable scenario is that the last adjustment of the display direction of the original display element occurred after the last adjustment of the display direction of the new display element, and the last adjustment of the original display element's display direction was a batch adjustment. The user drags a new display element corresponding to an application onto another, original display element corresponding to the same application; the dragged new display element then covers the original display element in the display interface, and the covering new display element continues to use the display direction of the original display element.
  • It should be noted that the adjustment of the display direction of a display element includes directly adjusting the display direction individually or in batches according to the touch operation, and does not include a display element's continued use or inheritance of the display direction of other display elements.
  • In a possible implementation, adjusting the display effect of at least one display model of the at least one display element according to the touch operation includes: acquiring the second display element corresponding to the user's touch operation; adjusting the display model of the first display element according to at least part of the display effect of the second display element; hiding the second display element corresponding to the touch operation; and displaying the first display element according to the adjusted display model.
  • The electronic device calls the invalidate function to realize the interactive animation between the original display element and the updated display element.
  • In a possible implementation, the type of touch operation includes exchanging the display positions of the third display element and the fourth display element, where the behavior depends on the type of the latest display-direction adjustment of the third display element.
  • Exchanging the display positions of the third display element and the fourth display element includes: the third display element inherits at least part of the display effect of the fourth display element, and the fourth display element inherits at least part of the display effect of the third display element, where the effect inherited from the fourth display element includes the fourth display position, and the effect inherited from the third display element includes the third display position and the third display direction.
  • In this scenario, the last adjustment of the display direction of the fourth display element was an individual adjustment, the last adjustment of the third display element was a batch adjustment, and the fourth display element and the third display element swap places.
  • After the exchange, the display direction of the fourth display element follows the display direction of the third display element, while the display direction of the third display element remains unchanged.
  • the type of touch operation includes exchanging the display positions of the third display element and the fourth display element.
  • Exchanging the display positions of the third display element and the fourth display element includes: the third display element inherits at least part of the display effect of the fourth display element, and the fourth display element follows at least part of the display effect of the third display element, where the effect inherited from the third display element includes the third display position and the third display direction, and the effect inherited from the fourth display element includes the fourth display position and the fourth display direction. This is a possible implementation of the above-mentioned human-computer interaction method.
  • An application scenario applicable to this human-computer interaction solution is one in which the last adjustment of the display direction of the third display element and the last adjustment of the display direction of the fourth display element were both batch adjustments, though not necessarily the same batch adjustment. The third display element after the swap follows the display direction of the original fourth display element, and the fourth display element after the swap follows the display direction of the original third display element.
  • The above solution lets the exchanged display elements continue to use the display direction and display position of the original display elements. That is, during the exchange of display-element positions, the display element at a given position in the display interface of the electronic device is always displayed in the display direction matching the user's usage habits, the display direction of the naked-eye 3D effect is maintained, and the adjustment steps for display elements in the display interface of the electronic device are simplified.
  • the type of touch operation includes synchronously adjusting the display effects of the fifth display element and the sixth display element, and the fifth display element and the sixth display element are adjacent to each other.
  • synchronously adjusting the display effect of the fifth display element and the sixth display element includes: synchronously adjusting the fifth display size of the fifth display element and the sixth display size of the sixth display element according to the touch operation, and the fifth The sum of the display size and the sixth display size remains unchanged.
  • In this way, the sizes and display directions of the two display elements can be adjusted dynamically in real time, improving the fluency of the process of changing the display modes of the two display elements and realizing diverse display effects for them.
  • In a possible implementation, synchronously adjusting the display effects of the fifth display element and the sixth display element further includes: adjusting, according to the touch operation, the fifth display content of the fifth display element and/or the sixth display content of the sixth display element.
  • When the fifth display element adjusted based on the fifth adjustment size reaches the minimum display size, the fifth display element and the sixth display element are still synchronously adjusted, and adjusting the display effect of the display element also includes adjusting the fifth display direction of the fifth display element.
  • The fifth adjustment size may be the difference between the original display size of the fifth display element and its adjusted display size.
  • In some cases, the display direction of the display element is adjusted to face the front by default. It can be understood that, although its display direction is adjusted to face the front, the display element is still considered a display element whose display direction has been adjusted.
  • In a possible implementation, the above-mentioned human-computer interaction method further includes: during the user's touch operation, acquiring the corresponding fifth real-time size and sixth real-time size according to the user's touch position, where the sum of the fifth real-time size and the sixth real-time size remains unchanged; adjusting the display effect of the fifth display element in real time according to the fifth real-time size; and adjusting the display effect of the sixth display element in real time according to the sixth real-time size.
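The constant-sum resize described in the bullets above can be sketched as follows. The clamp to a minimum display size and the horizontal-coordinate convention are assumptions of this sketch, not taken from the filing.

```python
# Illustrative sketch: as the user drags the boundary between two adjacent
# display elements, both sizes are recomputed in real time while their sum
# stays constant. The min-size clamp is an assumption, not from the filing.

def synced_resize(total, touch_x, left_edge, min_size):
    """Split `total` at the drag position, keeping both parts >= min_size."""
    fifth = touch_x - left_edge                     # fifth (left) element size
    fifth = max(min_size, min(fifth, total - min_size))
    sixth = total - fifth                           # sixth element takes the rest
    return fifth, sixth
```

Dragging past either end simply pins the shrinking element at its minimum display size while the sum invariant still holds, matching the behaviour described above for the fifth display element at the minimum size.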
  • In a possible implementation, changing, according to the display model after the display-effect adjustment, the display element corresponding to the display model and its display effect on the electronic device includes: drawing the adjusted display model on a canvas to change the display effect of the corresponding display element on the electronic device.
  • the canvas refers to an abstract space that can lay out and render display elements based on the display model.
  • In a possible implementation, the type of touch operation includes adjusting the display direction of the display element, where adjusting the display direction of the display element includes determining the rotation axis, rotation direction, and rotation angle.
  • In a possible implementation, adjusting the display effect of at least one display model of the at least one display element according to the touch operation includes: determining, according to the user's touch operation, the rotation axis, rotation direction, and rotation angle of the display model of the display element, and adjusting the display direction of the display element with the rotation axis, rotation direction, and rotation angle determined for the touch operation.
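One way to read the rotation derivation above is to map the drag vector of the touch operation to a rotation axis, direction, and angle. The axis mapping (horizontal drag tilts about the y axis, vertical drag about the x axis) and the degrees-per-pixel factor below are illustrative assumptions; the filing does not fix these choices.

```python
# Sketch of deriving (rotation axis, rotation direction, rotation angle)
# from a single-element drag, as described above. Illustrative only.

def rotation_from_drag(start, end, degrees_per_pixel=0.5):
    """Return (axis, direction, angle) for a drag from `start` to `end`."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        axis = "y"                       # horizontal drag tilts left/right
        direction = "cw" if dx > 0 else "ccw"
        angle = abs(dx) * degrees_per_pixel
    else:
        axis = "x"                       # vertical drag tilts up/down
        direction = "cw" if dy > 0 else "ccw"
        angle = abs(dy) * degrees_per_pixel
    return axis, direction, angle
```

The resulting triple is exactly the data the bullet above says is applied to the display model to adjust the element's display direction.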
  • At least one display element includes at least one of an icon, a card, and a component.
  • Each display element in the at least one display element includes a foreground element and a background element, and the method further includes: adjusting, according to the type of the touch operation, the display effect of the foreground element and/or the background element corresponding to at least one sub-display element in the at least one display element.
  • the display model includes at least one of a two-dimensional display model and a three-dimensional display model.
  • the two-dimensional display model can display the planar shape of display elements such as icons or cards
  • the three-dimensional display model can display the three-dimensional shapes of display elements such as icons or cards.
  • the display model can be represented by a matrix or a variable.
  • The second aspect of the present application provides a computer-readable medium storing instructions that, when executed on an electronic device, cause the electronic device to execute any one of the human-computer interaction methods of the above-mentioned first aspect.
  • A third aspect of the present application provides an electronic device, including: a memory for storing instructions to be executed by one or more processors of the electronic device; and a processor, being one of the processors of the electronic device, for executing any one of the human-computer interaction methods of the above-mentioned first aspect.
  • Fig. 1(a) shows a display interface 11' of a mobile phone 1'
  • Fig. 1(b) shows a display interface 11' of a mobile phone 1' operated by a user
  • FIG. 2 shows a display interface 11 of a mobile phone 1 of the present application
  • FIG. 3(a) shows a user's touch operation on the camera icon 104 in this application
  • Fig. 3(b) shows a partial enlarged view of area i in the display interface 11 of a mobile phone 1 of the present application
  • Fig. 4(a) shows two other touch operations by users on the icons in area i of the display interface 11 in the present application
  • Fig. 4(b) shows a partial enlarged view of area i in the display interface 11 of a mobile phone 1 of the present application
  • Fig. 5(a) shows a display interface 11" of a mobile phone 1"
  • Fig. 5(b) shows a display interface 11" of a mobile phone 1" operated by a user
  • Figure 6(a) shows a user's touch operation on the contact B card and the contact A card in this application
  • FIG. 6(b) shows a partially enlarged view of area ii in the display interface 11 of a mobile phone 1 of the present application
  • Figure 6(c) shows a user's touch operation on the contact C card and the contact A card in this application
  • Fig. 6(d) shows a partial enlarged view of area ii in the display interface 11 of a mobile phone 1 of the present application
  • Figure 7(a) shows a user's touch operation on the alarm clock card and the remote control card in this application
  • FIG. 7(b) shows a partially enlarged view of area ii in the display interface 11 of a mobile phone 1 of the present application
  • Figure 8(a) shows a user's touch operation on the remote control card and the alarm clock card in this application
  • FIG. 8(b) shows a partially enlarged view of area ii in the display interface 11 of a mobile phone 1 of the present application
• Fig. 9(a) shows a schematic diagram of card generation in a mobile phone 1 of the present application.
  • Figure 9(b) shows a schematic drawing of the background elements of the card in the present application.
  • Figure 9(c) shows a schematic drawing of the foreground elements and text elements of the card in this application.
  • Fig. 9(d) shows a card display principle diagram in a mobile phone 1 of the present application
  • Figure 10 shows a schematic diagram of a card update principle of the present application
  • FIG. 11 shows a flowchart of a human-computer interaction solution for display elements of a mobile phone 1 in the present application
  • Fig. 12 (a) shows a schematic diagram of a camera icon 104 displayed in the display interface 11 of a mobile phone 1 in the present application;
  • Fig. 12(b) shows a schematic diagram of the user's touch track in the display interface 11 of a mobile phone 1 in the present application
  • Fig. 12(c) shows a schematic diagram of the user's touch track in the display interface 11 of another mobile phone 1 in the present application
  • Fig. 12(d) shows a schematic diagram of the user's touch track in the display interface 11 of another mobile phone 1 in the present application
  • Fig. 13(a) to Fig. 13(j) show a schematic diagram of the adjustment of the camera icon 104 in the adjustment interface 12 according to different adjustment parameters in a mobile phone 1 in the present application;
  • Fig. 13(k) shows a schematic diagram of the camera icon 104 in the mobile phone 1 after adjustment and locking in the present application
• Fig. 13(l) shows a schematic diagram of the camera icon 104 in a mobile phone 1 in the present application after adjustment
• Fig. 14(a) shows a schematic diagram of icons in the adjustment interface 12 of a mobile phone 1 in the present application entering batch adjustment;
  • Fig. 14(b) shows a schematic diagram of selecting icons for batch adjustment in the adjustment interface 12 of the mobile phone 1 in the present application
  • Fig. 14(c) shows a schematic diagram of batch adjustment operation of icons in the adjustment interface 12 of a mobile phone 1 in the present application
  • Fig. 14(d) shows a schematic diagram when batch adjustment of icons in the adjustment interface 12 of the mobile phone 1 in the present application is confirmed;
  • Figure 14(e) shows a schematic diagram of a mobile phone 1 icon batch adjusted in the present application
• Fig. 14(f) shows a schematic diagram of the adjustment interface 12 of another mobile phone 1 in the present application when batch adjustment of icons is confirmed;
• Fig. 14(g) shows a schematic diagram of batch adjustment of icons in another mobile phone 1 in the present application
• Fig. 15(a) to Fig. 15(d) show schematic diagrams of the card 100 and the card 200 in a mobile phone 1 in the present application being resized;
  • FIG. 16 shows a schematic structural diagram of a mobile phone 1 in the present application
  • Fig. 17 shows a software structural block diagram of a mobile phone 1 in this application.
  • Illustrative embodiments of the present application include, but are not limited to, a human-computer interaction method, apparatus, readable medium, and electronic equipment.
  • the technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
• the electronic device in the present application may be a device with a display screen, such as a mobile phone or a tablet; the electronic device is described below using a mobile phone as an example.
• the display element may be any element displayed on the display interface of the mobile phone, such as at least one of icons, sub-icons, cards, components, and widgets.
  • the application programs corresponding to the display elements may be programs such as contacts, calls, text messages, browsers, alarm clocks, and remote controls.
  • the width direction of the mobile phone screen is set as the X axis
  • the length direction of the mobile phone screen is set as the Y axis
  • the thickness direction of the mobile phone is set as the Z axis
• the X axis, the Y axis, and the Z axis are mutually perpendicular.
  • the left side of the mobile phone and the right side of the mobile phone refer to two sides of the mobile phone along the X-axis direction
  • the upper side of the mobile phone and the lower side of the mobile phone refer to two sides of the mobile phone along the Y-axis direction.
  • the front side of the mobile phone and the rear side of the mobile phone refer to two sides of the mobile phone along the Z-axis direction.
  • the present application discloses a human-computer interaction scheme for display elements in the display interface of a mobile phone, in which a corresponding display model is set for each icon, each card and other display elements on the display interface.
  • the display model may include a two-dimensional display model and/or a three-dimensional display model.
  • the two-dimensional display model can display the planar shape of display elements such as icons or cards
  • the three-dimensional display model can display the three-dimensional shapes of display elements such as icons or cards.
  • Display models can be represented by matrices or variables. Then when the user operates the icon or card, the mobile phone changes the corresponding display model according to the type of touch operation.
  • the mobile phone rotates the display model to change the display angle of the display model. Then, the changed display model is mapped to the canvas for drawing icons, so that the icons on the display interface of the mobile phone are changed corresponding to the user's operation.
  • the canvas refers to an abstract space capable of laying out and rendering display elements based on a display model.
  • the mobile phone obtains the adjustment variable of the adjustment parameter according to the touch operation, and then adjusts the adjustment parameter according to the adjustment variable, so that the display model is displayed according to the adjusted adjustment parameter, and finally completes an interaction between the user and the display element.
• in this application, the type of the user's touch operation on a display element is determined by receiving the touch operation, and the display effect of the display element is then flexibly adjusted according to the type of touch operation and the touch parameters corresponding to the touch operation. Since the display elements are displayed according to the adjusted display model, and the display effect is adjusted flexibly in real time through the adjustment variables, this application can realize full, sensitive, and accurate interaction between the user and the display elements, reduce the stuttering feeling when the user interacts with the display elements, improve the fluency of the interaction, and enhance the user experience.
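As an illustration of the scheme above, the sketch below models a display element's display model as a 3×3 rotation matrix that is updated per touch operation; the matrix representation, the 0.2 degrees-per-pixel sensitivity, and the 60-degree clamp are assumptions for the example, not details taken from the application.

```python
import math

def rotation_y(angle_deg):
    """3x3 rotation matrix about the Y axis (the screen-length axis)."""
    a = math.radians(angle_deg)
    return [[math.cos(a), 0.0, math.sin(a)],
            [0.0, 1.0, 0.0],
            [-math.sin(a), 0.0, math.cos(a)]]

def mat_mul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

class DisplayModel:
    """Display model of an icon or card, stored as a 3x3 transform matrix."""
    def __init__(self):
        self.matrix = [[1.0 if i == j else 0.0 for j in range(3)]
                       for i in range(3)]

    def apply_touch(self, track_length, direction):
        # Touch-track length maps to a rotation angle; direction (+1 for a
        # rightward slide, -1 for leftward) selects the rotation sense.
        angle = direction * min(track_length * 0.2, 60.0)  # clamp at 60 deg
        self.matrix = mat_mul(rotation_y(angle), self.matrix)
        return angle
```

After a touch, the adjusted matrix would be mapped onto the canvas to redraw the element, as the scheme above describes.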
  • the display direction refers to the orientation of the icon, wherein, when the icon is parallel to the display screen, the display direction of the icon is forward.
  • the display direction of the icon is forward.
  • the display direction of the icon is right.
  • the display direction of the icon is left.
  • the display direction of the icon is downward.
  • the display direction of the icon is upward.
• Fig. 2 shows a display interface 11 of a mobile phone 1 in the present application, wherein the i area of the display interface 11 displays a call icon 101, a text message icon 102, a browser icon 103, and a camera icon 104 in sequence from left to right.
• Fig. 3(a) and Fig. 3(b) show the human-computer interaction scheme for the camera icon 104 in Fig. 2 when the user makes a leftward touch operation.
• as shown in Fig. 3(a), the user touches the camera icon 104 in the display interface 11 of the mobile phone 1 to enter the adjustment interface 12 of the camera icon 104, and then slides along the direction parallel to the X axis at the bottom of the camera icon 104 in the adjustment interface 12, from the starting position P s to the ending position P f, to adjust the display direction of the camera icon 104.
  • area i displays a call icon 101 , a text message icon 102 , a browser icon 103 , and an adjusted camera icon 104x distributed along the X axis from left to right.
  • the camera icon 104x is an icon obtained after the camera icon 104 is rotated around an axis parallel to the Y axis.
  • the display direction of the camera icon may gradually change with the user's touch operation.
• the i area is not a designated area of the display interface 11 of the mobile phone 1, but is only used to denote a certain area in the display interface 11 of the mobile phone 1.
• the meanings of the ii area and the iii area below are similar to that of the i area, and will not be repeated later.
• at least two icons are displayed in the icon adjustment interface; the user selects a reference point, and the mobile phone can adjust the display direction of each icon according to the reference point and the position of each icon, so that the at least two icons are oriented toward the reference point.
  • FIG. 4(a) shows the central point touched by the user.
  • FIG. 4( b ) shows a schematic diagram of the display interface 11 of the mobile phone 1 .
  • Fig. 4(b) shows a 3D effect of the display elements in Fig. 4(a).
• comparing Figure 4(a) and Figure 4(b), it can be seen that when the user clicks the middle position P s1 in the i area, the icons in the i area adjust their respective display directions and finally face toward the middle position P s1 of the display screen, so that the call icon 101y, SMS icon 102y, browser icon 103y, and camera icon 104y present a naked-eye 3D effect oriented toward the user's position P u1 (corresponding to the middle position P s1 in Fig. 4(a)).
  • the icons for adjusting the display direction in batches can also be distributed along the Y axis, or distributed in the XOY plane, and the display effect of the icons is similar to that in FIG. 4( b ), which will not be repeated here.
• the position clicked by the user can also be located on the right side of the mobile phone 1 (as shown by the right position P s2 in Figure 4(a)), the left side of the mobile phone 1, the upper side of the mobile phone 1, or the lower side of the mobile phone 1 in the display interface 11, and the resulting display effect of the icons is similar to that in Fig. 4(b), which will not be repeated here.
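One way to realize the batch adjustment toward a reference point described above is to derive, for each icon, rotation angles from the offset between the icon's center and the touched reference point. The sketch below is a minimal illustration; the 0.1 degrees-per-pixel sensitivity and the ±45-degree clamp are assumed values, not details from the application.

```python
def face_reference_point(icon_centers, ref_point):
    """For each icon center (x, y), compute (yaw, pitch) angles in degrees
    so that the icon's display direction points toward ref_point."""
    rx, ry = ref_point
    angles = []
    for (x, y) in icon_centers:
        dx, dy = rx - x, ry - y
        # Yaw rotates about the Y axis (left/right), pitch about the X axis
        # (up/down); distance to the reference point maps to degrees.
        yaw = max(-45.0, min(45.0, 0.1 * dx))
        pitch = max(-45.0, min(45.0, 0.1 * dy))
        angles.append((yaw, pitch))
    return angles
```

Icons left of the reference point turn right and icons right of it turn left, producing the converging naked-eye 3D effect of Fig. 4(b).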
  • the above-mentioned human-computer interaction scheme can be applied to the icons in the mobile phone 1, and can also be applied to the cards and sub-icons in the mobile phone 1, and the specific display effect of the cards and sub-icons is similar to that of the above-mentioned icons.
  • the card refers to the element on the desktop corresponding to some functional modules of the application program.
• the card can be displayed as an abbreviated interface of the functional interface of this part of the functional modules
• the sub-icon refers to an element on the desktop corresponding to part of the functional modules of an application program, and the sub-icon can be displayed as the icon of this part of the functional modules.
• the same functional module (for example: contact 14") of the same application program can correspond to cards of different sizes, different forms, and different colors (for example: the first card 201" and the second card 202").
• this application also includes human-computer interaction schemes for two or more display elements: for example, one card corresponding to an application program covering another card of the same application program; for another example, exchanging the positions of two display elements; for another example, adjusting the display sizes of two adjacent display elements.
  • the adjustment of the display direction of the display element includes directly adjusting the display direction of the display element individually or in batches according to the touch operation, and does not include the use or inheritance of the display direction of other display elements by the display element.
  • the display element whose display direction is adjusted last has the highest priority, and the priority of the display elements decreases in turn according to the reverse order of the adjustment order.
• for example, if the last adjustment of a display direction is a separate adjustment of a first display element, then when the first display element is moved to another display position, the display direction of the first display element remains unchanged; if the last adjustment of a display direction is a batch adjustment of a second display element and other display elements, then when another element is moved to the position of the second display element, it inherits the display direction of the second display element. Based on this, the human-computer interaction solutions for two or more display elements are described below.
• this application also includes a human-computer interaction scheme applied to display elements corresponding to the same application program; for example, the user drags a new icon or card corresponding to an application program onto an original icon or card corresponding to the same application program, the dragged-in new icon or card covers the original icon or card in the display interface, and the covering new icon or card inherits the display direction of the original icon or card.
  • the display elements include all elements displayed on the display interface of the mobile phone, such as icons, sub-icons, cards, components, components and so on.
  • the display interface 11 of the mobile phone 1 displays an alarm clock card 201 , a remote control card 202 and a contact A card 203 sequentially distributed from left to right in area ii.
  • a contact B card 204 and a contact C icon 205 are displayed in area iii of the display interface 11 of the mobile phone 1 .
• as shown in FIG. 6(a), when the user touches the contact B card 204 and drags the contact B card 204 onto the contact A card 203, the contact A card 203 is covered by the contact B card 204a.
• the covering contact B card 204a inherits the display direction and size of the contact A card 203.
  • the mobile phone hides the contact A card 203 and adjusts the display direction of the contact B card 204 from the right side to the front side, so as to be consistent with the display direction of the contact A card 203 .
  • the mobile phone 1 also adjusts the size of the contact B card 204 to the size of the contact A card 203 , and then displays it at the position of the contact A card 203 .
• the above human-computer interaction solution is suitable for an application scenario in which the last adjustment of the display direction of the contact A card 203 occurred after the last adjustment of the display direction of the contact B card 204, and the last adjustment of the display direction of the contact A card 203 was a batch adjustment.
• the contact B card 204 being dragged from left to right onto the contact A card 203 is only an example of the display position and dragging direction of the contact B card 204; the contact B card 204 can be located anywhere on the display interface 11, and the dragging direction from the contact B card 204 to the contact A card 203 can be any direction, which is not specifically limited in this application.
  • the dragging scheme of the contact C icon 205 to the contact A card 203 below is not specifically limited in this application.
• in the above solution, the covering display element is displayed at the position of the covered display element according to the display direction of the covered display element, and the covered display element is no longer displayed. This realizes the covering display element's inheritance of the covered display element's display direction and display position; that is, it ensures that display elements at the same position in the display interface 11 of the mobile phone are always displayed in the same display direction, preserving a naked-eye 3D effect in line with user habits and simplifying the adjustment steps for display elements.
• this application also includes a human-computer interaction scheme for display elements corresponding to different application programs; for example, the user drags the card corresponding to one application program onto the card corresponding to another application program, the positions of the cards corresponding to the different application programs in the display interface are interchanged, and each of the two cards adaptively adjusts its display direction and display manner based on the position of the other.
  • the display direction of the alarm clock card 201 is toward the right, and the display direction of the remote control card 202 is toward the left.
• the user touches the alarm clock card 201 and drags the alarm clock card 201 onto the remote control card 202.
• the alarm clock card 201 and the remote control card 202 then exchange positions; the exchanged alarm clock card 201a adopts the display direction of the original remote control card 202, and the exchanged remote control card 202a adopts the display direction of the original alarm clock card 201, as shown in Figure 7(b).
• the above-mentioned human-computer interaction solution is suitable for an application scenario in which the last adjustment of the display direction of the alarm clock card 201 and the last adjustment of the display direction of the remote control card 202a are both batch adjustments, and they are not limited to the same batch adjustment.
• the above solution realizes the inheritance of the display direction and display position of the original display elements by the exchanged display elements; that is, in the process of exchanging the positions of display elements, it ensures that display elements at the same position in the mobile phone display interface 11 are always displayed in the same display direction, maintains a naked-eye 3D display direction in line with the user's usage habits, and also simplifies the adjustment steps for display elements in the display interface of the mobile phone 1.
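The position exchange with inheritance of display direction described above can be sketched as follows; the `layout` dictionary and its field names are illustrative assumptions, not part of the application.

```python
def swap_elements(layout, pos1, pos2):
    """layout maps a position name to an element dict carrying its own
    'direction'. Swapping exchanges the elements but keeps each position's
    display direction: an element arriving at a position inherits the
    direction previously shown there."""
    e1, e2 = layout[pos1], layout[pos2]
    # Each element takes over the display direction of the other position.
    e1['direction'], e2['direction'] = e2['direction'], e1['direction']
    layout[pos1], layout[pos2] = e2, e1
    return layout
```

Applied to Fig. 7, the alarm clock card arriving at the remote control card's position would adopt the remote control card's former direction, and vice versa.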
• after the display direction of a single icon or card has been adjusted using the above-mentioned human-computer interaction scheme, this application also includes a human-computer interaction scheme for adjacent display elements; for example, when the user touches a certain position between one card and another card and slides toward one of the cards, the sizes and display directions of the two adjacent cards in the display interface change.
• the exchanged remote control card 202a has a small size
• the exchanged alarm clock card 201a has a large size
• the exchanged remote control card 202a and the exchanged alarm clock card 201a are adjacent.
• the user touches the area between the exchanged remote control card 202a and the exchanged alarm clock card 201a, and drags toward the exchanged alarm clock card 201a, so that the sizes of the exchanged remote control card 202a and the exchanged alarm clock card 201a change while the minimum distance between the exchanged remote control card 202a and the exchanged alarm clock card 201a remains unchanged, as shown in FIG. 8(b).
  • the adjusted remote controller card 202b has a large size
  • the adjusted alarm clock card 201b has a small size.
• comparing Fig. 8(a) and Fig. 8(b), it can be seen that during the process of switching from the exchanged remote control card 202a to the adjusted remote control card 202b, the size of the remote control card changes from small to large, and the text element "Huawei Smart Screen" related to the remote control card appears.
• meanwhile, the size of the alarm clock card changes from large to small, and the text element "7:20 am" related to the alarm clock card is hidden.
• this application describes the human-computer interaction scheme for two adjacent display elements by taking the exchanged remote control card 202a and the exchanged alarm clock card 201a as examples; other pairs of adjacent display elements are also applicable to the above human-computer interaction scheme, which will not be repeated here.
• in some embodiments, when the user touches a position between one card and another adjacent card, the display directions of the cards also change.
• for example, when the display sizes of the two cards are changed, the display direction of the card whose display size becomes smaller gradually returns to its original state.
• for another example, when the display sizes of the two cards are changed, the display direction of each card changes with the change of its display size.
• that is, the user's touch operation can be used, on the one hand, to adjust the display sizes of the two cards, and on the other hand, to adjust the display directions of the cards with reference to the manner in FIG. 6(a).
• in the above solution, the sizes and display directions of the two display elements can be adjusted dynamically in real time, which improves the fluency of the process of changing the display manner of the two display elements and realizes diverse display effects of the two display elements.
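A minimal sketch of the adjacent-card resizing described above, in which a drag between two side-by-side cards redistributes their widths while the gap between them stays fixed; the `min_width` floor and the direct pixel-for-pixel mapping of the drag distance are assumptions for the example.

```python
def resize_adjacent(width_a, width_b, gap, drag_dx, min_width=40):
    """Drag between two side-by-side cards A and B: a positive drag_dx
    (toward card B) enlarges card A and shrinks card B by the same amount,
    so the gap between them never changes. Widths are clamped so neither
    card shrinks below min_width."""
    delta = max(-(width_a - min_width), min(drag_dx, width_b - min_width))
    return width_a + delta, width_b - delta, gap
```

In the Fig. 8 example, dragging toward the alarm clock card would enlarge the remote control card and shrink the alarm clock card, with text elements shown or hidden as each card crosses a size threshold.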
  • FIG. 9(a) shows a schematic diagram of generating a card
  • Fig. 9(b) shows a schematic drawing of a background element of a card
• Fig. 9(c) shows a schematic drawing of a foreground element and text elements of a card
  • FIG. 9( d ) shows a schematic diagram of displaying a card.
  • the card 100 includes a foreground element 110 , a background element 120 and a text element 130 .
  • the foreground elements 110 are distributed in the foreground element layer C1
  • the background elements 120 are distributed in the background element layer C2
  • the text elements 130 are distributed in the text element layer C3.
  • the foreground element layer C1 and the text element layer C3 are located in front of the background element layer C2 , and the mobile phone 1 superimposes the foreground element 110 , the background element 120 and the text element 130 to form the card 100 .
  • the mobile phone 1 obtains the initial element 110 ′ having an icon meaning, and rotates the initial element 110 ′ to obtain the foreground element 110 .
  • the foreground matrix is a storage form of the display model corresponding to the foreground element.
  • the mobile phone 1 generates a background element 120 according to parameters such as size, position, color, and details.
  • the mobile phone 1 generates text elements 130 according to parameters such as user information and names of function modules.
• the mobile phone 1 translates, scales, and superimposes the foreground element 110, the background element 120, and the text element 130 to generate the card 100.
  • the mobile phone can also rotate the background element and the text element to obtain a card with an overall three-dimensional effect (not shown).
  • the drawing application program in the mobile phone 1 draws the background element 120 on the canvas according to the background matrix.
  • the background matrix is a matrix capable of reflecting display information of background elements.
  • the background matrix is a storage form of the display model corresponding to the background element.
  • the display information of the background element includes at least one of parameters such as a display direction of the background element, a color of the background element, a size of the background element, and a display position of the background element.
  • the text matrix is a storage form of the display model corresponding to the text elements.
  • Canvas is a drawable carrier that includes multiple pixels.
• the drawing application program in the mobile phone 1 determines the pixels in the canvas corresponding to the background matrix according to the distance d1 from the background element 120 in the background matrix to the upper border of the canvas, the distance d2 to the lower border, the distance d3 to the left border, and the distance d4 to the right border.
• the drawing application program in the mobile phone 1 draws the foreground element 110 on the canvas according to the foreground matrix, and draws the text element 130 on the canvas according to the text matrix.
  • the drawing application program in the mobile phone can also zoom and pan the foreground element 110 and the text element 130 to draw the foreground element 110 and the text element 130 on the canvas.
  • the foreground matrix corresponding to the foreground element 110 is a matrix capable of reflecting display information of the foreground element.
  • the display information of the foreground element includes at least one of parameters such as the direction of the foreground element, the color of the foreground element, the size of the foreground element, and the position of the foreground element.
  • the text matrix corresponding to the text element 130 is a matrix capable of embodying display information of the text element.
  • the display information of the text element includes at least one of parameters such as text content, text color, text font size, and text font.
• similarly, the drawing application program in the mobile phone 1 determines the pixels in the canvas corresponding to the foreground matrix.
• the drawing application program in the mobile phone 1 determines the pixels in the canvas corresponding to the text matrix according to the distance d9 from the text element 130 in the text matrix to the upper border of the canvas, the distance d10 to the lower border, the distance d11 to the left border, and the distance d12 to the right border.
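The border distances d1 to d4 (and likewise d9 to d12) determine a rectangle of canvas pixels; a sketch of that mapping, assuming an integer pixel canvas with the origin at its top-left corner:

```python
def element_rect(canvas_w, canvas_h, d_top, d_bottom, d_left, d_right):
    """Return the pixel rectangle (x0, y0, x1, y1) occupied by an element
    whose distances to the canvas borders are given, as for d1..d4 of the
    background element or d9..d12 of the text element."""
    x0, x1 = d_left, canvas_w - d_right
    y0, y1 = d_top, canvas_h - d_bottom
    if x0 >= x1 or y0 >= y1:
        raise ValueError("border distances leave no drawable area")
    return (x0, y0, x1, y1)
```

The drawing program can then rasterize the element's matrix into exactly the pixels of this rectangle.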
  • the foreground element 110 and the text element 130 are located side by side in front of the background element 120 , and both the foreground element 110 and the text element 130 are non-transparent elements.
• when viewed along the opposite direction of the Z axis, the background element 120 includes an overlapping area 121 overlapping with the foreground element 110, an overlapping area 122 overlapping with the text element 130, and a non-overlapping area 123, so the final display effect of the card 100 is the overlay effect of the foreground element 110, the text element 130, and the non-overlapping area 123 of the background element 120.
• if the foreground element 110 and the text element 130 are transparent or translucent elements, the final display effect of the card 100 is a simple superposition effect of the foreground element 110, the background element 120, and the text element 130.
• icons differ from cards in that icons only include foreground elements and background elements; compared with cards, icons do not involve the processing of text elements, which will not be repeated here.
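The layer overlay behavior described above, where opaque foreground and text pixels hide the background while transparent or translucent pixels blend with it, corresponds to ordinary back-to-front alpha compositing; the sketch below assumes RGBA pixels with alpha in the range 0 to 1 and is not the patented drawing pipeline.

```python
def composite(layers):
    """Alpha-composite pixel lists given back to front, e.g.
    [background, foreground, text]; each pixel is (r, g, b, a), a in 0..1."""
    out = layers[0]
    for layer in layers[1:]:
        blended = []
        for (r1, g1, b1, a1), (r2, g2, b2, a2) in zip(out, layer):
            # "Over" operator: the later (front) layer covers the earlier one.
            a = a2 + a1 * (1 - a2)
            if a == 0:
                blended.append((0, 0, 0, 0.0))
                continue
            r = (r2 * a2 + r1 * a1 * (1 - a2)) / a
            g = (g2 * a2 + g1 * a1 * (1 - a2)) / a
            b = (b2 * a2 + b1 * a1 * (1 - a2)) / a
            blended.append((r, g, b, a))
        out = blended
    return out
```

With fully opaque front pixels (a = 1) the background is hidden, matching the overlap areas 121 and 122; with a < 1 the result is the blended superposition described for translucent elements.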
• Fig. 10 shows a schematic diagram of a card update principle in the present application, in which the card 100 updates the foreground element 110 to obtain an updated card 100a.
  • the updating principle of the card in the mobile phone 1 will be described below with reference to FIG. 9( a ) to FIG. 10 .
  • the user adjusts the display direction of the foreground element in the card through a touch operation.
  • the mobile phone 1 obtains a foreground matrix corresponding to the foreground element, a background matrix corresponding to the background element, and a text matrix corresponding to the text element.
  • the mobile phone 1 determines the adjustment parameters according to the type of the user's touch operation on the display element, and determines the adjustment variable of the display element based on the user's touch operation on the display element, and then adjusts the foreground matrix acquired by the mobile phone 1 according to the adjustment parameter and the adjustment variable, to get the adjusted foreground matrix.
  • the mobile phone retains the obtained background matrix and the obtained text matrix.
  • the adjustment parameters include camera parameters for adjusting the display direction of the foreground elements, color parameters for adjusting the color of the foreground elements, detail parameters for adjusting the details of the foreground elements, and so on.
  • the camera parameters include at least one of parameters such as a rotation angle around the X axis, a rotation angle around the Y axis, and a rotation angle around the Z axis.
  • the mobile phone 1 invokes a preset function corresponding to the foreground matrix, and inputs adjustment variables corresponding to adjustment parameters into the preset function to modify the foreground matrix.
  • the mobile phone 1 obtains the updated foreground element 110a according to the adjusted foreground matrix, and obtains the background element 120 corresponding to the background matrix and the text element 130 corresponding to the text matrix.
  • the mobile phone 1 obtains the updated display element 100 a by superimposing the updated foreground element 110 a , background element 120 and text element 130 .
  • the mobile phone 1 generates a composite matrix according to the adjusted foreground matrix, the reserved background matrix and the reserved text matrix, and then draws the updated card 100a on the canvas according to the composite matrix.
  • the above update scheme only shows an example of adjusting the foreground elements by the mobile phone 1, and it can be understood that the above update scheme can also be used to adjust the background elements and text elements in the card. In addition, the above updating scheme can also be used for updating the foreground element and the background element in the icon, which will not be repeated here.
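The update flow above (determine the adjustment parameter, call the preset function with the adjustment variable to modify only the foreground matrix, retain the background and text matrices, then redraw) might be sketched as below; the dictionary layout and the `scale` preset function are illustrative assumptions, not the application's actual functions.

```python
def update_card(card, adjustment_param, adjustment_var, preset_functions):
    """card: dict with 'foreground', 'background', and 'text' matrices.
    preset_functions maps an adjustment-parameter name (e.g. a camera
    parameter) to a function that modifies a matrix."""
    adjust = preset_functions[adjustment_param]   # preset function lookup
    card['foreground'] = adjust(card['foreground'], adjustment_var)
    # The background and text matrices are retained unchanged; all three
    # would then be composited and redrawn on the canvas.
    return card

# Illustrative preset function: uniform scaling of a matrix.
def scale(matrix, factor):
    return [[v * factor for v in row] for row in matrix]
```

The same pattern would apply when adjusting background or text matrices instead, as noted above.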
  • Fig. 11 shows a flowchart of the human-computer interaction scheme of display elements. As shown in Figure 11, the human-computer interaction scheme of display elements in this application specifically includes the following steps:
  • the touch sensor in the screen of the mobile phone 1 acquires touch data generated by a user's touch operation on a display element, and extracts touch parameters in the touch data.
  • the user touch operation refers to an operation formed when the user touches the adjustment interface.
  • the adjustment interface may be the display desktop of the mobile phone 1, or a specific interface entered in response to a specific operation of the user.
  • a specific operation can be a single click on a display element, a double click on a display element, or a long press on a display element, etc. It can be understood that these are only examples; the specific operation can also be another operation form combining parameters such as touch position, touch time, touch pressure, touch frequency and touch area, which is not specifically limited in this application.
  • the touch data is raw data generated according to a user's touch operation, such as coordinates of touch points arranged in a touch order.
  • the touch parameter refers to a relevant parameter extracted from the touch data or generated based on the touch data, which can reflect the adjustment mode that the user wants to realize.
  • the touch parameters include coordinates of a start point, coordinates of an end point, a touch direction, a touch track, and a length of the touch track.
  • the start point refers to the first position point at which the user's finger touches the adjustment interface, and the end point refers to the last position point before the user's finger leaves the adjustment interface.
  • the touch track is the line connecting all the position points passed by the user's finger while sliding from the start point to the end point, and the extension direction of the touch track is the direction from the start point to the end point along the track.
  • the extension direction of the touch track is used to represent the rotation direction of the display element, and the length of the touch track is used to represent the rotation angle of the display element. It can be understood that the touch direction, the touch track and the length of the touch track correspond one-to-one with the camera parameters.
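  • A minimal sketch of S1101, deriving the touch parameters described above (start point, end point, touch direction, touch track, track length) from raw touch data, i.e. touch-point coordinates arranged in touch order. All names are illustrative assumptions.

```python
# Sketch: extract touch parameters from an ordered list of touch points.
import math

def extract_touch_parameters(points):
    start, end = points[0], points[-1]
    # track length: sum of distances between consecutive touch points
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    # touch direction: from the start point toward the end point
    direction = (end[0] - start[0], end[1] - start[1])
    return {"start": start, "end": end, "direction": direction,
            "track": points, "length": length}

# a finger sliding right 3 units, then down 4 units
params = extract_touch_parameters([(0, 0), (3, 0), (3, 4)])
```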
  • S1102 The mobile phone 1 determines the type of the touch operation according to the touch parameters and based on preset rules.
  • the mobile phone 1 obtains the type of the touch operation according to the attributes of the display elements and the touch parameters.
  • the attributes of the display elements include the quantity of the display elements, the application programs corresponding to the display elements, and the like.
  • the type of the touch operation refers to an adjustment manner of the display element corresponding to the user's touch operation.
  • the type of touch operation includes adjusting the display direction of a single display element, adjusting the display directions of a batch of display elements, covering another display element with one display element, exchanging the display positions of two display elements, adjusting the sizes of two display elements, and so on.
  • the preset rules include: when the start position and the end position of the user's touch operation are on the same display element, the type of the touch operation is adjusting the display direction of a single display element; when the start position is in the area where a display element corresponding to an application is located and the end position is on another display element corresponding to the same application, the type is covering and replacing the other display element with the first one; when the start position is in the area where a display element is located, the end position is on another adjacent display element, and the end position crosses the center line of the other display element, the type is exchanging the display positions of the two display elements; when the start position is in the middle area between two adjacent display elements and the end position is on one of them, the type is adjusting the sizes of the two display elements.
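  • The preset rules above can be sketched as a classifier. In this sketch, display elements are modeled as axis-aligned rectangles; the hit-testing, the "middle area" test and the centre-line test are simplified assumptions, and all names are illustrative.

```python
# Sketch: classify the touch-operation type from start/end coordinates.
def inside(rect, p):
    left, top, right, bottom = rect
    return left <= p[0] <= right and top <= p[1] <= bottom

def classify_touch(elements, start, end):
    # elements: {name: {"rect": (l, t, r, b), "app": app_name}}
    hit_s = next((n for n, e in elements.items() if inside(e["rect"], start)), None)
    hit_e = next((n for n, e in elements.items() if inside(e["rect"], end)), None)
    if hit_s is not None and hit_s == hit_e:
        return "adjust single element direction"
    if hit_s is not None and hit_e is not None:
        if elements[hit_s]["app"] == elements[hit_e]["app"]:
            return "cover and replace"
        # different applications: swap once the end point crosses the
        # centre line of the other (adjacent) element
        left, _, right, _ = elements[hit_e]["rect"]
        if end[0] >= (left + right) / 2:
            return "exchange display positions"
    if hit_s is None and hit_e is not None:
        # start point lies in the middle area between adjacent elements
        return "adjust sizes of two elements"
    return "unrecognized"

elements = {"call":   {"rect": (0, 0, 10, 10),  "app": "phone"},
            "dialer": {"rect": (0, 20, 10, 30), "app": "phone"},
            "camera": {"rect": (20, 0, 30, 10), "app": "camera"}}
```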
  • S1103 The mobile phone 1 determines the adjustment parameters and adjustment variables of each sub-display element in the display element according to the touch parameter and the type of touch operation.
  • Each sub-display element can be a foreground element and a background element in the icon. Each sub-display element can also be a foreground element, a background element and a text element in the card.
  • the adjustment parameter refers to the parameter to be adjusted in the display information to realize the adjustment effect determined according to the type of the touch operation.
  • the adjustment parameter and the adjustment variable are also used to adjust the display model corresponding to each sub-display element.
  • the adjustment parameter may be the display direction, specifically including at least one of an X-axis rotation angle parameter, a Y-axis rotation angle parameter and a Z-axis rotation angle parameter.
  • the adjustment variable refers to the amount by which each adjustment parameter corresponding to the type of the touch operation is to be changed, acquired according to the touch operation.
  • S1104 The mobile phone 1 correspondingly adjusts the sub-display elements in the display element according to the adjustment parameters and adjustment variables of each sub-display element, and obtains the adjusted display element from the adjusted sub-display elements.
  • the matrices corresponding to the display elements are respectively distributed in multi-layer canvases arranged in a preset sequence, and each layer of canvas in the multi-layer canvas respectively displays the sub-display elements corresponding to the matrices set in the layer.
  • each sub-display element in the display element has its own corresponding matrix. The mobile phone calls the preset function, and inputs one or more adjustment variables of the adjustment parameters of the sub-display elements into the preset function to modify the matrices corresponding to these sub-display elements. The mobile phone then sets the adjusted matrices in the corresponding canvases of the multi-layer canvas, so that the re-set canvases can update their sub-display elements according to the adjusted matrices.
  • the multi-layer canvas generates adjusted display elements according to the updated sub-display elements and the retained sub-display elements.
  • the display elements include foreground elements and background elements, and the foreground elements are set on the front canvas, and the background elements are set on the back canvas, and the adjustment parameter is the rotation angle of the foreground elements around the X axis.
  • the mobile phone inputs the value of the rotation angle of the foreground element around the X axis in the preset function, so as to modify the foreground matrix corresponding to the foreground element, and obtain the adjusted foreground matrix. Then, the mobile phone sets the adjusted foreground matrix to the front canvas.
  • the front canvas generates adjusted foreground elements based on the adjusted foreground matrix.
  • the front canvas and the back canvas generate adjusted display elements according to the adjusted foreground elements and background elements.
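  • The two-layer canvas example above can be modeled with a toy sketch: re-setting the adjusted foreground matrix on the front canvas regenerates only the foreground element, while the back canvas keeps the background element unchanged. Class and field names here are illustrative assumptions, not a real canvas API.

```python
# Sketch: a two-layer canvas where only the re-set layer is redrawn.
class CanvasLayer:
    def __init__(self, name, matrix):
        self.name, self.matrix = name, matrix
        self.redrawn = False

    def set_matrix(self, matrix):
        # re-setting a matrix marks this layer for redrawing
        self.matrix, self.redrawn = matrix, True

def compose(layers):
    # layers are stacked back-to-front to form the display element
    return [(layer.name, layer.matrix) for layer in layers]

back = CanvasLayer("background", "bg_matrix")
front = CanvasLayer("foreground", "fg_matrix")
front.set_matrix("fg_matrix_rotated_around_X")  # adjusted foreground matrix
display_element = compose([back, front])
```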
  • each sub-display element in the display element has its own corresponding matrix.
  • the matrices corresponding to each sub-display element are sequentially set on the canvas according to a preset sequence.
  • the mobile phone calls the preset function, and inputs one or more adjustment variables of the adjustment parameters of the sub-display elements into the preset function, so as to modify the matrices corresponding to these sub-display elements.
  • the mobile phone sets the adjusted matrix and the reserved matrix respectively in the canvas according to the preset sequence, so that the canvas can update the display elements according to the adjusted matrix.
  • the display elements include background elements, foreground elements and text elements, and the foreground elements and text elements are arranged side by side on the upper layer of the background elements. That is, the matrix corresponding to each sub-display element in the display element includes a background matrix corresponding to the background element, a foreground matrix corresponding to the foreground element, and a text matrix corresponding to the text element, and the foreground matrix and the text matrix are set above the background matrix. It can be understood that the matrix in the upper layer above refers to the matrix applied to the canvas later.
  • the background matrix is first applied to the canvas, and then the foreground matrix and text matrix are applied to the canvas, and the display area of the foreground matrix and text matrix on the canvas is at least partially coincide.
  • the foreground matrix and the text matrix are completely displayed on the canvas, while the background matrix is displayed only in the area that does not overlap with the foreground matrix and the text matrix. It can be understood that, in the case where the foreground matrix and the text matrix do not overlap, the order in which they are applied to the canvas is not specifically limited herein.
  • each sub-display element in the display element has its own corresponding matrix.
  • the mobile phone processes the matrices corresponding to each sub-display element according to the preset processing logic to obtain a composite matrix after all the sub-display elements are superimposed, and the mobile phone sets the composite matrix in the canvas.
  • the mobile phone calls the preset function, and inputs one or more adjustment variables of the adjustment parameters of the sub-display elements into the preset function, so as to modify the matrices corresponding to these sub-display elements.
  • after obtaining the adjusted matrices corresponding to these sub-display elements, the mobile phone processes the adjusted matrices and the retained matrices according to the preset processing logic to obtain an adjusted composite matrix in which all sub-display elements are superimposed, and sets the adjusted composite matrix in the canvas so that the canvas can update the display element according to it.
  • the mobile phone can adjust the placement position of the matrix in the canvas by setting position parameters in the canvas, and then adjust the relative positions of elements corresponding to multiple matrices. For example: the mobile phone calls the translate(x,y) function, and determines the translated position of the element corresponding to the matrix according to x and y in the function.
  • the mobile phone can also adjust the display size of the elements corresponding to the matrix in the canvas by setting zoom parameters in the canvas.
  • the zoom parameter can be a zoom reference point (for example, any coordinate point in the canvas) and a zoom ratio (for example, 0.5). It can be understood that since the elements presented through the canvas are two-dimensional images, the zoom parameter can be a single set of data when the two dimensions are locked to each other, or two sets of data when the two dimensions are not locked.
  • the zoom parameter can also be the distance between the boundary of the element corresponding to the matrix and the boundary of the canvas. For example, the mobile phone calls the canvas.clipRect(left, top, right, bottom) function and, according to the position boundaries left, top, right and bottom in the function, clips the scaled border of the element corresponding to the matrix, so as to adjust the display size of the element in the canvas.
  • the mobile phone can also adjust the display color of the elements corresponding to the matrix in the canvas by setting color parameters in the canvas.
  • the mobile phone can also adjust the details of the elements corresponding to the matrix in the canvas by setting detail parameters in the canvas.
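  • The canvas parameters discussed above (a translate offset, a zoom about a reference point, and clip-rect-style bounds) can be sketched as plain coordinate arithmetic. The names loosely mirror translate(x, y) and clipRect(left, top, right, bottom) but deliberately do not reproduce any real canvas API; this is an illustrative sketch.

```python
# Sketch: translate, zoom-about-a-point, and clip-rect intersection.
def translate_point(p, x, y):
    return (p[0] + x, p[1] + y)

def zoom_point(p, ref, ratio):
    # scale p about the zoom reference point by the zoom ratio
    return (ref[0] + (p[0] - ref[0]) * ratio,
            ref[1] + (p[1] - ref[1]) * ratio)

def clip_rect(rect, left, top, right, bottom):
    # intersect an element's bounds with the clip bounds
    l, t, r, b = rect
    return (max(l, left), max(t, top), min(r, right), min(b, bottom))

moved = translate_point((10, 10), 5, -5)
scaled = zoom_point((10, 10), (0, 0), 0.5)
clipped = clip_rect((0, 0, 100, 100), 10, 10, 90, 90)
```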
  • S1105 The mobile phone 1 hides the display element, and displays the updated display element.
  • the mobile phone 1 calls the invalidate function to realize the interactive animation between the display element and the updated display element.
  • the user's touch operation can also be used to make the mobile phone 1 adjust the display element according to its position in the display interface 11 of the mobile phone 1 and a reference position set by the user, so that the display element is aligned to face the reference position.
  • the user's touch operation is used to generate an automatic control instruction, and the automatic control instruction can realize the adjustment of at least one display element without providing specific touch data by the user.
  • the above-mentioned adjustment scheme for display elements may be used to adjust one of the display elements in the display interface 11 of the mobile phone 1 .
  • the above-mentioned adjustment scheme for display elements can also be used to simultaneously adjust a group of display elements in the display interface 11 of the mobile phone 1 .
  • the above adjustment solution is used to adjust the display direction of a group of display elements to be the same direction.
  • the above adjustment solution is also used to adjust the display direction of a group of display elements to face a certain position, wherein the adjusted display directions of the group of display elements are all different.
  • the type of the touch operation is the rotation of a single display element, where the rotation of the display element can be a rotation around at least one of the X axis, the Y axis and the Z axis.
  • the type of the touch operation is that the foreground element in the display element rotates around the X axis by a preset angle.
  • the display element is a camera icon 104
  • the camera icon 104 includes a camera foreground element 1041 and a camera background element 1042 .
  • the following description will be made by taking the touch track in the touch parameters located in the adjustment interface 12 of the camera foreground element 1041 in the camera icon 104 in FIG. 5 as an example.
  • the touch direction is used to represent the rotation axis and direction of the camera foreground element 1041, and the length of the touch track is used to represent the size of the preset angle.
  • the camera foreground element 1041 rotates around the X axis and the Y axis
  • the length of the projection S22 of the touch track S2 on the X axis is used to represent the size of the preset angle by which the camera foreground element 1041 rotates around the Y axis, and the length of the projection S21 of the touch track S2 on the Y axis is used to represent the size of the preset angle by which it rotates around the X axis.
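  • The projection rule above can be sketched directly: the component of the touch track on the Y axis drives rotation around the X axis, and the component on the X axis drives rotation around the Y axis. The degrees-per-unit gain here is an assumed constant, not a value from the patent.

```python
# Sketch: map a straight touch track to X/Y rotation angles.
DEG_PER_UNIT = 0.5  # illustrative sensitivity, an assumption

def track_to_rotation(start, end):
    dx = end[0] - start[0]   # projection on the X axis (cf. S22)
    dy = end[1] - start[1]   # projection on the Y axis (cf. S21)
    angle_around_x = dy * DEG_PER_UNIT   # sign encodes cw/ccw
    angle_around_y = dx * DEG_PER_UNIT
    return angle_around_x, angle_around_y

around_x, around_y = track_to_rotation((0, 0), (40, 60))
```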
  • the initial state of the camera icon 104 is facing the front side; the user's finger touches a point in the adjustment interface 12 of the mobile phone 1 and slides along the Y axis toward the lower side of the mobile phone 1. The length of the touch track is used to represent the size of the preset angle by which the camera foreground element 1041a rotates around the X axis, and the camera foreground element 1041a rotates clockwise around the X axis (viewed from the left).
  • the initial state of the camera icon 104 is facing the front side; the user touches a point in the adjustment interface 12 of the mobile phone 1 and slides along the Y axis toward the upper side of the mobile phone 1. The length of the touch track is used to represent the size of the preset angle by which the camera foreground element 1041b rotates around the X axis, and the camera foreground element 1041b rotates counterclockwise around the X axis (viewed from the left).
  • the mobile phone adjusts the display parameters in the foreground matrix corresponding to the camera foreground element 1041 according to the rotation axis X axis, rotation direction and preset angle of the camera foreground element 1041 , thereby realizing the adjustment of the camera foreground element 1041 .
  • the touch track is parallel to the X axis
  • the type of the touch operation is that the foreground element in the display element rotates around the Y axis by a preset angle.
  • the initial state of the camera icon 104 is towards the front side.
  • the user's finger touches a point in the adjustment interface 12 of the mobile phone 1 and slides along the X axis toward the left side of the mobile phone 1. The length of the touch track is used to represent the size of the preset angle by which the camera foreground element 1041c rotates around the Y axis, and the camera foreground element 1041c rotates clockwise around the Y axis (viewed from the upper side).
  • the initial state of the camera icon 104 is facing the front side; the user's finger touches a point in the adjustment interface 12 of the mobile phone 1 and slides along the X axis toward the right side of the mobile phone 1. The length of the touch track is used to represent the size of the preset angle by which the camera foreground element 1041d rotates around the Y axis, and the camera foreground element 1041d rotates counterclockwise around the Y axis (viewed from the upper side).
  • the mobile phone adjusts the display parameters in the foreground matrix corresponding to the camera foreground element 1041 according to the rotation axis Y axis, rotation direction, and preset angle of the camera foreground element 1041 , thereby realizing the adjustment of the camera foreground element 1041 .
  • the touch track is a straight line in the XOY plane that is not parallel to the X axis or the Y axis, and the type of the touch operation is that the foreground element in the display element rotates around the X axis and the Y axis by preset angles respectively.
  • the initial state of the camera icon 104 is facing the front side; the user's finger touches a point in the adjustment interface 12 of the mobile phone 1 and moves to another point along a straight line in the XOY plane. The component of the touch track on the Y axis is used to represent the size of the first preset angle by which the camera foreground element 1041e/1041f/1041g/1041h rotates around the X axis; similarly, the component of the touch track on the X axis is used to represent the size of the second preset angle by which the camera foreground element 1041e/1041f/1041g/1041h rotates around the Y axis.
  • the mobile phone adjusts the display parameters in the foreground matrix corresponding to the camera foreground element 1041 according to the rotation axis X axis and Y axis, rotation direction and preset angle of the camera foreground element 1041, thereby realizing the adjustment of the camera foreground element 1041.
  • the touch track is an arc centered on the center point of the element in the XOY plane, and the type of touch operation is that the foreground element in the display element rotates at a preset angle around the Z axis.
  • the initial state of the camera icon 104 is the camera icon in area i in the figure; the user's finger slides clockwise around the Z axis, and the angle corresponding to the touch track is used to represent the size of the preset angle.
  • the initial state of the camera icon 104 is the camera icon in area i in the figure.
  • the mobile phone adjusts the display parameters in the foreground matrix corresponding to the camera foreground element 1041 according to the rotation axis Z axis, rotation direction and preset angle of the camera foreground element 1041 , thereby realizing the adjustment of the camera foreground element 1041 .
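  • The arc-track case above can be sketched as follows: for a track that is an arc centred on the element's centre point, the Z-axis rotation angle is the signed angle swept from the start point to the end point. The names and the normalisation choice are illustrative assumptions.

```python
# Sketch: derive the Z-axis rotation angle from an arc-shaped track.
import math

def arc_rotation_angle(center, start, end):
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    swept = math.degrees(a1 - a0)
    # normalise to [-180, 180); the sign encodes the sliding direction
    return (swept + 180.0) % 360.0 - 180.0

# a quarter-circle arc around the element centre at the origin
angle = arc_rotation_angle((0, 0), (1, 0), (0, 1))
```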
  • the user's touch operation may also be setting a position point indicating the orientation of the foreground element in the display element.
  • the user's finger touches a position point, and the camera foreground element 1041 in the camera icon 104 is oriented to face that point.
  • the adjustment object of the touch operation is not limited to foreground elements; it can also be background elements and text elements, or at least two of foreground elements, background elements and text elements, which is not specifically limited in this application.
  • the user's finger touches one point on the adjustment interface 12 of the mobile phone 1 and slides towards another point on the adjustment interface 12 of the mobile phone 1, which may include the above three operation trajectories.
  • the mobile phone 1 receives a confirmation instruction input by the user, for example, the user clicks the confirmation button in the adjustment interface 12 to complete the adjustment of the display elements, as shown in Figure 13(l).
  • the mobile phone 1 enters the adjustment interface 12 for adjusting display elements, and the user clicks the batch adjustment button 105.
  • a batch adjustment button is displayed in the adjustment interface 12
  • multiple display elements are displayed in the adjustment interface 12 of the mobile phone 1; the user selects the call icon 101, the information icon 102, the browser icon 103 and the camera icon 104 among the multiple display elements, and then enters the batch adjustment process for the call icon 101, the information icon 102, the browser icon 103 and the camera icon 104, as shown in Figure 14(b).
  • the call icon 101, the information icon 102, the browser icon 103 and the camera icon 104 in the adjustment interface 12 are adjusted based on their own position and the reference position respectively.
  • the mobile phone 1 receives a confirmation instruction input by the user, for example, the user clicks the confirmation button in the adjustment interface 12 to complete the adjustment of the display elements and obtain the call icon 101y, the information icon 102y, the browser icon 103y and the camera icon 104y shown in Fig. 14(e).
  • the call icon 101y, the information icon 102y, the browser icon 103y and the camera icon 104y are the same as the call icon 101, the information icon 102, the browser icon 103 and the camera icon 104 in Fig. 14(d), except that the display direction of the foreground element has changed.
  • the display direction of the foreground element and the display direction of the background element may also be adjusted at the same time. For example, after the user touches the reference position in the display interface 12, both display directions are adjusted, as shown in the figure.
  • the adjustment ranges of the display direction of the foreground element and the display direction of the background element may be the same or different, which is not specifically limited in this application.
  • not only can batch adjustment realize the simultaneous adjustment of foreground elements and background elements; individually adjusting a display element, exchanging the display positions of two display elements, adjusting the display sizes of two adjacent display elements, and covering one display element with another can do so as well. Various types of touch operations can adjust at least one display effect of at least one of the foreground element, background element and text element, which will not be repeated here.
  • after introducing the adjustment scheme for the display direction of display elements, the human-computer interaction scheme between two display elements will be introduced in detail below.
  • a display element whose display direction has been adjusted has a direction attribute. Based on this, the following takes the case in which all display elements in the display interface of the mobile phone 1 have had their directions adjusted as an example to describe the human-computer interaction scheme between two display elements.
  • the two display elements may be the same type of display elements, for example, both display elements are cards, wherein the two display elements may be the same type of cards, or may be different types of cards, or, Both display elements are icons.
  • the two display elements can also be different types of display elements, for example, one of the display elements is a card, and the other display element is an icon.
  • the last adjustment of the display orientation of the contact A card 203 occurs after the last adjustment of the display orientation of the contact B card 204, and the last adjustment of the display orientation of the contact A card 203 is a batch adjustment.
  • when the touch track in the touch parameters is located in the areas where two display elements corresponding to the same application are located, and the coordinates of the start point and the end point are respectively located in the areas where the two display elements are located, the type of the touch operation is covering one display element with the other.
  • the application program is a contact
  • the display elements include a contact A card 203 and a contact B card 204 .
  • the human-computer interaction solution between two display elements corresponding to the same application program will be described below.
  • when the user touches the contact B card 204 and drags it onto the contact A card 203, the mobile phone 1 acquires the first location information of the contact A card 203. Then, the mobile phone 1 obtains the second display information of the contact B card 204, wherein the second display information includes the second location information of the contact B card 204. The mobile phone 1 replaces the second location information in the second display information with the first location information to obtain third display information. The mobile phone 1 sets the matrix corresponding to the third display information to the canvas, so as to complete the drawing of the new contact B card 204a through the canvas. Furthermore, as shown in the figure, the mobile phone 1 updates the contact A card 203 to the contact B card 204a, and hides the contact B card 204 and the contact A card 203.
  • the location information includes location parameters, size parameters, direction parameters, etc. for determining the display of the contact A card 203.
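  • The covering operation described above can be sketched as a location swap: the covering card keeps its own display content (second display information) but takes over the covered card's location information (first location information). Field names here are illustrative assumptions.

```python
# Sketch: one display element covers another by adopting its location.
def cover(covered, covering):
    new_info = dict(covering)                    # second display information
    new_info["location"] = covered["location"]   # first location information
    # both original cards are hidden; the new card is displayed
    hidden = {covered["name"], covering["name"]}
    return new_info, hidden

contact_a = {"name": "contact_A_203", "content": "card A",
             "location": {"pos": (0, 0), "size": (4, 2), "dir": 0}}
contact_b = {"name": "contact_B_204", "content": "card B",
             "location": {"pos": (0, 3), "size": (4, 2), "dir": 15}}
new_card, hidden = cover(contact_a, contact_b)
```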
  • the application program is a contact
  • the display elements include a contact A card 203 and a contact C card 205.
  • the user touches the contact C card 205 and drags the contact C card 205 to the contact A card 203.
  • the following will take the contact C card 205 covering the contact A card 203 as an example to illustrate the interaction scheme of two display elements corresponding to the same application.
  • the mobile phone 1 acquires the first location information of the contact A card 203 .
  • the mobile phone 1 obtains the second display information of the contact C card 205 , wherein the second display information includes the second location information of the contact C card 205 .
  • the mobile phone 1 replaces the second location information in the second display information with the first location information to obtain third display information.
  • the mobile phone 1 sets the matrix corresponding to the third display information to the canvas, so as to complete the drawing of the new contact C card 205a through the canvas.
  • the mobile phone 1 updates the contact A card 203 to the contact C card 205a, and hides the contact C card 205 and the contact A card 203.
  • the last adjustment of the display direction of the alarm clock card 201 and the last adjustment of the display direction of the remote control card 202 are both batch adjustments, but are not limited to the same batch adjustment.
  • the two display elements correspond to different applications.
  • when the coordinates of the start point in the touch parameters are located in the area where one display element is located, and the coordinates of the end point are located in the area where another display element is located, the type of the touch operation is exchanging the display positions of the display elements corresponding to two different applications.
  • one display element is an alarm clock card 201
  • the other display element is a remote control card 202 .
  • the user touches the alarm clock card 201 and drags the alarm clock card 201 onto the remote control card 202 .
  • the following takes the case in which the alarm clock card 201 and the remote control card 202 exchange positions, and the last adjustments of the display directions of the alarm clock card 201 and the remote control card 202 are both batch adjustments, as an example to illustrate the interaction scheme of two display elements corresponding to different applications.
  • the mobile phone 1 acquires first display information of the alarm clock card 201 , wherein the first display information includes first location information corresponding to the alarm clock card 201 .
  • the mobile phone 1 obtains the second display information of the remote control card 202 , wherein the second display information includes the second location information corresponding to the remote control card 202 .
  • the remote control card 202 includes text elements related to the remote control, but these text elements are not displayed because of the small size of the remote control card 202. Since the alarm clock card 201 displays text elements related to the alarm clock, the obtained second location information includes the display position that the text elements in the remote control card 202 should occupy.
  • the mobile phone 1 replaces the first location information in the first display information with the second location information to obtain the third display information, and at the same time replaces the second location information in the second display information with the first location information to obtain the fourth display information.
  • the mobile phone 1 generates the remote controller card 202a after the position exchange according to the third display information, and generates the alarm clock card 201a after the position exchange according to the fourth display information.
  • the first position information includes position parameters, size parameters and direction parameters for determining the position of the remote control card 202, the position of the alarm clock card 201, and the relative position between them.
  • the mobile phone 1 displays the alarm clock card 201a and the remote control card 202a, and hides the alarm clock card 201 and the remote control card 202.
  • the mobile phone 1 can also adaptively adjust the third display information and the fourth display information to obtain Cards that meet user habits.
  • the last display direction of the alarm clock card 201 is adjusted individually, the last display direction of the remote control card 202 is adjusted in a batch, and the alarm clock card 201 and the remote control card 202 exchange positions. When the last adjustment of the display direction of the alarm clock card 201 is later than the last adjustment of the display direction of the remote control card 202, the display direction of the alarm clock card 201a after the exchange remains unchanged, and the display direction of the remote control card 202a after the exchange also remains unchanged.
  • the last display direction of the alarm clock card 201 is adjusted individually, the last display direction of the remote control card 202 is adjusted in a batch, and the alarm clock card 201 and the remote control card 202 exchange positions. When the last adjustment of the display direction of the alarm clock card 201 is earlier than the last adjustment of the display direction of the remote control card 202, the display direction of the alarm clock card 201a after the exchange follows the display direction of the remote control card 202, while the display direction of the remote control card 202a after the exchange remains unchanged.
  • Here, "earlier than" means that the adjustment time of the display direction of display element X is earlier than the adjustment time of the display direction of display element Y, and "later than" means that the adjustment time of the display direction of display element X is later than the adjustment time of the display direction of display element Y.
  • The adjustment time of the display direction of display element X refers to the adjustment time corresponding to the current display direction of display element X, which is not necessarily the moment at which display element X was manually adjusted to that display direction.
  • For example, if the user individually adjusts the display direction of display element X at time t11, the adjustment time of the display direction of display element X is time t11.
  • For another example, if the user adjusts the display directions of display element X and other display elements in a batch at time t21, the adjustment time of the display direction of display element X is time t21.
  • For another example, if the user adjusts the display directions of display element Y and display element Z in a batch at time t31, and display element X then follows the display direction of display element Y at time t32, the adjustment time of the display direction of display element X is time t31.
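The direction-inheritance rule in the two scenarios above can be modelled as follows. This is a sketch under the assumption, consistent with both stated scenarios, that the batch-adjusted card always keeps its direction; the field names `direction` and `adjust_time` are illustrative:

```python
def directions_after_swap(individual_card, batch_card):
    """Decide the display directions after an individually adjusted card and a
    batch-adjusted card exchange positions.

    Each card dict carries 'direction' and 'adjust_time', where adjust_time is
    the time corresponding to the card's *current* display direction.  The
    batch-adjusted card keeps its direction; the individually adjusted card
    keeps its own direction only if its adjustment is the later one, and
    otherwise follows the batch-adjusted card's direction.
    """
    if individual_card["adjust_time"] > batch_card["adjust_time"]:
        individual_dir = individual_card["direction"]  # adjusted later: unchanged
    else:
        individual_dir = batch_card["direction"]       # adjusted earlier: follows
    return individual_dir, batch_card["direction"]
```

With the alarm clock card's direction adjusted later than the remote control card's, both directions survive the swap; with the order reversed, the alarm clock card follows the remote control card, as in the two scenarios above.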
  • The two display elements may correspond to different applications. If the starting-point coordinates in the touch parameters are located outside the areas where two adjacent display elements are located, but between those two areas, and the end-point coordinates are located in the area where one of the display elements is located, the type of the corresponding touch operation is an adjustment of the size of the display elements corresponding to the two adjacent, different applications.
  • one display element is the alarm clock card 201a
  • the other display element is the remote control card 202a
  • the user touches the area between the remote control card 202a and the alarm clock card 201a, and drags to the area where the alarm clock card 201a is located.
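The classification rule above can be sketched with simple rectangle tests (the `(x, y, w, h)` rectangle representation and the horizontal-adjacency assumption are illustrative, not the patent's actual geometry):

```python
def classify_touch(start, end, rect_a, rect_b):
    """Classify a touch whose start point lies between two adjacent card areas.

    A rect is (x, y, w, h).  If the start point is outside both cards but
    inside the gap between them, and the end point lands inside one of the
    cards, the gesture is treated as a resize of the two adjacent cards.
    """
    def inside(p, r):
        x, y, w, h = r
        return x <= p[0] <= x + w and y <= p[1] <= y + h

    def in_gap(p):
        # Gap test for horizontally adjacent cards: x lies strictly between
        # the right edge of rect_a and the left edge of rect_b.
        ax, ay, aw, ah = rect_a
        bx, by, bw, bh = rect_b
        return ax + aw < p[0] < bx and min(ay, by) <= p[1] <= max(ay + ah, by + bh)

    if (not inside(start, rect_a) and not inside(start, rect_b)
            and in_gap(start) and (inside(end, rect_a) or inside(end, rect_b))):
        return "resize-adjacent-cards"
    return "other"
```

A drag that starts in the gap between the two cards and ends inside the alarm clock card's area, as described above, classifies as `"resize-adjacent-cards"`.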
  • In the following, adjusting the sizes of the remote control card 202 and the alarm clock card 201 is taken as an example to illustrate another human-computer interaction scheme for updating two display elements corresponding to different applications.
  • the mobile phone 1 obtains the first cropping information and the first location information of the remote control card 202a.
  • the mobile phone 1 acquires the second cropping information and the second location information of the alarm clock card 201a.
  • the mobile phone 1 adjusts the first cropping information, the second cropping information and the second location information respectively according to the user's touch parameters.
  • the mobile phone 1 adjusts the rotation angle of the foreground element in the alarm clock card 201a according to the touch parameters of the user.
  • the mobile phone 1 adjusts the first position information of the remote control card 202a according to the adjusted first cropping information, and adjusts the second position information of the alarm clock card 201a according to the adjusted second cropping information and the adjusted second position information.
  • the mobile phone 1 updates the remote control card 202a according to the adjusted first cropping information and the adjusted first location information, and updates the alarm clock card 201a according to the adjusted second cropping information and the adjusted second location information.
  • The display direction of the display elements is adjusted to face the front by default. It can be understood that, although a display element's display direction is adjusted to face the front, it is still considered a display element whose display direction has been adjusted.
  • The update scheme of adjusting the background element P1 of the remote control card 202a and the background element Q1 of the alarm clock card 201a based on the touch track will be described in detail below with reference to FIGS. 15(a) to 15(d).
  • The shortest distance between the background element P1 of the remote control card 202a and the background element Q1 of the alarm clock card 201a is d.
  • As shown in Figure 15(b), when the user touches the area between the background element P1 and the background element Q1, the background element P1 expands to the contour P0, the largest size that the background element P1 can present when the background element Q1 is at its smallest size.
  • When the user drags from the starting touch position toward the background element Q1, the mobile phone 1 generates intermediate cropping information dP2 and intermediate cropping information dQ2 according to the user's actual touch position, draws the intermediate element P2 based on dP2, and draws the intermediate element Q2 based on dQ2.
  • When the user drags to the end position, the mobile phone 1 generates the first cropping information and the second cropping information according to the user's end position, draws the updated background element P3 based on the first cropping information, and draws the updated background element Q3 based on the second cropping information.
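The touch-driven cropping above can be sketched with a deliberately simplified one-dimensional model along the drag axis. The shared-boundary assumption, the coordinates, and the function name are illustrative; the patent does not specify the exact formula:

```python
def intermediate_crops(touch_x, p_left, q_right, gap=0.0):
    """1-D sketch of generating cropping information from the touch position.

    The touch position is taken as the shared boundary between background
    elements P and Q: P occupies [p_left, boundary] and Q occupies
    [boundary + gap, q_right].  Each element's cropping amount is the length
    cut away from the maximal contour [p_left, q_right], so dragging toward
    Q crops more off Q and less off P.
    """
    boundary = min(max(touch_x, p_left), q_right - gap)
    d_p = (q_right - p_left) - (boundary - p_left)            # cropped off P
    d_q = (q_right - p_left) - (q_right - (boundary + gap))   # cropped off Q
    return d_p, d_q
```

In this model, recomputing `intermediate_crops` on every move event yields the intermediate cropping information (dP2, dQ2), and the values at the drag's end position play the role of the first and second cropping information.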
  • The mobile phone 1 adjusts the matrix corresponding to the remote control card 202a according to the adjusted first cropping information and the adjustment parameters corresponding to the adjusted first position information. For example, the mobile phone 1 adjusts the matrix corresponding to the background element in the remote control card 202a according to the first cropping information, and adjusts the matrix corresponding to the remote control text element in the remote control card 202a according to the first position information.
  • The mobile phone 1 sets the matrix corresponding to the adjusted remote control background element and the matrix corresponding to the adjusted remote control text element to the canvas, so that the adjusted remote control card 202b is displayed on the canvas.
  • The mobile phone 1 adjusts the matrix corresponding to the alarm clock card 201a according to the adjustment parameters corresponding to the adjusted second cropping information and the adjusted second position information, and sets the adjusted matrix to the canvas, so that the canvas displays the adjusted alarm clock card 201b.
  • the mobile phone 1 hides the remote control card 202a and the alarm clock card 201a.
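On Android, the transform "set to the canvas" for each element is a 2x3 affine matrix. A minimal Python model of applying such a matrix to a point is shown below; the concrete scale and translation values are illustrative, not taken from the patent:

```python
def apply_affine(matrix, point):
    """Apply a 2x3 affine matrix (the form used by Android's Matrix/Canvas)
    to a point.  matrix = [[sx, kx, tx], [ky, sy, ty]].
    """
    (sx, kx, tx), (ky, sy, ty) = matrix
    x, y = point
    return (sx * x + kx * y + tx, ky * x + sy * y + ty)

# Illustrative adjustment: scale an element's coordinates by 0.5 and move the
# origin to (100, 40) -- roughly what setting an adjusted matrix to the canvas
# achieves for a cropped card element.
m = [[0.5, 0.0, 100.0], [0.0, 0.5, 40.0]]
```

Composing one such matrix per sub-display element (background, text) and drawing them in order is what makes the canvas show the adjusted card.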
  • FIG. 16 shows a schematic diagram of the hardware structure of the mobile phone 1 .
  • The mobile phone 1 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, and so on.
  • the structure shown in the embodiment of the present invention does not constitute a specific limitation on the mobile phone 1 .
  • The mobile phone 1 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the wireless communication function of the mobile phone 1 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 1 can be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the mobile phone 1 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • The wireless communication module 160 can provide wireless communication solutions applied on the mobile phone 1, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the mobile phone 1 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone 1 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the mobile phone 1 realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the mobile phone 1 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone 1 can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • Light is transmitted through the lens to the photosensitive element of the camera, where the light signal is converted into an electrical signal; the photosensitive element then transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the mobile phone 1 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone 1 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • Mobile phone 1 can support one or more video codecs.
  • the mobile phone 1 can play or record video in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the mobile phone 1.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone 1 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the mobile phone 1 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the mobile phone 1 can realize the audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Mobile phone 1 can listen to music through speaker 170A, or listen to hands-free calls.
  • The receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • The microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals.
  • The user can put his or her mouth close to the microphone 170C to make a sound, thereby inputting the sound signal into the microphone 170C.
  • the mobile phone 1 can be provided with at least one microphone 170C.
  • the mobile phone 1 can be provided with two microphones 170C, which can also implement a noise reduction function in addition to collecting sound signals.
  • the mobile phone 1 can also be provided with three, four or more microphones 170C to realize the collection of sound signals, noise reduction, identification of sound sources, and realization of directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • The earphone interface 170D can be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a touch sensor, a fingerprint device, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • The touch sensor can collect the user's touch events on or near it (such as the user's operation on or near the surface of the touch sensor with a finger, a stylus, or any other suitable object), and send the collected touch information to other devices, such as the processor 110.
  • the touch sensor can be implemented in various ways such as resistive, capacitive, infrared, and surface acoustic wave.
  • the touch sensor and the display screen 194 can be integrated into the touch screen of the mobile phone 1 , or the touch sensor and the display screen 194 can be used as two independent components to realize the input and output functions of the mobile phone 1 .
  • the mobile phone 1 may also include a charging management module, a power management module, a battery, buttons, an indicator, and one or more SIM card interfaces, etc., which are not limited in this embodiment of the present application.
  • the software system of the above-mentioned mobile phone 1 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 1 .
  • FIG. 17 is a block diagram of the software structure of the mobile phone 1 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • The Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can include a series of applications.
  • the aforementioned application programs may include applications (applications, APPs) such as call, contacts, camera, gallery, calendar, map, navigation, bluetooth, music, video, and short message.
  • The application layer also includes Android core applications such as the launcher (which can also be called the desktop or home screen).
  • the launcher can be resident in the Android system as a core application.
  • Launcher can be used to display and manage other apps installed in the application layer.
  • The application icon of an application is generally displayed in the launcher and managed uniformly by the launcher. If it is detected that the user performs an operation such as clicking, long-pressing, or dragging an application icon in the launcher, the launcher may respond to the user's operation and trigger the corresponding application to execute the corresponding operation instruction. For example, if it is detected that the user clicks the contact card in the launcher, the launcher can generate a start message for the contact application, start the application process of the contact application by calling related services in the application framework layer, and finally display the contact application's interface on the screen. For another example, if it is detected that the user long-presses the contact card in the launcher, the launcher may generate an adjustment message for the contact card and enter an adjustment interface for the contact card.
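The launcher's dispatch described above can be sketched as a small gesture-to-action table (the gesture names and return strings are illustrative placeholders for the launcher's actual start and adjustment messages):

```python
def handle_icon_gesture(gesture, app_name):
    """Sketch of the launcher dispatch: a click starts the application, a
    long press opens the card's adjustment interface, and a drag moves the
    icon; anything else is ignored.
    """
    if gesture == "click":
        return f"start:{app_name}"       # generate a start message for the app
    if gesture == "long_press":
        return f"adjust:{app_name}"      # enter the card adjustment interface
    if gesture == "drag":
        return f"move:{app_name}"        # reposition the icon in the launcher
    return "ignored"
```

In the real system, the returned "message" would be delivered to the application framework layer's services rather than returned as a string.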
  • When the launcher displays the display elements of each application, it can obtain the display model provided by the application, such as the three-dimensional display model corresponding to the foreground element 110 in Figure 9(a), or the two-dimensional display models corresponding to the background element 120 and the text element 130.
  • the installation package of the contact application may provide a display model corresponding to the contact card.
  • the display model is used to map to each child display element in the contact card.
  • the contact card includes a foreground element 110, a background element 120, and a text element 130.
  • The foreground element 110 and the text element 130 are generally located on the upper layer of the background element 120, and the sizes of the foreground element 110 and the text element 130 are generally smaller than the size of the background element 120.
  • The launcher can obtain the display model corresponding to the contact application from the installation package of the contact application. As shown in Figure 10, after receiving the user's touch operation through the touch sensor in the screen, the launcher obtains the type of the user's touch operation, and then determines the adjustment parameters and adjustment variables according to the type of touch operation. Then, the launcher adjusts the matrix corresponding to the corresponding sub-display element according to the adjustment parameters and adjustment variables. In order to display cards corresponding to user requirements through the launcher, the launcher uses the canvas to clip the background element 120 so that the background element 120 presents a preset shape and size. Furthermore, as shown in the figure, the launcher can superimpose the foreground element 110 and the text element 130 on the clipped background element 120, finally forming the contact card 100.
  • The launcher can use the canvas to create a display element corresponding to the user's touch operation according to the above method. In this way, the launcher can flexibly adjust the display effect of each display element in the mobile phone through the canvas, thereby improving the diversity and customizability of application icons in the launcher.
  • the icon 200 displayed on the launcher includes a foreground element 210 and a background element 220 , and the specific adjustment method is the same as that of the above-mentioned card 100 , which will not be repeated here.
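The clip-then-superimpose pipeline above can be modelled on a toy character grid (the data representation is illustrative; Android's actual canvas clips and draws bitmaps and paths):

```python
def compose_card(background, foreground, text, clip):
    """Layer-composition sketch: crop the background to `clip`
    (row, col, height, width), then overlay foreground and text cells.

    `background` is a 2-D list of characters; `foreground` and `text`
    are {(row, col): char} overlays in cropped-card coordinates.
    """
    r0, c0, h, w = clip
    # Clip the background element to the preset shape and size.
    card = [row[c0:c0 + w] for row in background[r0:r0 + h]]
    # Superimpose the foreground element and the text element on top.
    for (r, c), ch in {**foreground, **text}.items():
        card[r][c] = ch
    return card
```

Each overlay cell stands in for a sub-display element drawn on the upper layer; in the real launcher, the same ordering (background first, foreground and text on top) produces the final card.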
  • the Launcher When the Launcher displays the application icon of each application, it can obtain the display model provided by the application.
  • the installation package of the contact application may provide a contact-related display model.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a notification manager, an activity manager, a window manager, a content provider, a view system, a phone manager, and the like.
  • the view system can be used to build the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets (Widgets).
  • The above-mentioned notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • The notification manager can also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text (such as a notification of an application running in the background), or notifications that appear on the screen in the form of a dialog window. It can also prompt with text information in the status bar, issue a prompt sound, vibrate the electronic device, flash the indicator light, and so on.
  • the above-mentioned activity manager can be used to manage the life cycle of each application.
  • Applications usually run in the operating system in the form of activities.
  • the activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • a window manager is used to manage window programs.
  • the above window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on.
  • the above-mentioned content providers are used to store and obtain data, and make these data accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • The above phone manager is used to provide the communication functions of the mobile phone 1, for example, management of call status (including connected, hung up, etc.).
  • the resource manager mentioned above provides various resources for the application, such as localized strings, icons, pictures, layout files, video files and so on.
  • The Android runtime includes the core library and the virtual machine, and is responsible for the scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in this embodiment of the present application.
  • Take as an example that the card corresponding to each application displayed in the launcher includes three sub-display elements: the foreground element in the foreground element layer, the background element in the background element layer, and the text element in the text layer.
  • The launcher can change the display effect of the card by changing at least one of the foreground element layer, the background element layer, and the text layer.
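As a rough illustration of the three-layer card structure described above, a card's final pixels can be modeled as the background layer overdrawn by the non-transparent pixels of the foreground and text layers. The dictionary-of-pixels representation and the function name below are assumptions made for this sketch, not the patent's implementation:

```python
# Minimal sketch of compositing a card from three layers.
# Each layer is a dict of (x, y) -> pixel value; None means transparent.

def composite_card(background, foreground, text):
    """Overdraw background with foreground, then text (later layers win)."""
    card = dict(background)
    for layer in (foreground, text):
        for pos, pixel in layer.items():
            if pixel is not None:  # transparent pixels leave the background visible
                card[pos] = pixel
    return card

bg = {(x, y): "bg" for x in range(4) for y in range(2)}
fg = {(0, 0): "icon", (1, 0): None}   # (1, 0) is a transparent foreground pixel
txt = {(3, 1): "7:20"}
card = composite_card(bg, fg, txt)
```

This mirrors the rule stated later in the description: where the foreground and text layers are non-transparent they hide the background, and elsewhere the background shows through.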
  • Embodiments of the mechanisms disclosed in this application may be implemented in hardware, software, firmware, or a combination thereof.
  • Embodiments of the present application may be implemented as a computer program or program code executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
  • Program code can be applied to input instructions to perform the functions described herein and to generate output information. The output information may be applied to one or more output devices in a known manner.
  • For the purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
  • The program code can be implemented in a high-level procedural or object-oriented programming language to communicate with the processing system. Program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language; in either case, the language may be a compiled or an interpreted language.
  • The disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. They can also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which can be read and executed by one or more processors. Instructions may be distributed over a network or via other computer-readable media.
  • A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy disks, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable storage used in transmitting information over the Internet by electrical, optical, acoustic, or other forms of propagated signals (for example, carrier waves, infrared signals, digital signals, etc.).
  • Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
  • Each unit/module mentioned in the device embodiments of this application is a logical unit/module. Physically, a logical unit/module can be one physical unit/module, a part of one physical unit/module, or a combination of multiple physical units/modules; the physical implementation of these logical units/modules is not what matters most — the combination of the functions they realize is the key to solving the technical problem raised by this application.
  • Furthermore, the above device embodiments of this application do not introduce units/modules that are not closely related to solving the technical problem proposed by this application; this does not mean that no other units/modules exist in those embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application relates to the field of communication technology and discloses a human-computer interaction method, a readable medium, and an electronic device. In the human-computer interaction method, a corresponding display model is set for each display element on the display interface, where the display model can be represented as a matrix or as variables. When the user operates on an icon or a card, the electronic device adjusts the corresponding display model according to the type of the touch operation, so as to change the display effect of the display model. The electronic device maps the display model with the changed display effect onto the canvas used for drawing the icon, and the icon on the electronic device's display interface changes in correspondence with the user's operation. In the above human-computer interaction method, because display elements can be displayed according to the adjusted display model, full, responsive, and accurate interaction between the user and the display elements can be achieved, which reduces the sense of lag when the user interacts with display elements, improves the sense of fluency during interaction, and enhances the user experience.

Description

Human-computer interaction method, computer-readable medium, and electronic device
This application claims priority to the Chinese patent application No. 202111076012.9, entitled "Human-computer interaction method, computer-readable medium and electronic device", filed with the China National Intellectual Property Administration on September 14, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of communication technology, and in particular to a human-computer interaction method, a computer-readable medium, and an electronic device.
Background
After an application is installed on a terminal device, its corresponding icon is displayed on the device's display interface. As terminal devices have developed, their performance has kept improving and their storage space has grown rapidly. Taking a mobile phone as an example of a terminal device, users install more and more applications on the phone, and many kinds of icons are shown on the phone's display interface.
In some technical solutions, the display mode of every icon on the phone's display interface is preset. Even though a user can perform certain operations on two specific icons (for example, a camera and a lens kit) — as shown in Fig. 1(a) and Fig. 1(b), the user merges the lens icon 100′ and the camera icon 200′ by a touch operation to obtain a lens-camera icon 300′ — the lens-camera icon 300′ is in fact an icon that was generated in advance from the lens icon 100′ and the camera icon 200′ and stored on the phone.
Accordingly, the display modes of icons on current phone display interfaces are all preset, and the human-computer interaction triggered by the user's touch operations is essentially just the retrieval and display of icons. The phone cannot flexibly and accurately adjust the icons on the display interface according to the user's touch operations, so icon display is rather monotonous, icons offer little playability, and the human-computer interaction experience is poor.
Summary
This application discloses a human-computer interaction scheme in which the display model can be represented as a matrix or as variables. When the user operates on an icon or a card, the electronic device adjusts the corresponding display model according to the type of the touch operation so as to change the display model's display effect, and then maps the adjusted display model onto the canvas used to draw the icon, so that the icon on the display interface changes in correspondence with the user's operation. In the above method, because display elements can be displayed according to the adjusted display model, and the display effect is adjusted with adjustment variables that change flexibly in real time, this application enables full, responsive, and accurate interaction between the user and display elements, reduces the sense of lag during interaction, improves fluency, and enhances the user experience.
A first aspect of this application provides a human-computer interaction method applicable to an electronic device, including: acquiring a user's touch operation on at least one display element displayed on the electronic device; adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element targeted by the touch operation; and changing, according to the adjusted display model, the display effect on the electronic device of the display element corresponding to the display model.
Here, the electronic device may be a device with a display screen such as a mobile phone or a tablet, and a display element may be at least one of the icons, cards, widgets, components, and other elements shown on the electronic device's display interface, where the display interface may be the device's desktop or an application's interface. A touch operation is an operation formed when the user touches the adjustment interface. The type of a touch operation refers to the adjustment mode of the display element corresponding to the user's touch operation; for example, touch operation types include adjusting the display orientation of a single display element, adjusting the display orientations of display elements in batch, covering one display element with another, swapping the display positions of two display elements, adjusting the sizes of two display elements, and so on. Display effects include a display element's display position, display size, display orientation, display color, display content, display details, and so on. A display model is data corresponding to a display element, and may be a two-dimensional model or a three-dimensional model.
That is, in an implementation of this application, the electronic device acquires the user's touch operation on at least one display element: it obtains, via a touch sensor, the touch data produced by the user's operation on the display element, extracts the touch parameters from that data, and determines the type of the touch operation from the touch parameters based on preset rules. The device then determines, from the touch parameters and the operation type, the adjustment parameters and adjustment variables of each sub-display element of the display element; adjusts the display effect of the corresponding model accordingly, thereby adjusting the sub-display elements of the display element and obtaining the adjusted display element from the adjusted sub-display elements; and finally hides the original display element and displays the updated one.
In one implementation, the above adjustment scheme can be used to adjust the display effect of a single display element displayed by the electronic device.
In another implementation, the scheme can also be used to simultaneously adjust the display effects of at least two display elements displayed by the electronic device, where the two elements may correspond to the same application or to two different applications. Adjustment of a display element's display orientation means direct adjustment — individually or in batch — of the element's orientation according to a touch operation, and does not include one display element carrying over or inheriting another's orientation.
Moreover, when the user's touch operation individually adjusts the orientation of one display element, the user most likely wants that element's orientation to remain fixed; when the touch operation adjusts multiple display elements in batch, the user most likely wants the display positions of those elements to keep fixed orientations. After adjusting a single display element's orientation with the above scheme, this application further includes interaction schemes involving two or more display elements, for example one card of an application covering another card of the same application, two display elements swapping positions, or two adjacent display elements being resized.
It can be understood, from the effects actually produced by the individual and batch adjustments above, that both the manner of the most recent orientation adjustment of each of the two or more display elements involved in a touch operation, and the order of those most recent adjustments, affect the display effects of those elements after the operation.
For example, among two or more display elements, the element whose orientation was adjusted most recently has the highest priority, and priority decreases in reverse order of adjustment. Specifically, if the most recent orientation adjustment was an individual adjustment of a first display element, then when the first display element is moved to another display position its orientation always remains unchanged; whereas if the most recent orientation adjustment was a batch adjustment of a second display element together with other elements, then any element moved to the second display element's position continues to use the second element's orientation. The interaction schemes for two or more display elements are described below on this basis.
In the above method, the type of the user's touch operation on a display element is determined from the received touch operation, and the element's display effect is then adjusted flexibly according to the operation type and the touch parameters of the specific operation. Because display elements can be displayed according to the adjusted display model, and the display effect is adjusted with adjustment variables that change flexibly in real time, this application enables full, responsive, and accurate interaction between the user and display elements, reduces the sense of lag, improves fluency, and enhances the user experience.
In one possible implementation of the first aspect, the at least one display effect includes at least one of the display element's display position, display size, display orientation, display color, display content, and display details.
Here, the display position refers to the placement of each sub-display element within a display element; that is, it includes the placement of the display element itself and the relative positions of its sub-display elements. The display size refers to the size of each sub-display element, which can be expressed by each sub-display element's distance from the canvas boundary. The display orientation refers to the direction the display element faces: when the element is parallel to the screen, it faces forward; when its left side is forward and its right side backward, it faces right; when its left side is backward and its right side forward, it faces left; when its top is forward and its bottom backward, it faces down; and when its top is backward and its bottom forward, it faces up. With the above method, display elements support a wide variety of display effects, which can satisfy users' diverse needs.
In one possible implementation of the first aspect, the method further includes: determining the type of the touch operation according to preset rules, where the preset rules are set based on the number of the at least one display element, whether the applications corresponding to the at least one display element are the same, and the touch operation itself.
That is, in an implementation of this application, the preset rules are stored in advance on the electronic device and are used to determine the type of a touch operation from its touch parameters. The preset rules include: when the start and end positions of the touch operation lie on the same display element, the operation type is adjusting the orientation of a single display element; when the start position lies in the region of one display element of an application and the end position lies on another display element of the same application, the type is covering and replacing one display element with the other; when the start position lies in the region of one display element and the end position lies on an adjacent display element, crossing that element's center line, the type is swapping the display positions of the two adjacent elements; and when the start position lies in the middle region between two adjacent display elements and the end position lies on one of them, the type is adjusting the sizes of the two display elements.
With the above method, the type of a touch operation can be determined simply from the number of display elements, whether their corresponding applications are the same, and the touch operation itself; the determination is simple and fast.
In one possible implementation of the first aspect, acquiring the user's touch operation on at least one display element displayed on the electronic device includes: when the user chooses to adjust the display effects of multiple display elements, the touch operation is used to adjust the display effects of the selected multiple elements simultaneously. For example, after the display effects of multiple elements are adjusted, a 3D stereoscopic effect of the display elements — and thus a naked-eye 3D effect — can be achieved.
The user's touch operation may be one that causes the electronic device to turn display elements toward a reference position, based on each element's position on the display interface and a reference position set by the user. Such a touch operation generates an automatic control instruction that can adjust at least one display element without the user supplying specific touch data.
In the above method, the electronic device can adjust the display effects of multiple display elements at once with one touch operation, which is efficient and convenient.
In one possible implementation of the first aspect, adjusting the display effect of at least one display model of the at least one display element according to the type of the touch operation includes: performing, according to the operation type, batch adjustments with different display effects on the display models of at least two of the multiple display elements.
In the above method, one touch operation can apply different display-effect adjustments to multiple display elements at once, which is convenient and further satisfies users' needs for adjusting display elements.
In one possible implementation of the first aspect, applying different display-effect adjustments to the display models of at least two of the multiple display elements includes: determining the user's sub-touch operation on each display element from the user's touch operation and each element's display position; determining each element's rotation axis, rotation direction, and rotation angle from its sub-touch operation; and, in response to the touch operation, batch-adjusting each element's display orientation with the determined axis, direction, and angle.
In one possible implementation of the first aspect, adjusting the display effect according to the type of the touch operation includes: performing, according to the operation type, a batch adjustment of the multiple display models of the multiple display elements to the same display orientation.
In one possible implementation of the first aspect, the operation type includes covering a second display element with a first display element, the first and second display elements corresponding to the same application. Covering the second display element with the first includes the first element adopting a second at-least-partial display effect of the second element, and hiding the second element, where the second at-least-partial display effect includes the second element's second display position and second display size.
In one possible implementation of the first aspect, when the type of the second display element's touch operation is batch orientation adjustment, the second at-least-partial display effect further includes the second display element's second display orientation.
In one possible implementation of this application, the applicable scenario is that the most recent orientation adjustment of the original display element occurred after that of the new display element, and the original element's most recent orientation adjustment was a batch adjustment. The user drags a new display element of an application onto an original display element of the same application; on the display interface, the dragged new element covers the original one, and the covering new element continues to use the original element's orientation.
It can be understood that adjustment of a display element's orientation means direct adjustment — individually or in batch — of the element's orientation according to a touch operation, and does not include one display element carrying over or inheriting another's orientation.
In one possible implementation of the first aspect, adjusting the display effect according to the type of the touch operation includes: acquiring the second display element corresponding to the user's touch operation; adjusting the first display element's display model according to at least part of the second element's display effect; and, in response to the touch operation, hiding the second element and adjusting the first element's display effect with the adjusted model.
That is, in an implementation of this application, the electronic device calls the invalidate function to realize the transition animation between the display element and the updated display element.
In one possible implementation of the first aspect, the operation type includes swapping the display positions of a third display element and a fourth display element. When the type of the third element's most recent touch operation is batch orientation adjustment, the swap includes: the third element adopting a fourth at-least-partial display effect of the fourth element, and the fourth element adopting a third at-least-partial display effect of the third element, where the fourth at-least-partial display effect includes the fourth element's fourth display position, and the third at-least-partial display effect includes the third element's third display position and third display orientation.
In an implementation of this application, among two or more display elements, the element whose orientation was adjusted most recently has the highest priority, and priority decreases in reverse order of adjustment. Specifically, if the most recent orientation adjustment was an individual adjustment of a first display element, then when the first element is moved to another display position its orientation always remains unchanged; whereas if the most recent orientation adjustment was a batch adjustment of a second display element together with other elements, then any element moved to the second element's position continues to use the second element's orientation. The interaction schemes for two or more display elements are described below on this basis.
In an implementation of this application, the most recent orientation adjustment of the fourth display element was an individual adjustment and that of the third was a batch adjustment, and the two elements swap positions. When the fourth element's most recent orientation adjustment is earlier than the third's, the fourth element, after the swap, adopts the third element's orientation, while the third element's orientation is unchanged after the swap.
In one possible implementation of the first aspect, the operation type includes swapping the display positions of the third and fourth display elements. When the most recent touch operations of both elements are batch orientation adjustments, the swap includes: the third element adopting the fourth element's fourth at-least-partial display effect, and the fourth element adopting the third element's third at-least-partial display effect, where the third at-least-partial display effect includes the third element's third display position and third display orientation, and the fourth at-least-partial display effect includes the fourth element's fourth display position and fourth display orientation.
In an implementation of this application, an applicable scenario of the interaction scheme is that the most recent orientation adjustments of both the third and fourth display elements were batch adjustments (not necessarily the same batch adjustment); then after the swap, the third element adopts the original fourth element's orientation and the fourth element adopts the original third element's orientation.
The above scheme lets the swapped-in display element continue to use the original element's orientation and position; that is, during the position swap, the display element at a given position on the electronic device's display interface is always shown with the same orientation, preserving a naked-eye 3D display orientation that matches the user's habits while also simplifying the adjustment steps for display elements on the display interface.
In one possible implementation of the first aspect, the operation type includes synchronously adjusting the display effects of a fifth display element and a sixth display element that are displayed adjacently on the electronic device. The synchronous adjustment includes: adjusting, according to the touch operation, the fifth element's fifth display size and the sixth element's sixth display size such that the sum of the fifth and sixth display sizes remains constant.
In the above scheme, a touch operation in the region between two adjacent display elements can adjust the two elements' sizes and orientations dynamically in real time, improving fluency while the two elements change display modes and enabling diverse display effects for the two elements.
In one possible implementation of the first aspect, the synchronous adjustment further includes: adjusting, according to the touch operation, the fifth element's fifth display content and/or the sixth element's display content.
In one possible implementation of the first aspect, when the fifth element's size after adjustment by a fifth adjustment amount is the minimum display size, the synchronous adjustment further includes adjusting the fifth element's fifth display orientation. Here, the fifth adjustment amount may be the difference between the fifth element's original display size and its adjusted display size.
In an implementation of this application, when a display element on the display interface is resized from another size to the minimum unit size, its orientation is by default adjusted to face forward. It can be understood that although the orientation is adjusted to face forward, the element is still regarded as one whose orientation has been adjusted.
In one possible implementation of the first aspect, the method further includes: during the user's touch operation, obtaining a fifth real-time size and a sixth real-time size according to the user's touch position, the sum of the fifth and sixth real-time sizes remaining constant; and adjusting the fifth element's display effect in real time according to the fifth real-time size, and the sixth element's display effect in real time according to the sixth real-time size.
In one possible implementation of the first aspect, changing the display effect of the display element according to the adjusted display model includes: drawing the adjusted display model via a canvas, so as to change the display effect on the electronic device of the element corresponding to the display model. Here, the canvas is an abstract space in which display elements can be laid out and rendered based on display models.
In one possible implementation of the first aspect, the operation type includes adjusting a display element's display orientation, which includes determining the element's rotation axis, rotation direction, and rotation angle.
In one possible implementation of the first aspect, adjusting the display effect according to the type of the touch operation includes: determining, from the user's touch operation, the rotation axis, rotation direction, and rotation angle of the display element's display model, and, in response to the touch operation, adjusting the element's display orientation with the determined axis, direction, and angle.
In one possible implementation of the first aspect, the at least one display element includes at least one of an icon, a card, and a widget.
In one possible implementation of the first aspect, each of the at least one display element includes a foreground element and a background element, and the method further includes: adjusting, according to the type of the touch operation, the display effect of the foreground element and/or background element corresponding to at least one sub-display element of the at least one display element.
In one possible implementation of the first aspect, the display model includes at least one of a two-dimensional display model and a three-dimensional display model. A two-dimensional model can show the planar shape of a display element such as an icon or a card; a three-dimensional model can show the solid shape of such an element. A display model can be represented by a matrix or by variables.
A second aspect of this application provides a computer-readable medium storing instructions that, when executed on an electronic device, cause the device to perform any of the human-computer interaction methods of the first aspect.
A third aspect of this application provides an electronic device, comprising: a memory storing instructions to be executed by one or more processors of the electronic device, and a processor, being one of the device's processors, configured to perform any of the human-computer interaction methods of the first aspect.
Brief Description of the Drawings
Fig. 1(a) shows a display interface 11′ of a mobile phone 1′;
Fig. 1(b) shows the display interface 11′ of phone 1′ after a user operation;
Fig. 2 shows a display interface 11 of a phone 1 of this application;
Fig. 3(a) shows a user's touch operation on the camera icon 104 in this application;
Fig. 3(b) shows a partial enlarged view of region i of display interface 11 of phone 1 of this application;
Fig. 4(a) shows two other user touch operations on the icons in region i of display interface 11 of this application;
Fig. 4(b) shows a partial enlarged view of region i of display interface 11 of phone 1 of this application;
Fig. 5(a) shows a display interface 11″ of a phone 1″;
Fig. 5(b) shows the display interface 11″ of phone 1″ after a user operation;
Fig. 6(a) shows a user's touch operation on the contact-B card and the contact-A card in this application;
Fig. 6(b) shows a partial enlarged view of region ii of display interface 11 of phone 1 of this application;
Fig. 6(c) shows a user's touch operation on the contact-C card and the contact-A card in this application;
Fig. 6(d) shows a partial enlarged view of region ii of display interface 11 of phone 1 of this application;
Fig. 7(a) shows a user's touch operation on the alarm-clock card and the remote-control card in this application;
Fig. 7(b) shows a partial enlarged view of region ii of display interface 11 of phone 1 of this application;
Fig. 8(a) shows a user's touch operation on the remote-control card and the alarm-clock card in this application;
Fig. 8(b) shows a partial enlarged view of region ii of display interface 11 of phone 1 of this application;
Fig. 9(a) shows a schematic of card generation in phone 1 of this application;
Fig. 9(b) shows a schematic of drawing a card's background element in this application;
Fig. 9(c) shows a schematic of drawing a card's foreground element and text element in this application;
Fig. 9(d) shows a schematic of card display in phone 1 of this application;
Fig. 10 shows a schematic of the card-update principle in this application;
Fig. 11 shows a flowchart of a display-element human-computer interaction scheme of phone 1 in this application;
Fig. 12(a) shows a schematic of the camera icon 104 shown on display interface 11 of phone 1 in this application;
Fig. 12(b) shows a schematic of a user's touch trajectory on display interface 11 of phone 1 in this application;
Fig. 12(c) shows a schematic of another user touch trajectory on display interface 11 of phone 1 in this application;
Fig. 12(d) shows a schematic of yet another user touch trajectory on display interface 11 of phone 1 in this application;
Figs. 13(a) to 13(j) show schematics of the camera icon 104 in phone 1 being adjusted in adjustment interface 12 with different adjustment parameters in this application;
Fig. 13(k) shows a schematic of the camera icon 104 in phone 1 being locked after adjustment in this application;
Fig. 13(l) shows a schematic of the camera icon 104 in phone 1 after adjustment in this application;
Fig. 14(a) shows a schematic of icons entering batch adjustment in adjustment interface 12 of phone 1 in this application;
Fig. 14(b) shows a schematic of selecting icons for batch adjustment in adjustment interface 12 of phone 1 in this application;
Fig. 14(c) shows a schematic of a batch icon-adjustment operation in adjustment interface 12 of phone 1 in this application;
Fig. 14(d) shows a schematic of confirming a batch icon adjustment in adjustment interface 12 of phone 1 in this application;
Fig. 14(e) shows a schematic of icons of phone 1 after batch adjustment in this application;
Fig. 14(f) shows a schematic of confirming another batch icon adjustment in adjustment interface 12 of phone 1 in this application;
Fig. 14(g) shows a schematic of another batch-adjusted result on phone 1 in this application;
Figs. 15(a) to 15(d) show schematics of resizing card 100 and card 200 in phone 1 of this application;
Fig. 16 shows a structural schematic of phone 1 in this application;
Fig. 17 shows a software architecture block diagram of phone 1 in this application.
Detailed Description
Illustrative embodiments of this application include, but are not limited to, a human-computer interaction method, an apparatus, a readable medium, and an electronic device. The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings.
It can be understood that the electronic device in this application may be a phone, a tablet, or another device with a display screen; the description below takes a phone as the electronic device. A display element may be any element shown on the phone's display interface, for example at least one of icons, sub-icons, cards, widgets, and components. The applications corresponding to display elements may be programs such as Contacts, Phone, Messages, Browser, Alarm Clock, and Remote Control. In addition, for ease of description, the width direction of the phone screen is set as the X axis, the length direction of the phone screen as the Y axis, and the thickness direction of the phone as the Z axis, with the X, Y, and Z axes mutually perpendicular. The left and right sides of the phone are its two sides along the X axis; the upper and lower sides are its two sides along the Y axis; and the front and back sides are its two sides along the Z axis.
To solve the above problems, this application discloses a human-computer interaction scheme for display elements on a phone's display interface, in which a corresponding display model is set for each display element on the display interface — each icon, each card, and so on. The display model may include a two-dimensional display model and/or a three-dimensional display model: a 2D model can show the planar shape of an icon or a card, a 3D model can show its solid shape, and a display model can be represented by a matrix or by variables. Then, when the user operates on an icon or a card, the phone changes the corresponding display model according to the type of the touch operation; for example, if the user's operation is rotating a single icon, the phone rotates the display model to change its display angle. The changed display model is then mapped onto the canvas used to draw the icon, so that the icon on the phone's display interface changes in correspondence with the user's operation. Here, the canvas is an abstract space in which display elements can be laid out and rendered based on display models.
It can be understood that different types of touch operations correspond to different adjustment modes, and different adjustment modes correspond to different adjustment parameters. The phone obtains the adjustment variables of the adjustment parameters from the touch operation, then adjusts the adjustment parameters by those variables so that the display model is shown according to the adjusted parameters, finally completing one interaction between the user and the display element.
In the above method, the type of the user's touch operation on a display element is determined from the received touch operation, and the element's display effect is then adjusted flexibly according to the operation type and the touch parameters of the specific operation. Because display elements can be displayed according to the adjusted display model, and the display effect is adjusted with adjustment variables that change flexibly in real time, this application enables full, responsive, and accurate interaction between the user and display elements, reduces the sense of lag, improves fluency, and enhances the user experience.
To facilitate understanding of the above interaction scheme for display elements, the display effects achieved by the technical solution of this application are briefly described below with reference to the touch operations in the drawings.
As mentioned above, in one interaction scheme, within an icon's adjustment interface, the user touches the area around the icon and slides, and the phone can adjust the icon's display orientation according to the sliding trajectory and sliding direction. The display orientation is the direction the icon faces: when the icon is parallel to the screen, it faces forward; when its left side is forward and its right side backward, it faces right; when its left side is backward and its right side forward, it faces left; when its top is forward and its bottom backward, it faces down; and when its top is backward and its bottom forward, it faces up.
Specifically, Fig. 2 shows a display interface 11 of a phone 1 of this application, in which region i shows, from left to right, a phone icon 101, a messages icon 102, a browser icon 103, and a camera icon 104. Fig. 3(a) and Fig. 3(b) show the interaction scheme for the camera icon 104 of Fig. 2 when the user swipes left. As shown in Fig. 3(a), the user touches the camera icon 104 on display interface 11 of phone 1 and enters the adjustment interface 12 of the camera icon 104; within adjustment interface 12, the user slides below the camera icon 104 in a direction parallel to the X axis, from a start position P_s toward an end position P_f, so as to adjust the camera icon 104's orientation. As shown in Fig. 3(b), region i shows the phone icon 101, messages icon 102, browser icon 103, and adjusted camera icon 104x distributed from left to right along the X axis; camera icon 104x is the icon obtained by rotating camera icon 104 about an axis parallel to the Y axis. The camera icon's orientation can change gradually with the user's touch operation. It can be understood that region i is not a designated region of display interface 11 but merely denotes some region of it; regions ii and iii below have meanings similar to region i and will not be described again.
As another example, in another interaction scheme, at least two icons are shown in the icons' adjustment interface; the user selects a reference point, and the phone can adjust each icon's orientation according to the positions of the reference point and each icon, so that the at least two icons all face the reference point.
Specifically, Fig. 4(a) shows the center point touched by the user, and Fig. 4(b) shows a schematic of display interface 11 of phone 1 presenting one 3D effect of the display elements of Fig. 4(a). Combining Figs. 4(a) and 4(b): when the user taps the middle position P_s1 of region i, the icons in region i each adjust their own orientation and end up facing the screen's middle position P_s1, so that phone icon 101y, messages icon 102y, browser icon 103y, and camera icon 104y present a naked-eye 3D effect of facing the user's position P_u1 (corresponding to the middle position P_s1 in Fig. 4(a)). Similarly, icons whose orientations are batch-adjusted may also be distributed along the Y axis or within the XOY plane, with display effects similar to Fig. 4(b), which are not repeated here. It can be understood that the position the user taps may also be on the right side of phone 1 (such as the right position P_s2 in Fig. 4(a)), the left side, the top, or the bottom, with icon display effects on display interface 11 similar to Fig. 4(b), not repeated here.
The above interaction schemes can be applied not only to icons on phone 1 but also to cards and sub-icons on phone 1, whose specific display effects are similar to the adjusted icon effects above and are not repeated in this application. A card is an element on the desktop corresponding to part of an application's functional modules, and may be shown as a thumbnail of that module's functional interface; a sub-icon is an icon on the desktop corresponding to part of an application's functional modules, and may be shown as that module's icon. As can be seen from Fig. 5(a) and Fig. 5(b), the same functional module of the same application (for example, contact 14″) may correspond to cards of different sizes, forms, and colors (for example, a first card 201″ and a second card 202″).
It can be understood that when the user's touch operation individually adjusts the orientation of one display element, the user most likely wants to keep that element's orientation fixed; when the touch operation batch-adjusts multiple display elements, the user most likely wants the corresponding display positions to keep fixed orientations — for example, in region i of Fig. 4(b), the two right-hand elements face left and the two left-hand elements face right.
After using the above interaction scheme to adjust the orientation of a single icon or card, this application further includes interaction schemes for two or more display elements, for example one card of an application covering another card of the same application, two display elements swapping positions, or two adjacent display elements being resized.
It can be understood, from the effects actually produced by the individual and batch adjustments above, that both the manner of the most recent orientation adjustment of each of the two or more display elements involved in a touch operation, and the order of those most recent adjustments, affect the display effects of those elements after the operation.
In one implementation, adjustment of a display element's orientation means direct adjustment — individually or in batch — of the element's orientation according to a touch operation, and does not include one display element carrying over or inheriting another's orientation.
For example, among two or more display elements, the element whose orientation was adjusted most recently has the highest priority, and priority decreases in reverse order of adjustment. Specifically, if the most recent orientation adjustment was an individual adjustment of a first display element, then when the first element is moved to another display position its orientation always remains unchanged; whereas if the most recent orientation adjustment was a batch adjustment of a second display element together with other elements, then any element moved to the second element's position continues to use the second element's orientation. The interaction schemes for two or more display elements are described below on this basis.
This application also includes an interaction scheme applied to display elements of the same application: for example, the user drags a new icon or card of an application onto another, original icon or card of the same application; on the display interface, the dragged new icon or card covers the original one, and the covering icon or card continues to use the original's orientation. Here, display elements include all elements shown on the phone's display interface, such as icons, sub-icons, cards, widgets, and components.
Specifically, continuing with Fig. 2, region ii of display interface 11 of phone 1 shows, from left to right, an alarm-clock card 201, a remote-control card 202, and a contact-A card 203; region iii shows a contact-B card 204 and a contact-C icon 205. As shown in Fig. 6(a), when the user touches contact-B card 204 and drags it onto contact-A card 203, contact-A card 203 is covered by contact-B card 204a. As shown in Fig. 6(b), the covering contact-B card 204a continues to use contact-A card 203's orientation and size: the phone hides contact-A card 203, changes contact-B card 204's orientation from facing right to facing forward to match contact-A card 203's orientation, resizes contact-B card 204 to contact-A card 203's size, and then displays it at contact-A card 203's position.
It can be understood that one applicable scenario of this scheme is that the most recent orientation adjustment of contact-A card 203 occurred after that of contact-B card 204, and contact-A card 203's most recent orientation adjustment was a batch adjustment. In the schematic of Fig. 6(a), dragging contact-B card 204 from left to right onto contact-A card 203 is only one example of contact-B card 204's display position and drag direction; contact-B card 204 may be located anywhere on display interface 11 and may be dragged toward contact-A card 203 in any direction, without specific limitation in this application. Likewise, this application places no specific limitation on how contact-C icon 205 is dragged toward contact-A card 203 below.
Similarly, as shown in Fig. 6(c), when the user touches contact-C icon 205 and drags it onto contact-A card 203, then, as shown in Fig. 6(d), contact-A card 203 is covered by contact-C icon 205a, and the covering contact-C icon 205a continues to use contact-A card 203's orientation and size.
In the above scheme, the covering display element is shown at the covered element's position with the covered element's orientation, and the covered element is no longer shown; the covering element thus inherits the covered element's orientation and display position. That is, the display element at a given position of display interface 11 is always shown with the same orientation, which preserves a naked-eye 3D effect that matches the user's habits while also simplifying the adjustment steps for display elements.
After using the above interaction scheme to adjust a single icon's or card's orientation, this application further includes an interaction scheme for display elements of different applications: for example, the user drags a card of one application onto a card of another application; on the display interface, the cards of the two different applications swap positions, and each application's card adaptively adjusts its orientation and display mode based on the other's position.
Specifically, as shown in Fig. 7(a), the alarm-clock card 201 faces right and the remote-control card 202 faces left. After the user touches alarm-clock card 201 and drags it onto remote-control card 202, the two cards swap positions: the swapped alarm-clock card 201a adopts the original remote-control card 202's orientation, and the swapped remote-control card 202a adopts the original alarm-clock card 201's orientation, as shown in Fig. 7(b). Comparing Figs. 7(a) and 7(b), the swapped alarm-clock card 201a is unchanged in size relative to the original alarm-clock card 201, and the swapped remote-control card 202a is unchanged in size relative to the original remote-control card 202, but the display modes of the elements inside the swapped alarm-clock card 201a and the swapped remote-control card 202a are adaptively adjusted. It can be understood that these elements include background elements, foreground elements, text elements, and so on, whose specific features are described in detail below.
It can be understood that one applicable scenario of this scheme is that the most recent orientation adjustments of both alarm-clock card 201 and remote-control card 202a were batch adjustments, not necessarily the same batch adjustment.
The above scheme lets the swapped-in display element continue to use the original element's orientation and display position; that is, during the position swap, the display element at a given position of display interface 11 is always shown with the same orientation, which preserves a naked-eye 3D display orientation that matches the user's habits and also simplifies the adjustment steps for display elements on phone 1's display interface.
After using the above interaction scheme to adjust a single icon's or card's orientation, this application further includes an interaction scheme for adjacent display elements: for example, the user touches a position between one card and an adjacent card and slides toward one of them, and the sizes and orientations of the two adjacent cards on the display interface change.
Specifically, as shown in Fig. 8(a), the swapped remote-control card 202a is small, the swapped alarm-clock card 201a is large, and the two are adjacent. The user touches the region between the swapped remote-control card 202a and the swapped alarm-clock card 201a and drags toward the swapped alarm-clock card 201a; the sizes of the two cards change while the minimum distance between them remains unchanged. As shown in Fig. 8(b), the adjusted remote-control card 202b is large and the adjusted alarm-clock card 201b is small. In addition, combining Figs. 8(a) and 8(b): as remote-control card 202a transitions to the adjusted remote-control card 202b, the remote-control card grows from small to large and the remote-control-related text element "华为智慧屏" (Huawei Smart Screen) appears; conversely, as alarm-clock card 201a transitions to the adjusted alarm-clock card 201b, the alarm-clock card shrinks from large to small and the alarm-related text element "7:20 a.m." is hidden. It can be understood that this application describes the interaction scheme for two adjacent display elements using the swapped remote-control card 202a and the swapped alarm-clock card 201a as an example; the scheme applies equally to other pairs of adjacent display elements and is not repeated here.
Moreover, when the user touches a position between a card and an adjacent card, the cards' orientations may also change. In one implementation, when the two cards' sizes change, the orientation of the card whose size shrinks gradually returns to its initial state. In another implementation, each card's orientation changes along with its size; for example, the user's touch operation can on the one hand adjust the two cards' sizes, and on the other hand the cards' orientations can be adjusted based on the touch operation in the manner of Fig. 6(a).
In the above scheme, a touch operation in the region between two adjacent display elements can adjust the two elements' sizes and orientations dynamically in real time, improving fluency while the two elements change display modes and enabling diverse display effects for the two elements.
The adjustment scheme for display elements in this application is described in further detail below with reference to the drawings.
Given the diversity and complexity of display elements, and to facilitate understanding of the changes in their display effects below, the components of the various display elements in the phone and the principles by which they are generated and updated are briefly described before the specific adjustment schemes.
Fig. 9(a) shows a schematic of how a card is generated; Fig. 9(b) shows how a card's background element is drawn; Fig. 9(c) shows how a card's foreground element and text element are drawn; Fig. 9(d) shows a schematic of how a card is displayed.
As shown in Fig. 9(a), card 100 includes a foreground element 110, a background element 120, and a text element 130. The foreground element 110 lies in foreground element layer C1, the background element 120 in background element layer C2, and the text element 130 in text element layer C3. In general, layers C1 and C3 lie in front of layer C2, and phone 1 superimposes the foreground element 110, background element 120, and text element 130 to form card 100.
Phone 1 obtains an initial element 110′ carrying the icon's meaning and rotates it to obtain the foreground element 110. It can be understood that the foreground matrix is one storage form of the display model corresponding to the foreground element. Phone 1 generates the background element 120 from parameters such as size, position, color, and detail, and generates the text element 130 from parameters such as user information and the functional module's name. Phone 1 translates, scales, and superimposes the foreground element 110, background element 120, and text element 130 to generate card 100. It can be understood that the phone may also rotate the background element and the text element to obtain a card with an overall stereoscopic effect (not shown).
As shown in Fig. 9(b), after phone 1 obtains the background matrix corresponding to the background element 120, the drawing application on phone 1 draws the background element 120 on the canvas according to the background matrix. The background matrix is a matrix that embodies the background element's display information; it can be understood as one storage form of the background element's display model. The background element's display information includes at least one of its display orientation, color, size, and display position. It can likewise be understood that the text matrix is one storage form of the text element's display model. The canvas is a drawable carrier comprising multiple pixels.
For example, the drawing application on phone 1 determines the canvas pixels corresponding to the background matrix from the background element 120's distances in the background matrix to the canvas boundaries: distance d1 to the upper boundary, d2 to the lower boundary, d3 to the left boundary, and d4 to the right boundary.
As shown in Fig. 9(c), after phone 1 obtains the foreground matrix corresponding to the foreground element 110 and the text matrix corresponding to the text element 130, its drawing application draws the foreground element 110 on the canvas from the foreground matrix and the text element 130 from the text matrix. The drawing application can also scale and translate the foreground element 110 and the text element 130 in order to draw them on the canvas. The foreground matrix embodies the foreground element's display information, which includes at least one of its orientation, color, size, and position. The text matrix embodies the text element's display information, which includes at least one of the text's content, color, font size, and typeface.
For example, the drawing application determines the canvas pixels corresponding to the foreground matrix from the foreground element 110's distances d5, d6, d7, and d8 to the canvas's upper, lower, left, and right boundaries, and the canvas pixels corresponding to the text matrix from the text element 130's distances d9, d10, d11, and d12 to those boundaries.
The interaction scheme of card 100 is introduced in detail below. For example, as shown in Fig. 9(d), the foreground element 110 and text element 130 sit side by side in front of the background element 120, and both are non-transparent. Viewed along the direction opposite to the Z axis, the background element 120 includes an overlap region 121 coinciding with the foreground element 110, an overlap region 122 coinciding with the text element 130, and a non-overlap region 123; the final display effect of card 100 is then the superposition of the foreground element 110, the text element 130, and the non-overlap region 123 of the background element 120. As another example, if the foreground element 110 and text element 130 are transparent or semi-transparent, the final display effect of card 100 is a simple superposition of the foreground element 110, background element 120, and text element 130.
It can be understood that an icon differs from a card in that an icon includes only a foreground element and a background element; compared with a card, an icon involves no text-element processing, which is not repeated here.
Fig. 10 shows a schematic of the card-update principle in this application, in which card 100 is to have its foreground element 110 updated to obtain an updated card 100a. The update principle of cards in phone 1 is described below with reference to Figs. 9(a) to 10.
For example, the user adjusts the orientation of a card's foreground element via a touch operation. Phone 1 obtains the foreground matrix corresponding to the foreground element, the background matrix corresponding to the background element, and the text matrix corresponding to the text element. Phone 1 determines the adjustment parameters from the type of the user's touch operation on the display element and determines the element's adjustment variables from the touch operation itself, then adjusts the obtained foreground matrix according to the adjustment parameters and adjustment variables to obtain an adjusted foreground matrix, while retaining the obtained background matrix and text matrix. Here, the adjustment parameters include camera parameters for adjusting the foreground element's orientation, color parameters for adjusting its color, detail parameters for adjusting its detail features, and so on; the camera parameters include at least one of the rotation angles about the X, Y, and Z axes. For example, phone 1 calls a preset function corresponding to the foreground matrix and inputs into it the adjustment variables corresponding to the adjustment parameters, so as to modify the foreground matrix.
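As a rough numeric sketch of what "adjusting the foreground matrix by a rotation angle" could look like (the 3×3 matrix form and the function name below are assumptions made for illustration, not the patent's preset function), a rotation about the Y axis can be folded into the element's transform matrix:

```python
import math

def rotate_y(matrix, degrees):
    """Return `matrix` composed with a rotation of `degrees` about the Y axis.

    `matrix` is a 3x3 transform stored as a list of rows; rotation about Y
    maps x -> x*cos + z*sin and z -> -x*sin + z*cos.
    """
    c, s = math.cos(math.radians(degrees)), math.sin(math.radians(degrees))
    rot = [[c, 0.0, s],
           [0.0, 1.0, 0.0],
           [-s, 0.0, c]]
    return [[sum(rot[i][k] * matrix[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
fg = rotate_y(identity, 90)  # foreground matrix turned 90 degrees about Y
```

The background and text matrices would be left untouched, matching the retain-and-redraw flow described above.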
In one implementation, phone 1 obtains the updated foreground element 110a from the adjusted foreground matrix, obtains the background element 120 corresponding to the background matrix and the text element 130 corresponding to the text matrix, and superimposes the updated foreground element 110a, background element 120, and text element 130 to obtain the updated display element 100a.
In another implementation, phone 1 generates a composite matrix from the adjusted foreground matrix and the retained background and text matrices, and then draws the updated card 100a on the canvas from the composite matrix.
It can be understood that the above update scheme only illustrates phone 1 adjusting a foreground element; it can also be used to adjust a card's background element and text element. In addition, the update scheme can be used to update an icon's foreground element and background element, which is not repeated here.
Having described the display and update principles of display elements, the interaction scheme for display elements in this application is described in detail below. Fig. 11 shows a flowchart of the interaction scheme for display elements. As shown in Fig. 11, the scheme specifically includes the following steps:
S1101: The touch sensor in the screen of phone 1 acquires the touch data produced by the user's touch operation on a display element and extracts the touch parameters from the touch data.
Here, the user's touch operation is the operation formed when the user touches the adjustment interface. The adjustment interface may be the phone's desktop, or a specific interface entered in response to a specific user operation, such as single-tapping, double-tapping, or long-pressing a display element. It can be understood that these are merely examples of specific operations; a specific operation may also be another form of operation combining parameters such as touch position, touch duration, touch pressure, touch frequency, and touch area, which this application does not specifically limit. Touch data is raw data generated from the user's touch operation, for example touch-point coordinates arranged in touch order. Touch parameters are parameters extracted from, or generated based on, the touch data that reflect the adjustment the user wants to achieve; for example, the touch parameters include the start-point coordinates, end-point coordinates, touch direction, touch trajectory, and trajectory length. The start point is the first point where the user's finger begins touching the adjustment interface; the end point is the last point before the finger leaves it; the touch path may be the line through all points passed as the finger slides from the start point to the end point, and its extension direction is from the start point along the path toward the end point. The path's extension direction characterizes the display element's rotation direction, and the path's length characterizes its rotation angle. It can be understood that the touch direction, trajectory, and trajectory length correspond one-to-one with the camera parameters.
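A minimal sketch of extracting such touch parameters from an ordered list of touch points (the field names and data format below are assumptions for illustration; the patent does not fix a data format):

```python
import math

def touch_params(points):
    """Derive start, end, direction, and path length from ordered touch points."""
    start, end = points[0], points[-1]
    dx, dy = end[0] - start[0], end[1] - start[1]
    # Path length: sum of distances between consecutive touch points.
    length = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    return {"start": start, "end": end, "direction": (dx, dy), "length": length}

params = touch_params([(0, 0), (3, 0), (6, 0)])  # a straight swipe along X
```

The resulting direction and length would then feed the rotation-direction and rotation-angle mapping described above.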
S1102: Phone 1 determines the type of the touch operation from the touch parameters based on preset rules.
For example, phone 1 derives the touch operation's type from the display element and the touch parameters, where the display element's attributes include the number of display elements, the applications the display elements correspond to, and so on.
The type of a touch operation refers to the adjustment mode of the display element corresponding to the user's touch operation. For example, touch operation types include adjusting the orientation of a single display element, adjusting the orientations of display elements in batch, covering one display element with another, swapping the display positions of two display elements, adjusting the sizes of two display elements, and so on.
In some implementations, the preset rules include: when the start and end positions of the user's touch operation lie on the same display element, the operation type is adjusting the orientation of a single display element; when the start position lies in the region of one display element of an application and the end position lies on another display element of the same application, the type is covering and replacing one display element with the other; when the start position lies in the region of one display element and the end position lies on an adjacent display element, crossing that element's center line, the type is swapping the display positions of the two adjacent elements; and when the start position lies in the middle region between two adjacent display elements and the end position lies on one of them, the type is adjusting the two elements' sizes.
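The preset rules above can be sketched as a small classifier over hit-tested start/end positions. The rectangle hit-testing, app labels, and return strings here are assumptions for illustration only, and the center-line check for swaps is omitted for brevity:

```python
def hit(elements, point):
    """Return the element whose rect (x, y, w, h) contains the point, else None."""
    for e in elements:
        x, y, w, h = e["rect"]
        if x <= point[0] < x + w and y <= point[1] < y + h:
            return e
    return None

def classify(elements, start, end):
    """Apply the preset rules to map a (start, end) touch to an operation type."""
    a, b = hit(elements, start), hit(elements, end)
    if a is not None and a is b:
        return "rotate-single"          # start and end on the same element
    if a and b and a["app"] == b["app"]:
        return "cover"                  # two elements of the same application
    if a and b:
        return "swap-positions"         # dragged onto an element of another app
    if a is None and b is not None:
        return "resize-adjacent"        # started between elements, ended on one
    return "unknown"

cards = [{"app": "contacts", "rect": (0, 0, 10, 10)},
         {"app": "alarm", "rect": (20, 0, 10, 10)},
         {"app": "contacts", "rect": (0, 20, 10, 10)}]
```

For instance, a drag starting and ending inside the first card classifies as a single-element rotation, while a drag from the gap between cards onto a card classifies as a resize.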
S1103: Phone 1 determines, from the touch parameters and the type of the touch operation, the adjustment parameters and adjustment variables of each sub-display element of the display element.
The sub-display elements may be the foreground and background elements of an icon, or the foreground, background, and text elements of a card. An adjustment parameter is a parameter of the display information, determined by the type of the touch operation, that must be adjusted to achieve the adjustment effect; the adjustment parameters and adjustment variables are used to adjust the display model corresponding to each sub-display element. For example, an adjustment parameter may be the display orientation, specifically including at least one of the X-axis, Y-axis, and Z-axis rotation-angle parameters. An adjustment variable is the amount, obtained from the touch operation, by which each adjustment parameter corresponding to the operation type is to change.
S1104: Phone 1 adjusts the sub-display elements of the display element according to their adjustment parameters and adjustment variables, and obtains the adjusted display element from the adjusted sub-display elements. The matrices corresponding to the display elements are distributed in multiple canvas layers arranged in a preset sequence, each layer displaying the sub-display element corresponding to the matrix set in that layer. The phone calls a preset function and inputs into it the adjustment variables of one or more sub-display elements' adjustment parameters.
In some embodiments, each sub-display element of a display element has its own corresponding matrix. The phone calls the preset function with the adjustment variables of one or more sub-display elements to modify those sub-display elements' matrices. After obtaining the adjusted matrices corresponding to those sub-display elements, the phone sets them into the corresponding layers of the multi-layer canvas, so that the reset layers can update the sub-display elements according to the adjusted matrices. Finally, the multi-layer canvas generates the adjusted display element from the updated sub-display elements together with the retained ones.
For example, the display element includes a foreground element set on the front canvas layer and a background element set on the rear canvas layer, and the adjustment parameter is the foreground element's rotation angle about the X axis. The phone inputs the value of that rotation angle into the preset function to modify the foreground element's foreground matrix and obtain the adjusted foreground matrix. The phone then sets the adjusted foreground matrix to the front layer, which generates the adjusted foreground element from it, and the front and rear layers generate the adjusted display element from the adjusted foreground element and the background element.
In some other embodiments, each sub-display element of a display element has its own matrix, and the matrices are set into the canvas one by one in a preset sequence. The phone calls the preset function with the adjustment variables of one or more sub-display elements' adjustment parameters to modify those sub-display elements' matrices; after obtaining the adjusted matrices, the phone again sets the adjusted matrices and the retained matrices into the canvas in the preset sequence, so that the canvas can update the display element according to the adjusted matrices.
For example, the display element includes a background element, a foreground element, and a text element, with the foreground and text elements side by side in the layer above the background element. That is, the matrices corresponding to the sub-display elements include the background matrix, the foreground matrix, and the text matrix, with the foreground and text matrices above the background matrix. Here, an upper-layer matrix means a matrix applied to the canvas later: in this implementation, the background matrix is applied to the canvas first, then the foreground and text matrices, and the regions where the foreground and text matrices are shown on the canvas at least partly coincide with the region where the background matrix is applied. From the element-generation principle described earlier, the foreground and text matrices are shown in full on the canvas, while the background matrix is shown only where it does not coincide with the foreground and text matrices. It can be understood that, when the foreground matrix and text matrix do not overlap, the order in which they are applied to the canvas is not specifically limited here.
In yet other embodiments, each sub-display element of a display element has its own matrix. The phone processes the sub-display elements' matrices according to preset processing logic to obtain a composite matrix of all superimposed sub-display elements, and sets the composite matrix into the canvas. The phone calls the preset function with the adjustment variables of one or more sub-display elements' adjustment parameters to modify those sub-display elements' matrices; it then processes the adjusted matrices and the retained matrices with the preset processing logic to obtain an adjusted composite matrix of all superimposed sub-display elements, and sets the adjusted composite matrix into the canvas so that the canvas can update the display element accordingly.
The specific way a sub-display element's matrix is applied to the canvas is introduced in detail below.
In some embodiments, when applying a matrix to the canvas, the phone can set a position parameter in the canvas to adjust where the matrix is placed on the canvas, and thereby adjust the relative positions of the elements corresponding to multiple matrices. For example, the phone calls the translate(x, y) function and determines the translated position of the matrix's element from the x and y in that function.
In addition, the phone can also set a scaling parameter in the canvas to adjust the display size on the canvas of a matrix's element. The scaling parameter may be a scaling reference point (e.g., any coordinate point in the canvas) and a scaling ratio (e.g., 0.5); it can be understood that, since the element presented via the canvas is a two-dimensional image, the scaling parameter may be one set of values with the two dimensions locked together, or two sets of values when the dimensions are not locked. The scaling parameter may also be the distance between the element's boundary lines and the canvas boundary; for example, the phone calls the canvas.clipRect(left, top, right, bottom) function and clips the scaled element's boundary according to the boundary positions left, top, right, and bottom in that function, so as to adjust the element's display size on the canvas.
In addition, the phone can also set a color parameter in the canvas to adjust the display color on the canvas of a matrix's element, and a detail parameter to adjust the element's details on the canvas.
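A toy re-implementation of the translate-then-clip behavior described above (plain Python standing in for the Android Canvas calls; the class and method names mirror translate and clipRect but are illustrative assumptions, not the platform API):

```python
class ToyCanvas:
    """Tracks a translation offset and a clip rectangle, like a minimal canvas."""

    def __init__(self):
        self.ox = self.oy = 0
        self.clip = None  # (left, top, right, bottom) or None

    def translate(self, x, y):
        """Shift where subsequent matrices/elements are placed."""
        self.ox += x
        self.oy += y

    def clip_rect(self, left, top, right, bottom):
        """Restrict drawing to a boundary, adjusting the element's display size."""
        self.clip = (left, top, right, bottom)

    def place(self, x, y):
        """Return the translated point, or None if it falls outside the clip."""
        px, py = x + self.ox, y + self.oy
        if self.clip:
            l, t, r, b = self.clip
            if not (l <= px < r and t <= py < b):
                return None
        return (px, py)

canvas = ToyCanvas()
canvas.translate(10, 5)          # move the element's matrix on the canvas
canvas.clip_rect(0, 0, 12, 12)   # crop the element to adjust its display size
```

Points that land outside the clip rectangle after translation are simply not drawn, which is the size-adjustment effect the clipping boundary achieves.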
S1105: Phone 1 hides the display element and displays the updated display element.
In some embodiments, phone 1 calls the invalidate function to realize the transition animation between the display element and the updated display element.
Besides this, in some other embodiments, the user's touch operation may also be one that causes phone 1 to turn a display element toward a reference position, based on the element's position on display interface 11 and a reference position set by the user. In this embodiment, the user's touch operation generates an automatic control instruction that can adjust at least one display element without the user supplying specific touch data.
In some embodiments, the above adjustment scheme can be used to adjust one of the display elements on display interface 11 of phone 1.
In some other embodiments, the scheme can also adjust a group of display elements on display interface 11 simultaneously. For example, the scheme can adjust a group of elements' orientations to the same direction; or it can adjust a group of elements' orientations to face a certain position, in which case the adjusted orientations of the elements in the group all differ.
The adjustment scheme for display elements is described in detail below with reference to specific user touch operations.
In some application scenarios, when the touch trajectory in the touch parameters lies within the region of a single display element, the operation type is rotation of a single display element, where the rotation may be about at least one of the X, Y, and Z axes.
In some implementations, when the touch trajectory is parallel to the Y axis, the operation type is the display element's foreground element rotating a preset angle about the X axis. As shown in Fig. 12(a), the display element is the camera icon 104, which includes a camera foreground element 1041 and a camera background element 1042. The description below takes as an example a touch trajectory within the adjustment interface 12 of the camera foreground element 1041 of the camera icon 104 of Fig. 12(a). The touch direction characterizes the rotation axis and rotation direction of the camera foreground element 1041, and the trajectory length characterizes the size of the preset angle.
As shown in Fig. 12(b), when the touch direction F1 is parallel to the X axis, the camera foreground element 1041 rotates about the Y axis, and the length of trajectory S1 characterizes the preset angle: the longer S1 is, the larger the rotation of camera foreground element 1041 about the Y axis. Likewise, when the touch direction is parallel to the Y axis, the camera foreground element 1041 rotates about the X axis, which is not repeated here. As shown in Fig. 12(c), when the touch direction F2 can be decomposed into a direction F22 parallel to the X axis and a direction F21 parallel to the Y axis, the camera foreground element 1041 rotates about both the X and Y axes, where the length of the projection S22 of trajectory S2 onto the X axis characterizes the preset angle about the Y axis, and the length of the projection S21 onto the Y axis characterizes the preset angle about the X axis. As shown in Fig. 12(d), when the touch direction F3 extends along an arc in the XOY plane centered on O, the camera foreground element 1041 rotates about the Z axis, and the angle Δθ between the line joining the start of trajectory S3 to O and the line joining the end of S3 to O is the element's rotation angle about the Z axis.
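The decomposition above can be sketched as follows; the scale factor turning trajectory length into degrees is an arbitrary assumption made for the example:

```python
import math

DEG_PER_UNIT = 1.0  # assumed mapping from trajectory length to rotation angle

def swipe_to_rotation(start, end):
    """Split a straight swipe into rotations about the Y axis (X component)
    and the X axis (Y component), as in Figs. 12(b) and 12(c)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return {
        "y_axis_deg": abs(dx) * DEG_PER_UNIT,  # horizontal component -> Y-axis spin
        "x_axis_deg": abs(dy) * DEG_PER_UNIT,  # vertical component -> X-axis spin
    }

def arc_to_z_rotation(start, end, center):
    """Angle swept about `center` by an arc swipe, as in Fig. 12(d)."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    return math.degrees(a1 - a0)

rot = swipe_to_rotation((0, 0), (30, 40))  # a diagonal swipe
```

A purely horizontal swipe yields only a Y-axis rotation, a vertical one only an X-axis rotation, and an arc about the element's center yields a Z-axis rotation of the swept angle.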
For example, in some other implementations, as shown in Fig. 13(a), the camera icon 104 initially faces forward; the user's finger touches a point in adjustment interface 12 of phone 1 and slides along the Y axis toward the bottom of the phone. The trajectory length characterizes the size of the preset angle by which camera foreground element 1041a rotates about the X axis, and element 1041a rotates clockwise about the X axis (viewed from the left). As another example, as shown in Fig. 13(b), the camera icon 104 initially faces forward; the user touches a point in adjustment interface 12 and slides along the Y axis toward the top of the phone; the trajectory length characterizes the preset angle of camera foreground element 1041b about the X axis, and element 1041b rotates counterclockwise about the X axis (viewed from the left). The phone adjusts the display parameters in the foreground matrix corresponding to camera foreground element 1041 according to its rotation axis (the X axis), rotation direction, and preset angle, thereby adjusting the element.
As another example, in some other implementations, when the touch trajectory is parallel to the X axis, the operation type is the display element's foreground element rotating a preset angle about the Y axis. For example, as shown in Fig. 13(c), the camera icon 104 initially faces forward; the user's finger touches a point in adjustment interface 12 and slides along the X axis toward the left side of the phone; the trajectory length characterizes the preset angle of camera foreground element 1041c about the Y axis, and element 1041c rotates clockwise about the Y axis (viewed from above). As another example, as shown in Fig. 13(d), the camera icon 104 initially faces forward; the user slides along the X axis toward the right side of the phone; the trajectory length characterizes the preset angle of element 1041d about the Y axis, and element 1041d rotates counterclockwise about the Y axis (viewed from above). The phone adjusts the display parameters in the corresponding foreground matrix according to the rotation axis (the Y axis), rotation direction, and preset angle, thereby adjusting the element.
As another example, in still other implementations, when the touch trajectory is a straight line in the XOY plane parallel to neither the X nor the Y axis, the operation type is the display element's foreground element rotating preset angles about the X axis and the Y axis respectively. For example, as shown in Figs. 13(e) to 13(h), the camera icon 104 initially faces forward; the user's finger touches a point in adjustment interface 12 and moves in a straight line within the XOY plane to another point. The trajectory's component on the Y axis characterizes the first preset angle by which camera foreground element 1041e/1041f/1041g/1041h rotates about the X axis, and likewise the trajectory's component on the X axis characterizes the second preset angle by which it rotates about the Y axis. The phone adjusts the display parameters in the corresponding foreground matrix according to the rotation axes (the X and Y axes), rotation directions, and preset angles, thereby adjusting the element.
As another example, in other implementations, when the touch trajectory is an arc in the XOY plane centered on the element's center point, the operation type is the display element's foreground element rotating a preset angle about the Z axis. For example, as shown in Fig. 13(i), the camera icon 104 starts in the state shown in region i of Fig. 13(i); the user's finger touches a point in adjustment interface 12 and slides clockwise about the Z axis in the XOY plane, and the angle corresponding to the trajectory characterizes the size of the preset angle. As another example, as shown in Fig. 13(j), with the camera icon 104 starting in the state shown in region i of Fig. 13(j), the user touches a point in adjustment interface 12 and slides counterclockwise about the Z axis in the XOY plane, the trajectory's angle characterizing the preset angle. The phone adjusts the display parameters in the corresponding foreground matrix according to the rotation axis (the Z axis), rotation direction, and preset angle, thereby adjusting the element.
As another example, in some other implementations, the user's touch operation may also set a position point that the foreground element of a display element should face; for example, the user's finger touches the point that camera foreground element 1041 of camera icon 104 should face. It can be understood that the object adjusted by a touch operation is not limited to the foreground element; it may also be the background element or the text element, or at least two of the foreground, background, and text elements, which this application does not specifically limit.
Besides this, in other implementations, owing to limits of operation precision, when the user's finger touches one point in adjustment interface 12 of phone 1 and slides toward another point, the gesture may integrate at least two of the three kinds of operation trajectories above; that is, the touch operation may include at least two of the three touch operations above.
As shown in Fig. 13(k), after the display element's orientation has been adjusted, phone 1 receives a confirmation instruction input by the user — for example, the user taps the confirm button in adjustment interface 12 — and the adjustment of the display element is completed, as shown in Fig. 13(l).
Likewise, as shown in Fig. 14(a), phone 1 enters the adjustment interface 12 for display elements, and the batch-adjust button 105 is tapped. After the user enters adjustment interface 12, a batch-adjust button is shown in it; after the user taps it, multiple display elements are shown in adjustment interface 12, and the user selects from them the phone icon 101, messages icon 102, browser icon 103, and camera icon 104, thereby entering the batch-adjustment process for those icons, as shown in Fig. 14(b). As shown in Fig. 14(c), after the user touches a reference position in interface 12, the phone icon 101, messages icon 102, browser icon 103, and camera icon 104 in adjustment interface 12 each adjust their own orientation based on their own position and the reference position. As shown in Fig. 14(d), once all icons' orientations have been adjusted, phone 1 receives the user's confirmation instruction — for example, a tap on the confirm button in adjustment interface 12 — completing the adjustment and yielding phone icon 101y, messages icon 102y, browser icon 103y, and camera icon 104y, as shown in Fig. 14(e).
From Fig. 14(e), compared with the phone icon 101, messages icon 102, browser icon 103, and camera icon 104 of Fig. 14(d), the orientations of the former icons' foreground elements have changed. In some other implementations, when adjusting an icon's orientation, the orientations of the foreground element and the background element can be adjusted at the same time. For example, after the user touches a reference position in the interface 12 of Fig. 14(d), the phone icon 101, messages icon 102, browser icon 103, and camera icon 104 in adjustment interface 12 each adjust their own orientations based on their positions and the reference position, as in Fig. 14(f), where both the foreground-element and background-element orientations of each icon have changed. The user taps the confirm button in Fig. 14(f), completing the adjustment and yielding phone icon 101z, messages icon 102z, browser icon 103z, and camera icon 104z, as shown in Fig. 14(g). Besides this, in another implementation, the background element's orientation can also be adjusted on its own; since the display effect is quite similar to Figs. 14(d) to 14(g), it is not repeated here.
It can be understood that in the schemes of Figs. 14(f) and 14(g), the adjustment magnitudes of the foreground-element and background-element orientations may be the same or different, which this application does not specifically limit.
In summary, not only can batch adjustment adjust foreground and background elements simultaneously; all kinds of touch operations — individually adjusting one display element, swapping the display positions of two display elements, resizing two adjacent display elements, covering one display element with another, and so on — can adjust at least one display effect of at least one of the foreground, background, and text elements, which is not enumerated one by one here.
Having introduced the scheme for adjusting display elements' display orientations, the interaction schemes between two display elements are introduced in detail below. Only a display element whose orientation has been adjusted has an orientation attribute; on this basis, the description below of interaction between two display elements takes as an example that all display elements on display interface 11 of phone 1 have had their orientations adjusted.
It can be understood that the two display elements may be of the same type — for example both cards (whether of the same or different card types) or both icons — or of different types, for example one a card and the other an icon.
In some application scenarios, in earlier touch operations, the most recent orientation adjustment of contact-A card 203 occurred after that of contact-B card 204, and contact-A card 203's most recent orientation adjustment was a batch adjustment. In the current touch operation, the touch trajectory lies within the regions of two display elements that correspond to the same application, with the start coordinates in one element's region and the end coordinates in the other's; the operation type corresponding to the touch operation is then the covering of one display element by the other.
For example, as shown in Fig. 6(a), the application is Contacts and the display elements include contact-A card 203 and contact-B card 204. Taking contact-B card 204 covering contact-A card 203 as an example, the update interaction scheme between two display elements of the same application is described below.
When the user touches contact-B card 204 and drags it onto contact-A card 203, phone 1 obtains the first position information of contact-A card 203. Phone 1 then obtains the second display information of contact-B card 204, which includes contact-B card 204's second position information. Phone 1 replaces the second position information within the second display information with the first position information to obtain third display information, and sets the matrix corresponding to the third display information to the canvas, so as to draw the new contact-B card 204a via the canvas. Then, as shown in Fig. 6(b), phone 1 displays the updated contact-B card 204a while hiding contact-B card 204 and contact-A card 203. The position information includes the position parameters, size parameters, orientation parameters, and the like used to determine contact-A card 203's placement.
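A compact sketch of the replace-and-redraw step above (the dict fields are illustrative assumptions standing in for the patent's position, size, and orientation parameters):

```python
def cover(covered, covering):
    """Return the covering element's display info moved into the covered
    element's position, size, and orientation; both originals are hidden."""
    third_info = dict(covering)
    third_info.update(position=covered["position"],
                      size=covered["size"],
                      orientation=covered["orientation"])
    covered["hidden"] = covering["hidden"] = True
    return third_info

card_a = {"name": "contact A", "position": (2, 0), "size": (4, 4),
          "orientation": "front", "hidden": False}
card_b = {"name": "contact B", "position": (0, 6), "size": (2, 2),
          "orientation": "right", "hidden": False}
card_b_new = cover(card_a, card_b)
```

The new record keeps the covering card's content but inherits the covered card's placement, matching the behavior shown in Figs. 6(a) and 6(b).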
再例如,图6(c)中所示,应用程序为联系人,显示元素包括联系人A卡片203和联系人C卡片205,用户触摸联系人B卡片204,并将联系人B卡片204拖动至联系人A卡片203。下面将以联系人C卡片205覆盖联系人A卡片203为例,说明对应于同一应用程序的两个显示元素的交互方案。手机1获取联系人A卡片203的第一位置信息。而后,手机1获取联系人C卡片205的第二显示信息,其中第二显示信息包括联系人C卡片205的第二位置信息。手机1将第二显示信息中的第二位置信息替换为第一位置信息,得到第三显示信息。进而,如图6(d)所示,手机1将第三显示信息对应的矩阵设置给画布,以通过画布完成新图标联系人A图标204a的绘制。手机1将联系人B图标203更新为联系人A图标204a,并隐藏联系人A图标205和显示联系人A卡片203。
In some application scenarios, among earlier touch operations, the last display-direction adjustments of both the alarm card 201 and the remote-control card 202 were batch adjustments, though not necessarily the same batch adjustment. In the current touch operation, the two display elements correspond to different applications; the start-point coordinate in the touch parameters lies within the region of one display element and the end-point coordinate within the region of the other. The type of the touch operation is then the exchange of display positions between display elements corresponding to two different applications.
For example, as shown in Fig. 7(a), one display element is the alarm card 201 and the other is the remote-control card 202. The user touches the alarm card 201 and drags it onto the remote-control card 202. The interaction scheme between two display elements corresponding to different applications is described below, taking as an example the case where the alarm card 201 and the remote-control card 202 exchange positions and the last display-direction adjustments of both were batch adjustments.
The phone 1 obtains the first display information of the alarm card 201, which includes the first position information corresponding to the alarm card 201. The phone 1 obtains the second display information of the remote-control card 202, which includes the second position information corresponding to the remote-control card 202. It can be understood that the remote-control card 202 contains text elements related to the remote control, but because the card 202 is small, these text elements are not displayed. Since the alarm card 201 does display alarm-related text elements, the obtained second position information includes the display positions that the text elements in the remote-control card 202 should occupy.
The phone 1 replaces the first position information in the first display information with the second position information, obtaining third display information, and replaces the second position information in the second display information with the first position information, obtaining fourth display information. The phone 1 generates the swapped alarm card 201a from the third display information, and the swapped remote-control card 202a from the fourth display information. It can be understood that the first position information includes the position parameters, size parameters and direction parameters used to determine the position of the remote-control card 202, the position of the alarm card 201, and the relative position between the two. The phone 1 displays the alarm card 201a and the remote-control card 202a, and hides the alarm card 201 and the remote-control card 202.
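The position exchange above — producing the third and fourth display information by swapping the position information between the two cards' display information — can be sketched as a symmetric variant of the covering step. Field names are again illustrative assumptions.

```python
def swap_positions(info_x, info_y):
    """Exchange the position info between two display-info dicts, yielding the
    third and fourth display information. Each card keeps its own content and
    takes the other card's slot. Inputs are left unmodified."""
    third, fourth = dict(info_x), dict(info_y)
    third['position'] = dict(info_y['position'])
    fourth['position'] = dict(info_x['position'])
    return third, fourth
```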
In some other human-machine interaction schemes, because the applications differ, the sizes occupied by corresponding elements in the two cards may not be exactly the same. The phone 1 may therefore adaptively adjust the third display information and the fourth display information to obtain cards that match the user's usage habits.
In some other human-machine interaction schemes, the last display-direction adjustment of the alarm card 201 was a single adjustment and that of the remote-control card 202 was a batch adjustment; when the alarm card 201 and the remote-control card 202 exchange positions and the last display-direction adjustment of the alarm card 201 is later than that of the remote-control card 202, the display direction of the swapped alarm card 201a stays unchanged, and so does that of the swapped remote-control card 202a.
In yet other human-machine interaction schemes, with the same adjustment modes as above, when the last display-direction adjustment of the alarm card 201 is earlier than that of the remote-control card 202, the swapped alarm card 201a takes over the display direction of the remote-control card 202, while the display direction of the swapped remote-control card 202a stays unchanged. Here, "earlier" means that the adjustment time of display element X precedes that of display element Y, and "later" means that the adjustment time of display element X follows that of display element Y.
It can be understood that the adjustment time of display element X's display direction refers to the adjustment time corresponding to X's current display direction, which is not necessarily the moment X was set to that direction. For example, if the user individually adjusted X's display direction at time t11, then the adjustment time of X's display direction is t11. As another example, if the user batch-adjusted the display directions of X and Z at time t21, then the adjustment time of X's display direction is t21. As a further example, if the user batch-adjusted the display directions of Y and Z at time t31, and X took over Y's display direction at time t32, then the adjustment time of X's display direction is t31.
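The inheritance rules in the last few paragraphs — which card's display direction survives a position swap, depending on whether each card's last adjustment was single or batch and which came first — can be condensed into one hypothetical helper. This is a sketch of the cases described, not exhaustive and not the patent's code; field names are assumptions.

```python
def directions_after_swap(x, y):
    """Decide each card's display direction after a position swap.

    x and y are dicts with 'direction', 'mode' ('single' or 'batch') and
    'adjusted_at' (the time of the adjustment that produced the current
    direction). Per the described behaviour, a card inherits the other card's
    direction only when its own last adjustment was a single adjustment made
    earlier than the other card's batch adjustment; otherwise both keep
    their own directions."""
    new_x, new_y = x['direction'], y['direction']
    if x['mode'] == 'single' and y['mode'] == 'batch' and x['adjusted_at'] < y['adjusted_at']:
        new_x = y['direction']
    if y['mode'] == 'single' and x['mode'] == 'batch' and y['adjusted_at'] < x['adjusted_at']:
        new_y = x['direction']
    return new_x, new_y
```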
In some application scenarios, the two display elements correspond to different applications. If the start-point coordinate in the touch parameters lies outside the regions of the two adjacent display elements, at a position between them, and the end-point coordinate lies within the region of one of the two elements, the type of the touch operation is resizing two adjacent display elements corresponding to different applications.
For example, as shown in Fig. 8(a), one display element is the alarm card 201a and the other is the remote-control card 202a. The user touches the region between the remote-control card 202a and the alarm card 201a and drags into the region of the alarm card 201a. Another updating interaction scheme between two display elements corresponding to different applications is described below, taking the resizing of the remote-control card 202a and the alarm card 201a as an example.
The phone 1 obtains the first clipping information and first position information of the remote-control card 202a, and the second clipping information and second position information of the alarm card 201a. The phone 1 adjusts the first clipping information, the second clipping information and the second position information according to the user's touch parameters; in particular, the phone 1 adjusts the rotation angle of the foreground element in the alarm card 201a according to the touch parameters. The phone 1 adjusts the first position information of the remote-control card 202a according to the adjusted first clipping information, and adjusts the second position information of the alarm card 201a according to the adjusted second clipping information and the adjusted second position information. The phone 1 then updates the remote-control card 202a using the adjusted first clipping information and first position information, and updates the alarm card 201a using the adjusted second clipping information and second position information.
In some implementations, when a display element in the phone 1's display interface is resized from another size down to the minimum-unit size, its display direction is adjusted to face front by default. It can be understood that, although the display direction has been adjusted to face front, the element is still regarded as a display element whose display direction has been adjusted.
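The resize keeps the two adjacent cards inside a fixed overall region, so growing one shrinks the other — claim 13 likewise requires that the sum of the two display sizes stay constant. A minimal one-dimensional sketch, with the 40 px minimum-unit size as an assumption:

```python
def resize_pair(total, touch_x, left_edge, min_size=40):
    """Split a fixed total width between two adjacent cards from the user's
    touch position. The new left size follows the touch point, clamped so
    neither card drops below the (assumed) minimum-unit size; the sum of the
    two sizes is invariant."""
    new_left = min(max(touch_x - left_edge, min_size), total - min_size)
    return new_left, total - new_left
```

During a drag, calling this with each intermediate touch position yields the intermediate sizes (the dP2/dQ2 stage of Fig. 15(c)); calling it with the end position yields the final first and second clipping sizes.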
The update scheme, based on the touch trajectory, for the background element P1 of the remote-control card 202a and the background element Q1 of the alarm card 201a is described in detail below with reference to Figs. 15(a) to 15(d). As shown in Fig. 15(a), the shortest distance between the background element P1 corresponding to the remote-control card 202a and the background element Q1 of the alarm card 201a is d. As shown in Fig. 15(b), when the user touches the region between P1 and Q1, P1 expands to the outline P0, the maximum size P1 can take when Q1 is at its minimum size. As shown in Fig. 15(c), as the user drags from the touch start position toward Q1, the phone 1 generates clipping information dP2 and intermediate clipping information dQ2 from the user's actual touch position, draws the intermediate element P2 based on dP2, and draws the intermediate element Q2 based on dQ2. As shown in Fig. 15(d), when the user drags to the end position, the phone 1 generates the first clipping information and the second clipping information from the end position, draws the updated background element P3 based on the first clipping information, and draws the updated background element Q3 based on the second clipping information.
In some implementations, the phone 1 adjusts the matrix corresponding to the remote-control card 202a using the adjustment parameters corresponding to the adjusted first clipping information and the adjusted first position information. For example, the phone 1 adjusts the matrix corresponding to the remote-control background element in the card 202a according to the first clipping information, and the matrix corresponding to the remote-control text element according to the first position information. The phone 1 sets the adjusted background-element matrix and the adjusted text-element matrix on the canvas, so that the canvas displays the adjusted remote-control card 202b. Likewise, the phone 1 adjusts the matrix corresponding to the alarm card 201a using the adjustment parameters corresponding to the adjusted second clipping information and the adjusted second position information, and sets the adjusted matrix on the canvas so that the canvas displays the adjusted alarm card 201b. At the same time, the phone 1 hides the remote-control card 202a and the alarm card 201a.
By way of example, Fig. 16 shows a schematic diagram of the hardware structure of the phone 1.
The phone 1 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, and so on.
It can be understood that the structure illustrated in this embodiment of the invention does not constitute a specific limitation on the phone 1. In other embodiments of this application, the phone 1 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices, or may be integrated into one or more processors.
The processor 110 may also be provided with a memory for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory can hold instructions or data the processor 110 has just used or uses cyclically; if the processor 110 needs the instructions or data again, it can fetch them directly from this memory. This avoids repeated accesses, reduces the processor 110's waiting time, and thus improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces, such as an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The wireless communication functions of the phone 1 may be implemented through antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 are used to transmit and receive electromagnetic-wave signals. Each antenna in the phone 1 may cover one or more communication bands, and different antennas may be multiplexed to improve antenna utilization; for example, antenna 1 may be reused as a diversity antenna for a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied on the phone 1, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 can receive electromagnetic waves via antenna 1, filter and amplify the received waves, and pass them to the modem processor for demodulation; it can also amplify signals modulated by the modem processor and convert them into electromagnetic waves radiated via antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be located in the processor 110; in some embodiments, at least some of its functional modules may be located in the same device as at least some modules of the processor 110.
The wireless communication module 160 can provide wireless communication solutions applied on the phone 1, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 160 may be one or more devices integrating at least one communication processing module. It receives electromagnetic waves via antenna 2, frequency-modulates and filters the electromagnetic-wave signals, and sends the processed signals to the processor 110. It can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves radiated via antenna 2.
In some embodiments, antenna 1 of the phone 1 is coupled to the mobile communication module 150 and antenna 2 to the wireless communication module 160, so that the phone 1 can communicate with networks and other devices through wireless communication technologies. These wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The phone 1 implements display functions through the GPU, the display screen 194, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor; it performs mathematical and geometric computation for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel, which may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and so on. In some embodiments, the phone 1 may include 1 or N display screens 194, where N is an integer greater than 1.
The phone 1 can implement shooting functions through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and so on.
The ISP is used to process data fed back by the camera 193. For example, when taking a photo, the shutter opens, light passes through the lens onto the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the eye. The ISP can also algorithmically optimize the image's noise, brightness, and skin tone, and can optimize shooting-scene parameters such as exposure and color temperature. In some embodiments, the ISP may be located in the camera 193.
The camera 193 captures still images or video. An object's optical image is generated through the lens and projected onto the photosensitive element, which may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts it into an image signal in a standard format such as RGB or YUV. In some embodiments, the phone 1 may include 1 or N cameras 193, where N is an integer greater than 1.
The digital signal processor processes digital signals — not only digital image signals but other digital signals as well. For example, when the phone 1 selects a frequency point, the digital signal processor performs a Fourier transform on the frequency-point energy.
The video codec compresses or decompresses digital video. The phone 1 may support one or more video codecs, so that it can play or record video in multiple encoding formats, for example moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The external memory interface 120 can connect an external memory card, such as a Micro SD card, to extend the storage capacity of the phone 1. The external memory card communicates with the processor 110 through the external memory interface 120 to implement data storage functions, for example saving files such as music and video on the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the phone 1 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and the applications required by at least one function (such as a sound playback function or an image playback function); the data storage area can store data created during use of the phone 1 (such as audio data and the phone book). In addition, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or universal flash storage (UFS).
The phone 1 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on.
The audio module 170 converts digital audio information into an analog audio signal for output, and converts analog audio input into a digital audio signal; it can also encode and decode audio signals. In some embodiments, the audio module 170 may be located in the processor 110, or some of its functional modules may be located in the processor 110.
The speaker 170A, also called the "loudspeaker", converts audio electrical signals into sound signals. The phone 1 can play music or take hands-free calls through the speaker 170A.
The receiver 170B, also called the "earpiece", converts audio electrical signals into sound signals. When the phone 1 answers a call or a voice message, the user can listen by holding the receiver 170B close to the ear.
The microphone 170C, also called the "mic" or "mouthpiece", converts sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth near the microphone 170C to feed the sound signal into it. The phone 1 may be provided with at least one microphone 170C. In other embodiments, the phone 1 may be provided with two microphones 170C, enabling a noise-reduction function in addition to capturing sound; in yet other embodiments, it may be provided with three, four, or more microphones 170C, additionally identifying the sound source to implement directional recording and similar functions.
The headset jack 170D connects a wired headset. The headset jack 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The sensor module 180 may include a touch sensor, a fingerprint device, a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and so on.
Taking the touch sensor as an example: the touch sensor can capture touch events on or near it (such as operations performed by the user on the touch-sensor surface with a finger, a stylus, or any other suitable object) and send the collected touch information to another device, for example the processor 110. By way of example, the touch sensor may be implemented resistively, capacitively, with infrared, with surface acoustic waves, or in other ways. The touch sensor may be integrated with the display screen 194 as the touchscreen of the phone 1, or the touch sensor and the display screen 194 may be implemented as two separate components providing the input and output functions of the phone 1.
Of course, the phone 1 may also include a charging management module, a power management module, a battery, buttons, an indicator, one or more SIM card interfaces, and so on; the embodiments of this application impose no restriction on this.
The software system of the phone 1 may use a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the phone 1.
Fig. 17 is a block diagram of the software structure of the phone 1 according to an embodiment of this application.
The layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
1. Application layer
The application layer may include a series of applications.
As shown in Fig. 17, these applications (apps) may include Phone, Contacts, Camera, Gallery, Calendar, Maps, Navigation, Bluetooth, Music, Video, Messages, and the like.
As also shown in Fig. 17, the application layer further includes Android core applications such as the launcher (also called the desktop or home screen). In general, after the Android system starts, the launcher runs as a resident core application in the Android system.
The launcher can be used to display and manage the other apps installed in the application layer. As shown in Fig. 17, app icons are generally displayed in the launcher and managed uniformly by it. If the launcher detects the user performing a click, long-press, drag, or similar operation on an app icon in the launcher, it can respond to the user's operation and trigger the corresponding app to execute the corresponding operation instruction. For example, if it detects the user clicking a contact card in the launcher, the launcher can generate a start message for the Contacts application, start the Contacts application's process by calling the relevant services in the application framework layer, and finally display the Contacts application's interface on screen. As another example, if it detects the user long-pressing a contact card in the launcher, the launcher can generate an adjustment message for the contact card and enter the contact card's adjustment interface.
When displaying each app's display elements, the launcher can obtain the display models provided by that app — for example the three-dimensional display model corresponding to the foreground element 110 in Fig. 9(a), or the two-dimensional display models corresponding to the background element 120 and the text element 130. Taking the Contacts application as an example, its installation package can provide the display model corresponding to the contact card; the display model is mapped into the contact card's individual sub-display-elements. As shown in Fig. 9(d), the contact card includes a foreground element 110, a background element 120, and a text element 130; the foreground element 110 and text element 130 are generally located above the background element 120, and their sizes are generally smaller than that of the background element 120.
When displaying the contact card in the launcher, the launcher can obtain the Contacts application's display model from the application's installation package. As shown in Fig. 10, after receiving the user's touch operation through the touch sensor in the screen, the launcher obtains the type of the user's touch operation and from it determines the adjustment parameters and adjustment variables. The launcher then adjusts the matrices corresponding to the relevant sub-display-elements according to those adjustment parameters and variables. To display a card matching the user's needs through the launcher, the launcher uses the canvas to clip the background element 120 so that it takes on the preset shape and size. Then, as shown in Fig. 9(c), the launcher can overlay the foreground element 110 and the text element 130 on the clipped background element 120, finally forming the contact card 100. For every display element to be displayed, the launcher can use the canvas in this way to produce the display element corresponding to the user's touch operation. The canvas thus lets the launcher flexibly adjust the display effect of each display element in the launcher, increasing the diversity and personalized customization of app icons in the launcher. In addition, an icon 200 displayed on the launcher includes a foreground element 210 and a background element 220; the specific adjustment method is the same as that for the card 100 described above and is not repeated here.
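The launcher's compositing pipeline described above — clip the background element to the preset shape and size, then stack the foreground and text elements on top — can be sketched with plain data structures. A real implementation would use the Android Canvas/Matrix APIs, which are not reproduced here; the field names are assumptions.

```python
def compose_card(background, foreground, text, clip_size):
    """Sketch of the launcher's canvas pipeline: clip the background element to
    the preset size, then overlay the foreground and text elements. The result
    records the stacking order, background at the bottom, text on top."""
    clipped_background = {**background, 'size': clip_size}  # canvas clip step
    return {'layers': [clipped_background, foreground, text]}
```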
When displaying each app's icon, the launcher can obtain the display model provided by that app. Taking the Contacts application as an example, its installation package can provide the Contacts-related display model.
2. Application framework layer
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer, and includes some predefined functions.
By way of example, the application framework layer may include a notification manager, an activity manager, a window manager, a content provider, a view system, a telephony manager, and so on.
The view system can be used to build an application's display interface. Each display interface may consist of one or more controls. In general, controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
The notification manager enables applications to display notification information in the status bar; it can be used to convey announcement-type messages that disappear automatically after a short stay without user interaction. For example, the notification manager is used for download-completion notices and message reminders. The notification manager may also present notifications in the system's top status bar as charts or scrolling text, such as notifications from applications running in the background, or as dialog windows on the screen — for example prompting text information in the status bar, playing an alert sound, vibrating the electronic device, or flashing the indicator light.
The activity manager can manage each application's lifecycle. Applications usually run in the operating system in the form of activities; the activity manager can schedule applications' activity processes to manage each application's lifecycle. The window manager is used to manage window programs.
The window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, the phone book, and so on.
The telephony manager provides the communication functions of the phone, for example call-state management (connecting, hanging up, and so on). The resource manager provides applications with various resources, such as localized strings, icons, images, layout files, and video files.
3. Android runtime and system libraries
The Android runtime includes the core libraries and the virtual machine, and is responsible for the scheduling and management of the Android system.
The core libraries have two parts: one part is the functions the Java language needs to call, and the other is the Android core libraries.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and application framework layer as binary files. The virtual machine performs functions such as object-lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example the surface manager, the Media Libraries, the three-dimensional graphics processing library (e.g., OpenGL ES), and the 2D graphics engine (e.g., SGL).
The surface manager manages the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording of many common audio and video formats as well as still image files, and can support many audio/video encoding formats, for example MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library implements 3D graphics drawing, image rendering, compositing, and layer processing. The 2D graphics engine is a drawing engine for 2D drawing.
4. Kernel layer
The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, the camera driver, the audio driver, and the sensor driver; the embodiments of this application impose no restriction on this.
In this application, by way of example, the card corresponding to each application displayed in the launcher includes three sub-display-elements: the foreground element in the foreground-element layer, the background element in the background-element layer, and the text element in the text layer. By changing at least one of the foreground-element layer, the background-element layer, and the text layer, the launcher can change the card's display effect.
The embodiments of the mechanisms disclosed in this application may be implemented in hardware, software, firmware, or a combination of these implementation methods. The embodiments of this application may be implemented as computer programs or program code executed on a programmable system that includes at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described in this application and generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (DSP), a microcontroller, an application specific integrated circuit (ASIC), or a microprocessor.
The program code may be implemented in a high-level procedural or object-oriented programming language to communicate with the processing system. When required, the program code may also be implemented in assembly or machine language. In fact, the mechanisms described in this application are not limited in scope to any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy disks, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable storage used to transmit information over the Internet via electrical, optical, acoustic, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or method features may be shown in a particular arrangement and/or order. It should be understood, however, that such a particular arrangement and/or order may not be required; rather, in some embodiments, these features may be arranged in a manner and/or order different from that shown in the illustrative drawings. In addition, the inclusion of structural or method features in a particular figure is not meant to imply that such features are required in all embodiments; in some embodiments, these features may be omitted or may be combined with other features.
It should be noted that the units/modules mentioned in the device embodiments of this application are all logical units/modules. Physically, one logical unit/module may be one physical unit/module, part of one physical unit/module, or a combination of several physical units/modules. The physical implementation of these logical units/modules is not the most important thing; it is the combination of the functions they implement that is key to solving the technical problem raised by this application. Furthermore, to highlight the innovative part of this application, the above device embodiments do not introduce units/modules less closely related to solving the technical problem raised by this application; this does not mean that other units/modules do not exist in the above device embodiments.
It should be noted that, in the examples and specification of this patent, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of additional identical elements in the process, method, article, or device that comprises the element.
Although this application has been illustrated and described with reference to certain preferred embodiments thereof, those of ordinary skill in the art will understand that various changes in form and detail may be made without departing from the spirit and scope of this application.

Claims (24)

  1. A human-machine interaction method, applied to an electronic device, comprising:
    obtaining a user's touch operation on at least one display element displayed on the electronic device;
    adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element to which the touch operation is applied; and
    changing, according to the display model after the display-effect adjustment, the display effect on the electronic device of the display element corresponding to the display model.
  2. The method according to claim 1, wherein the at least one display effect comprises at least one of a display element's display position, display size, display direction, display color, display content, and display detail.
  3. The method according to claim 1, further comprising:
    determining the type of the touch operation according to a preset rule, wherein the preset rule is set based on the number of the at least one display element, whether the applications corresponding to the at least one display element are the same, and the touch operation.
  4. The method according to claim 1, wherein obtaining the user's touch operation on the at least one display element displayed on the electronic device comprises:
    when the user chooses to adjust the display effects of a plurality of display elements, using the touch operation to adjust the display effects of the selected plurality of display elements simultaneously.
  5. The method according to claim 4, wherein adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element to which the touch operation is applied comprises:
    applying, according to the type of the touch operation, different display-effect adjustments to the display models of at least two of the plurality of display elements.
  6. The method according to claim 5, wherein applying different display-effect adjustments to the display models of at least two of the plurality of display elements comprises:
    determining the user's sub-touch-operation on each of the at least two display elements according to the user's touch operation and each display element's display position; and
    determining each display element's rotation axis, rotation direction, and rotation angle from the sub-touch-operation, and, in correspondence with the touch operation, adjusting each display element's display direction with the determined rotation axis, rotation direction, and rotation angle.
  7. The method according to claim 4, wherein adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element to which the touch operation is applied comprises:
    applying, according to the type of the touch operation, the same display-direction adjustment to the plurality of display models of the plurality of display elements.
  8. The method according to any one of claims 1 to 7, wherein the operation type comprises covering a second display element with a first display element, the first display element and the second display element corresponding to the same application, wherein covering the second display element with the first display element comprises the first display element taking over a second at-least-partial display effect of the second display element and hiding the second display element, the second at-least-partial display effect comprising the second display element's second display position and second display size.
  9. The method according to claim 8, wherein, when the second display element's display direction was obtained via batch adjustment and the adjustment of the second display element's display direction is not earlier than the adjustment of the first display element's display direction, the second at-least-partial display effect further comprises the second display element's second display direction.
  10. The method according to claim 8 or 9, wherein adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element to which the touch operation is applied comprises:
    obtaining the second display element corresponding to the user's touch operation; and
    adjusting the first display element's display model according to the second display element's at-least-partial display effect, hiding the second display element in correspondence with the touch operation, and adjusting the first display element's display effect with the adjusted display model.
  11. The method according to any one of claims 4 to 6, wherein the type of the touch operation comprises exchanging the display positions of a third display element and a fourth display element; when the type of the third display element's most recent touch operation is a batch display-direction adjustment and the adjustment of the third display element's display direction is not earlier than that of the fourth display element's display direction, exchanging the display positions of the third and fourth display elements comprises:
    the third display element taking over a fourth at-least-partial display effect of the fourth display element, and the fourth display element taking over a third at-least-partial display effect of the third display element, wherein the fourth at-least-partial display effect comprises the fourth display element's fourth display position, and the third at-least-partial display effect comprises the third display element's third display position and third display direction.
  12. The method according to any one of claims 4 to 6, wherein the type of the touch operation comprises exchanging the display positions of a third display element and a fourth display element; when the types of the most recent touch operations of both the third and fourth display elements are batch display-direction adjustments, exchanging the display positions of the third and fourth display elements comprises:
    the third display element taking over a fourth at-least-partial display effect of the fourth display element, and the fourth display element taking over a third at-least-partial display effect of the third display element, wherein the third at-least-partial display effect comprises the third display element's third display position and third display direction, and the fourth at-least-partial display effect comprises the fourth display element's fourth display position and fourth display direction.
  13. The method according to claim 1, wherein the type of the touch operation comprises synchronously adjusting the display effects of a fifth display element and a sixth display element, the fifth and sixth display elements being displayed adjacently on the electronic device, the synchronous adjustment comprising:
    synchronously adjusting the fifth display element's fifth display size and the sixth display element's sixth display size according to the touch operation, with the sum of the fifth display size and the sixth display size remaining constant.
  14. The method according to claim 13, wherein synchronously adjusting the display effects of the fifth and sixth display elements further comprises:
    adjusting the fifth display element's fifth display content and/or the sixth display element's display content according to the touch operation.
  15. The method according to claim 13, wherein, when the fifth display element's size after adjustment by a fifth adjustment size is the minimum display size, synchronously adjusting the display effects of the fifth and sixth display elements further comprises adjusting the fifth display element's fifth display direction.
  16. The method according to claim 13, further comprising:
    during the user's touch operation, obtaining the corresponding fifth real-time size and sixth real-time size from the user's touch position, with the sum of the fifth and sixth real-time sizes remaining constant; and
    adjusting the fifth display element's display effect in real time according to the fifth real-time size, and the sixth display element's display effect in real time according to the sixth real-time size.
  17. The method according to any one of claims 1 to 16, wherein changing, according to the display model after the display-effect adjustment, the display effect on the electronic device of the display element corresponding to the display model comprises:
    drawing the adjusted display model via a canvas to change the display effect on the electronic device of the display element corresponding to the display model.
  18. The method according to claim 1, wherein the type of the touch operation comprises adjusting the display element's display direction, wherein adjusting the display element's display direction comprises determining the display element's rotation axis, rotation direction, and rotation angle.
  19. The method according to claim 18, wherein adjusting, according to the type of the touch operation, the display effect of at least one display model of the at least one display element to which the touch operation is applied comprises:
    determining the rotation axis, rotation direction, and rotation angle of the display element's display model according to the user's touch operation and, in correspondence with the touch operation, adjusting the display element's display direction with the determined rotation axis, rotation direction, and rotation angle.
  20. The method according to any one of claims 1 to 10, wherein the at least one display element comprises at least one of an icon, a card, and a widget.
  21. The method according to any one of claims 1 to 20, wherein each of the at least one display element comprises a foreground element and a background element, the method further comprising:
    adjusting, according to the type of the touch operation, the display effect of the foreground element and/or the background element corresponding to at least one sub-display-element of the at least one display element to which the touch operation is applied.
  22. The method according to any one of claims 1 to 21, wherein the display model comprises at least one of a two-dimensional display model and a three-dimensional display model.
  23. A computer-readable medium having instructions stored thereon which, when executed on an electronic device, cause the electronic device to perform the human-machine interaction method according to any one of claims 1 to 22.
  24. An electronic device, comprising:
    a memory for storing instructions to be executed by one or more processors of the electronic device, and
    a processor, being one of the processors of the electronic device, for performing the human-machine interaction method according to any one of claims 1 to 22.
PCT/CN2022/114608 2021-09-14 2022-08-24 Human-machine interaction method, computer-readable medium and electronic device WO2023040613A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22868987.3A EP4372534A1 (en) 2021-09-14 2022-08-24 Human-machine interaction method, computer-readable medium, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111076012.9 2021-09-14
CN202111076012.9A CN115808994A (zh) 2021-09-14 2021-09-14 Human-machine interaction method, computer-readable medium and electronic device

Publications (1)

Publication Number Publication Date
WO2023040613A1 (zh) 2023-03-23

Family

ID=85481668

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/114608 WO2023040613A1 (zh) 2021-09-14 2022-08-24 人机交互方法、计算机可读介质和电子设备

Country Status (3)

Country Link
EP (1) EP4372534A1 (zh)
CN (1) CN115808994A (zh)
WO (1) WO2023040613A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105446598A (zh) * 2015-12-09 2016-03-30 上海斐讯数据通信技术有限公司 一种图标位置切换方法、系统以及一种电子设备
US20160092084A1 (en) * 2014-09-26 2016-03-31 Oracle International Corporation Canvas layout algorithm
CN107636595A (zh) * 2015-05-19 2018-01-26 三星电子株式会社 用于在电子设备中使用第一应用图标启动第二应用的方法
CN112099686A (zh) * 2020-09-04 2020-12-18 维沃移动通信有限公司 图标显示控制方法、装置和电子设备


Also Published As

Publication number Publication date
EP4372534A1 (en) 2024-05-22
CN115808994A (zh) 2023-03-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22868987

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022868987

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022868987

Country of ref document: EP

Effective date: 20240215

NENP Non-entry into the national phase

Ref country code: DE